Enterprise DevSecOps Data Engineer - 100% REMOTE at Enterprise, Utah, USA
From: Pankaj, Stellent IT
Email: [email protected]
Reply to: [email protected]

Enterprise DevSecOps Data Engineer

100% REMOTE

Phone and Skype

Long Term

Skills:

SQL, Data Quality, Thought Leadership, Application Development, Continuous Integration/Delivery, Database, JIRA, Best Practices, Maintenance, Life Cycle, Deployment, Amazon Web Services, DEV OPS, Scripting, SDLC, Operations, Telecommunications, SAP, Identity and Access Management, Python, Big Data, Shell Scripting, ETL, Data Pipelines, Version Control, Salesforce, Terraform, System Integration, Real Time, Golang, Amazon Elastic Compute Cloud, Teradata, Data Profiling, IT Infrastructure Library, ITIL, Business Intelligence, Writing Skills, Business Writing

Job Description

Enterprise DevSecOps Data Engineer

Selected candidates will coordinate Enterprise Data projects and services delivered to internal clients and will participate in other Data projects that involve a wide range of activities, from capability assessment through architecture, tool deployment, and configuration, up to operations and maintenance of the relevant processes and technologies.

Education: Degree in Computer Science, Telecommunications, Information Technology, or DevSecOps

Responsibilities:

Design and develop data ingestion pipelines (see the sketch after this list)

Establish and maintain the necessary cloud infrastructure.

Collaborate with other DevSecOps Team members and leadership

Architect, design, implement, and maintain scalable and reliable infrastructure solutions in cloud environments such as AWS, Snowflake, and Databricks

Build data profiling and ETL jobs using heterogeneous sources such as Salesforce, SAP, Teradata and others

Work with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability

Accountable for Big Data and batch/real-time analytical solutions leveraging cloud technologies

Stay up to date with emerging technologies, industry trends, and best practices in DevOps, DataOps, and cloud computing

Contribute to thought leadership on the DevSecOps journey

Identify needs for build automation, and design and implement CICD solutions

Create plug-and-play/reusable solutions and patterns for CICD pipelines

Create, develop, and implement automation and system integration for various build platforms

Publish and disseminate CICD best practices, patterns, and solutions

Proactively build and maintain CICD building blocks and shared libraries for app and development teams to enable quicker builds and deployments

Design action plans to address shortcomings and difficulties in CICD platforms, tools, and solutions

Actively participate in bridge calls with team members and contractors/vendors to prevent or quickly address problems

Troubleshoot, identify, and fix problems in the DevSecOps domain

Ensure incident tracking tools are updated in accordance with established norms and processes; gather all essential data and document any discoveries and concerns

Identify management concerns and problems, assess them, and offer prompt solutions and/or escalation

Align with technological Systems/Software Development Life Cycle (SDLC) processes and industry-standard service management principles (such as ITIL)
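
For illustration, the data ingestion responsibility above could look something like the following minimal sketch: an Airflow DAG that stages a source extract in S3 and then starts an AWS Glue job via boto3. This is a hedged example only; the DAG id, bucket, file path, and Glue job name are hypothetical placeholders, not references to any existing system.

```python
# Minimal, hypothetical sketch of a data ingestion pipeline as an Airflow DAG:
# stage a source extract in S3, then trigger a downstream AWS Glue job.
# Bucket, path, and job names are placeholders for illustration only.
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator

RAW_BUCKET = "example-raw-zone"          # hypothetical S3 landing bucket
GLUE_JOB_NAME = "example-curation-job"   # hypothetical Glue ETL job


def stage_extract_to_s3(**_):
    """Upload a locally produced extract file to the raw S3 landing zone."""
    s3 = boto3.client("s3")
    s3.upload_file(
        "/tmp/salesforce_accounts.csv",   # hypothetical local extract
        RAW_BUCKET,
        "salesforce/accounts/accounts.csv",
    )


def run_glue_job(**_):
    """Kick off the Glue job that curates the staged data."""
    glue = boto3.client("glue")
    run = glue.start_job_run(JobName=GLUE_JOB_NAME)
    print(f"Started Glue job run {run['JobRunId']}")


with DAG(
    dag_id="example_salesforce_ingest",
    start_date=datetime(2023, 9, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    stage = PythonOperator(task_id="stage_extract_to_s3", python_callable=stage_extract_to_s3)
    curate = PythonOperator(task_id="run_glue_job", python_callable=run_glue_job)

    stage >> curate  # curate only after staging succeeds
```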

Required Skills & Experience:

5+ years of Data Engineering experience in multi-cloud environments (AWS, Azure, etc.)

At least 8 years of experience in Application Development (DevOps)

Professional experience managing infrastructure and data pipelines using CloudFormation, Terraform, PySpark, Airflow, or similar technologies

Expert knowledge of CI/CD tools, version control systems, and JIRA; develop and maintain CI/CD pipelines, integrating testing and monitoring frameworks to enable rapid and continuous software delivery

Working knowledge of common AWS technologies: EC2, S3, IAM, Lambda, Glue, SNS, and others

Working knowledge of and hands-on experience with SQL, Python, and Spark (see the profiling sketch after this list)

Proficiency in scripting languages such as HCL, GoLang, Python, Bash, and PowerShell

Understanding of infrastructure and data pipelines using CloudFormation, Terraform, PySpark, Airflow, or similar technologies

Good understanding of release management standard methodologies and data security

Have a no-technical-debt mindset and a shift-left mindset
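
As a concrete illustration of the hands-on SQL/Python/Spark work referenced above, the following is a minimal, hypothetical PySpark sketch of a basic data profiling pass (row count plus per-column null counts) over a staged dataset; the S3 path and file format are assumptions for the example only.

```python
# Minimal, hypothetical PySpark data profiling sketch: row count and per-column null counts.
# The S3 path and CSV format are placeholders; adjust to the real source.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-data-profile").getOrCreate()

# Read a staged extract (placeholder location).
df = spark.read.option("header", True).csv("s3://example-raw-zone/salesforce/accounts/")

print(f"Row count: {df.count()}")

# Null count per column as a simple data quality signal.
null_counts = df.select(
    [F.count(F.when(F.col(c).isNull(), c)).alias(c) for c in df.columns]
)
null_counts.show(truncate=False)

spark.stop()
```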

Preferred Skills & Experience:

Nice to have

Internal client focus experience

Communication skills, including the ability to understand client processes in any area in detail

Excellent coordination and communication skills

Business writing skills (capturing needs and documenting them in formal documents)

Reliable, with attention to detail

Ability to work independently and deliver results

Assist with creating and implementing an IaC (Infrastructure as Code) framework

**This position will be required to work standard business hours in the U.S. Eastern Time Zone**

Posted: Thu Sep 28 03:06:00 UTC 2023
