Sr. Data Engineer / ETL Developer at VA (Remote, USA)
Email: [email protected]

From: Sunil, GB Tech Inc ([email protected])

Reply to: [email protected]

Hi All,

Immediate Hire.

Job Requisition:

Sr. Data Engineer / Extract, Load, Transform (ETL) Developer

Job Description:

Leidos is seeking a Sr. Data Engineer / ETL Developer with strong systems, software, and AWS cloud experience to support a Data Analytics Platform initiative for a federal customer. Working within a DevOps framework, you will participate in and/or direct major project deliverables through all aspects of the software development lifecycle, including scope and work estimation, architecture and design, and coding and unit testing.

Primary Responsibilities:

Develop data pipelines using cloud-native tools such as AWS Glue (built on Apache Spark) and Step Functions; a minimal sketch of such a job appears after this list.

Assemble large, complex data sets that meet functional / non-functional business requirements.

Leverage serverless cloud services to prepare (extract and transform) and load large numbers of datasets for processing.

Extend standard ETL tool capabilities using Glue, Python/PySpark, Step Functions, SQS and Athena.

Implement the overall solution, comprising ETL jobs, Lambda functions, and Python code.

Support the implementation of data analytics products.

Develop and integrate custom-developed software solutions that leverage automated deployment technologies.

Develop, prototype, and deploy solutions in AWS Cloud.

Coordinate closely with the functional team to ensure requirements are clearly understood.

Analyze infrastructure/service needs (through proof-of-concept, performance, and end-to-end testing) and coordinate them effectively with the architecture and Data Center teams.

Work closely with the architecture team to review the design and ETL code.

Use industry-leading DevOps tools such as AWS CloudFormation.

Communicate key project data to team members and build team cohesion and effectiveness.

Leverage the Atlassian tool suite, including Jira and Confluence, to track activities.

Identify and apply best practices and standard operating procedures.

Create innovative solutions to meet the technical needs of customers.
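
For illustration only, the following is a minimal sketch of the kind of AWS Glue (PySpark) job these responsibilities describe: extract a table registered in the Glue Data Catalog, apply a simple transformation, and load the result to S3 as partitioned Parquet. The database, table, and bucket names (analytics_db, raw_claims, example-curated-bucket) are hypothetical placeholders, not details from this requisition.

import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue job boilerplate: resolve the job name and set up contexts.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract: read a table registered in the Glue Data Catalog (placeholder names).
source = glue_context.create_dynamic_frame.from_catalog(
    database="analytics_db",
    table_name="raw_claims",
)

# Transform: keep a few columns and drop rows with a missing amount.
df = source.toDF()
df = df.select("claim_id", "member_id", "amount", "service_date") \
       .where("amount IS NOT NULL")

# Load: write the curated dataset to S3 as Parquet, partitioned by date.
df.write.mode("overwrite") \
    .partitionBy("service_date") \
    .parquet("s3://example-curated-bucket/claims/")  # placeholder bucket

job.commit()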

Basic Qualifications:

Experience working across the software development lifecycle, with strong experience in ETL-based development.

Experience working with databases such as DynamoDB and with query languages such as SQL; a brief Athena query sketch appears after this list.

Experience with container orchestration tools like Kubernetes.

Experience using Delta Lake.

Experience with data catalog tools like AWS Glue Catalog and DataHub.

Experience working with programming languages (Python required).

Experience working in a fast-paced development environment with drive to completion.

Experience with development using Amazon Web Services (AWS) big data technologies.

Well versed in using version control systems (CodeCommit preferred).

Well versed in using issue/problem tracking systems (Jira preferred).

Candidates must have a bachelor's degree with 8-12 years of prior relevant experience, or a master's degree with 6-10 years of prior relevant experience.
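
As an illustration of the SQL and AWS querying experience listed above, here is a minimal sketch, assuming hypothetical names (analytics_db, raw_claims, example-athena-results), of running an ad hoc SQL query against a Glue Data Catalog table through Athena with boto3:

import time

import boto3

athena = boto3.client("athena", region_name="us-east-1")

# Start the query; Athena writes results to the (placeholder) S3 location.
response = athena.start_query_execution(
    QueryString=(
        "SELECT member_id, SUM(amount) AS total_amount "
        "FROM raw_claims GROUP BY member_id LIMIT 10"
    ),
    QueryExecutionContext={"Database": "analytics_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = response["QueryExecutionId"]

# Poll until the query reaches a terminal state.
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

# Print the result rows (header row included) if the query succeeded.
if state == "SUCCEEDED":
    results = athena.get_query_results(QueryExecutionId=query_id)
    for row in results["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])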

Preferred Qualifications:

Working experience with AWS Glue and Glue Studio.

Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management.

Experience with Amazon QuickSight.

Prior experience working on federal government contracts.

Posted: Wed Oct 25 23:11:00 UTC 2023
