Data Engineer Pipeline/ETL (AWS) || Remote, USA || Ex-Otsuka candidates only
Email: [email protected]

We are looking for ex-Otsuka candidates only.

Share resume at [email protected]

Position: Data Engineer Pipeline/ETL (AWS)

Duration: 6+ Months

Location: Remote

Job Summary:

ODH Inc. is looking for a Data Pipeline Engineer to join our growing Data Engineering team and participate in the design and build of data ingestion and transformation pipelines based on the specific needs driven by Product Owners and Analytics consumers. The candidate should have strong knowledge of and interest in data processing, along with a background in data engineering. The candidate will work directly with senior data engineers, solution architects, DevOps engineers, product owners, and data consumers to deliver data products in a collaborative and agile environment, and will continuously integrate and push code into our cloud production environments.

Job Description:

As a key contributor to the data engineering team, the candidate is expected to:

1. Build and deploy modular data pipeline components such as Apache Airflow DAGs, AWS Glue jobs, and AWS Glue crawlers through a CI/CD process (see the sketch after this list).
2. Translate business or functional requirements into actionable technical build specifications.
3. Collaborate with other technology teams to extract, transform, and load data from a wide variety of data sources.
4. Work closely with product teams to deliver data products in a collaborative and agile environment.
5. Perform data analysis and onboarding activities as new data sources are added to the platform.
6. Apply data modeling techniques and concepts to support data consumers in designing the most efficient methods of data storage and retrieval.
7. Evaluate innovative technologies and tools while establishing standard design patterns and best practices for the team.
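As a rough illustration of the kind of modular pipeline component described in the list above, here is a minimal Apache Airflow DAG sketch with two chained tasks. The DAG id, schedule, and task callables are hypothetical placeholders and are not taken from this posting; in a real deployment the transform step would typically hand off to an AWS Glue job or PySpark script delivered through CI/CD.

```python
# Minimal Airflow DAG sketch (Airflow 2.x). All names are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_source_data(**context):
    # Placeholder: land raw data from a source system (hypothetical step).
    print("ingesting raw data to the landing zone")


def transform_to_curated(**context):
    # Placeholder: the transformation a Glue job or PySpark script would own.
    print("transforming landing-zone data into the curated layer")


with DAG(
    dag_id="example_ingest_pipeline",   # hypothetical DAG name
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    tags=["example"],
) as dag:
    ingest = PythonOperator(task_id="ingest", python_callable=ingest_source_data)
    transform = PythonOperator(task_id="transform", python_callable=transform_to_curated)

    ingest >> transform  # simple linear dependency: ingest runs before transform
```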

Qualifications:

Required:

1. Experience with AWS data processing, analytics, and storage services such as Simple Storage Service (S3), Glue, Athena, and Lake Formation.
2. Experience extracting and delivering data from various databases such as MongoDB, DynamoDB, Snowflake, Redshift, Postgres, and RDS.
3. Coding experience with Python, SQL, YAML, and Spark programming (PySpark); see the sketch after this list.
4. Hands-on experience with Apache Airflow as a pipeline orchestration tool.
5. Experience with AWS serverless services such as Fargate, SNS, SQS, and Lambda.
6. Experience with containerized workloads and with cloud services such as AWS ECS, ECR, and Fargate to scale and organize those workloads.
7. Experience in data modeling and working with analytics teams to design efficient data structures.
8. Applied knowledge of working in agile, scrum, or DevOps environments and teams.
9. Applied knowledge of modern software delivery methods such as TDD, BDD, and CI/CD.
10. Applied knowledge of Infrastructure as Code (IaC).
11. Experience with the development lifecycle (development, testing, documentation, and versioning).
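For context on the PySpark experience listed above, this is a minimal sketch of the kind of extract-transform-load step implied by the requirements: read raw JSON from S3, apply a simple transformation, and write partitioned Parquet. The bucket, paths, and column names are illustrative assumptions only.

```python
# Minimal PySpark ETL sketch; locations and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example_etl").getOrCreate()

raw_path = "s3://example-bucket/raw/orders/"        # hypothetical input location
curated_path = "s3://example-bucket/curated/orders/"  # hypothetical output location

# Extract: read raw JSON records.
orders = spark.read.json(raw_path)

# Transform: normalize a timestamp column and derive a partition key.
curated = (
    orders
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write partitioned Parquet to the curated layer.
curated.write.mode("overwrite").partitionBy("order_date").parquet(curated_path)
```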

Preferred:

1. AWS Certified Developer Associate
2. AWS Certified Big Data Specialty
3. GitLab CI/CD

Thanks & Regards

Mohd Saif Khan
Recruitment Manager
Direct: 814-651-6955
[email protected] || www.signinsol.com

