
ETL Developer - Hybrid in NJ - NO H1B
From: jyoti, kpg99
Email: [email protected]
Reply to: [email protected]

ETL Developer

Hybrid in Iselin, NJ

Client - Mizuho

Important qualifications:

At least 5 years of full-time development experience using Python.

Designing and implementing highly performant data ingestion pipelines from multiple sources using Azure Databricks.

Direct experience building data pipelines using Azure Data Factory (and preferably Databricks).

Extensive experience in software development and the entire SDLC.

Solid understanding of a variety of programming tools and development platforms.

Experience in creating high-level product specifications and design documents.

Experience writing Python applications using frameworks such as Django, Flask, Pyramid, or Tornado.

Experience in Python testing and code analysis tools like Pytest and Pylint.

Integration experience (ETL, ELT) with Python.

Strong SQL skills.

Familiarity with SSIS.

General development expertise, use of version control, ticketing, and continuous integration systems.

Experience using an enterprise scheduler (Tidal).

Experience in an Agile development environment.

Strong communication skills to concisely report work status, issues, and next actions, and to articulate technical complexity to business analysts, project managers, and business users.

Responsibilities:

Migrate existing SSIS ETL scripts to Python; develop new ETL scripts

Support existing SSIS SQL Projects

Maintain ETL pipelines in and out of data warehouses using a combination of Python and Snowflake's SnowSQL (a brief illustrative sketch follows this list).

Write SQL queries against Snowflake.

Understand data pipelines and modern ways of automating them using cloud-based tooling.

Work closely with existing senior integration staff to flesh out design, priorities, and build.

Build the scaffolding and framework needed for staging and transforming datasets.

Use the existing DevOps pipeline for Python and enhance it if necessary.

Automate Python scripts on our enterprise scheduler.

Apply strong troubleshooting skills to identify root causes and resolve production issues.
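
For context, the Python-and-Snowflake work referenced in the bullets above typically amounts to running SQL against Snowflake from a Python script. The following is a minimal sketch using the snowflake-connector-python package; the connection parameters and the table name STG_ORDERS are placeholders for illustration, not details from this posting.

# Illustrative sketch only: connection parameters and the staging table
# name (STG_ORDERS) are placeholders, not details from this posting.
import snowflake.connector

conn = snowflake.connector.connect(
    user="<user>",
    password="<password>",
    account="<account_identifier>",
    warehouse="<warehouse>",
    database="<database>",
    schema="<schema>",
)
try:
    cur = conn.cursor()
    # Run a simple validation query against a staging table.
    cur.execute("SELECT COUNT(*) FROM STG_ORDERS")
    print("row count:", cur.fetchone()[0])
finally:
    conn.close()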


Education:

Bachelor's degree in Computer Science or Finance

Keywords: information technology New Jersey
Posted: Sat Feb 24 00:02:00 UTC 2024



