From: Nadeem, Msys Inc
Email: [email protected]
Reply to: [email protected]

Title: ETL Developer with Azure - Onsite

Location: Washington, DC, United States

Length: Long term

Restriction: W2, C2C, or 1099

Send resumes to:[email protected]

Description:

W2, 1099, or C2C. Webcam interview. Long-term project; engagements with this customer usually run for multiple years. Onsite.

Short Description:

The Enterprise Data team at OCTO requires an ETL data engineer to support data operations for its Cloud Data Exchange. The resource will utilize native Azure tools to perform ETL, data loading, and data transformation tasks.

Job Description:

The ETL data engineer will support the OCTO Enterprise Data team with data curation, processing, and transformation tasks. Specifically, the ETL data engineer will be responsible for the following:

Responsibilities:

Analyzes, designs, develops, implements, replicates, and supports complex enterprise data projects.

Interfaces with other agencies; consults with and informs user departments on system requirements; advises on environment constraints and operating difficulties in the current state; advises on and resolves problems using cloud solutions; and develops and replicates future enhancements to the District's data systems.

Strong knowledge of Extraction, Transformation and Loading (ETL) processes using frameworks such as Azure Data Factory, Synapse, Databricks, and Informatica; gathers requirements from stakeholders or analyzes existing code to perform enhancements or new development.

Establishes cloud and on-premises connectivity across systems such as ADLS, ADF, Synapse, and Databricks.

Hands-on experience with Azure cloud services such as Azure Data Factory, Azure Synapse, MS SQL Server, Azure SQL DB, Azure Data Lake Storage Gen2, and Blob Storage, as well as Python.

Creates end-to-end pipelines that read data from multiple sources or source systems and load it into a landing layer or SQL tables.

Familiarity/experience with data integration and data pipeline tools (e.g., Informatica, Synapse, Apache NiFi, Apache Airflow)

Familiarity/experience with various data formats, including database-specific (Oracle, SQL Server, DB2, Quickbase), text (CSV, XML), and binary (Parquet, Avro).

Develops, standardizes, and optimizes existing data workflows/pipelines, adhering to best practices.

Adheres and contributes to enterprise data governance standards by ensuring data accuracy, consistency, security, and reliability.

Automates, monitors, alerts, and manages data pipelines and workflows.

Analyzes and evaluates system changes to determine feasibility, and provides alternative solutions as well as backup and rollback procedures.

Works on the development of new systems and on upgrades and enhancements to existing systems, and ensures systems follow approved standards and remain consistent after changes.

Develops complex programs and reports in a database query language.

Familiarity/experience with data visualization tools.

Familiarity/experience handling and securing sensitive data based on its level of sensitivity.

Demonstrates expertise in conveying technical and functional concepts for a specific technical specialty.

Identifies improvements to project standards to achieve high-quality services/products. This is a professional position that may require subject-matter expertise in demanding and rare technologies.

May require coordination of programming activities conducted by the application development team.

Confers with other business and technical personnel to resolve problems of intent, inaccuracy, or feasibility of computer processing and project design.

Works with interested personnel to determine whether modifications or enhancements are necessary.

Leverages excellent written and verbal communication skills to develop new business process and programming solutions as directed by business and technical stakeholders.

May coordinate activities of application developers.

Able to identify best practices and standards for the use of the product.

Proven track record of hands-on technical design and code work within large, complex systems.

Proven hands-on technical work with a variety of technologies.

Demonstrated technical expertise integrating a variety of diverse technical environments and cross platform technologies.

Delivers support and design for industry-specific applications that require integration with statewide systems or applications.

Interacts with executive level business users or technical experts.

Advanced experience in the required technical subject matter.

May function as a niche technical SME (Subject Matter Expert).

Has proven experience across large and complex implementations and systems.
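The pipeline responsibilities above can be summed up as extract, transform, and load steps. A minimal sketch follows; it is illustrative only, using an in-memory SQLite table as a stand-in for an Azure SQL DB landing table, with all data and names invented for the example.

```python
import csv
import io
import sqlite3

# Illustrative source data, standing in for a file landed in Blob Storage / ADLS.
RAW_CSV = """order_id,region,amount
1,east,120.50
2,west,75.00
3,east,invalid
4,south,210.25
"""

def extract(text):
    """Extract: parse CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: skip rows with malformed amounts, normalize region casing."""
    clean = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # a real pipeline would quarantine the bad record
        clean.append((int(row["order_id"]), row["region"].upper(), amount))
    return clean

def load(rows, conn):
    """Load: write to a landing table (SQLite stands in for Azure SQL DB)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS landing_orders "
        "(order_id INT, region TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO landing_orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
count, total = conn.execute(
    "SELECT COUNT(*), SUM(amount) FROM landing_orders"
).fetchone()
print(count, total)  # the malformed row (order_id 3) is dropped in transform
```

In Azure Data Factory or Databricks the same three stages map onto copy activities, notebooks/data flows, and sink datasets; the orchestration changes, the shape of the work does not.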

Minimum Education/Certification Requirements:

Bachelor's degree in Information Technology or a related field, or equivalent experience.

Required Skills:

Strong knowledge of developing Extract, Transform, Load (ETL) processes, including end-to-end pipelines that load data from multiple sources (15 years)

Ability to gather and document requirements for data extraction, transformation, and load processes (15 years)

Understanding of data warehousing, data lake, business intelligence, and information management concepts and standards (15 years)

Ability to advise internal and external customers on appropriate tools and systems for complex data processing challenges (15 years)

Knowledge and use of SQL for relational databases (11 years)

Experience with various data formats, including database-specific (Oracle, SQL Server, Postgres, DB2), text (CSV, XML), and binary (Parquet, Avro) (11 years)

Contribute to enterprise data governance standards by ensuring accuracy, consistency, security, and reliability (7 years)

Strong experience with Microsoft Azure tools such as Azure Data Factory, Synapse, SQL Database, Data Lake Storage Gen2, and Blob Storage (5 years)

Experience with data integration and data pipeline tools such as Informatica PowerCenter, Apache NiFi, Apache Airflow, and FME (5 years)

Strong communication skills, both oral and written (3 years)

Ability to provide excellent customer service to external partners (3 years)

Ability to work independently or as part of a larger team (3 years)

Highly Desired Skills:

Experience with visualization and reporting software such as MicroStrategy, Tableau, and Esri ArcGIS (3 years)

Experience performing data functions with Databricks (3 years)
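The text formats named in the skills matrix (CSV, XML) can be handled with standard-library parsers alone; binary formats like Parquet and Avro need third-party libraries. A minimal sketch of round-tripping the same records through both text formats, with all element and field names purely illustrative:

```python
import csv
import io
import xml.etree.ElementTree as ET

# Illustrative records standing in for rows pulled from any source system.
records = [
    {"id": "1", "name": "alpha"},
    {"id": "2", "name": "beta"},
]

# Serialize the records as CSV.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "name"])
writer.writeheader()
writer.writerows(records)
csv_text = buf.getvalue()

# Serialize the same records as XML.
root = ET.Element("records")
for rec in records:
    item = ET.SubElement(root, "record", id=rec["id"])
    ET.SubElement(item, "name").text = rec["name"]
xml_text = ET.tostring(root, encoding="unicode")

# Parse both back and confirm they describe the same data.
from_csv = list(csv.DictReader(io.StringIO(csv_text)))
from_xml = [
    {"id": el.get("id"), "name": el.findtext("name")}
    for el in ET.fromstring(xml_text).iter("record")
]
print(from_csv == from_xml)  # → True
```

The point of the exercise: an ETL engineer treats formats as interchangeable serializations of the same records, so transformation logic should operate on the parsed records, not on any one wire format.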

Posted: Wed Apr 17 23:56:00 UTC 2024
