
Azure Data Engineer - Need GC, USC, GC-EAD, H4-EAD (Remote, USA)
Email: [email protected]
From:

sana,

Vrddhi Solutions, LLC

[email protected]

Reply to:   [email protected]

Position: Azure Data Engineer

Location: Chicago/Beloit; fully remote is also fine since this is a senior role (CST and EST hours only; PST is not accepted)

Visa: GC, USC, GC-EAD, H4-EAD

Experience: 10+ years

Start Date: ASAP

Interview Process/Times: The first round is a one-hour interview with the hiring manager, including a light coding exercise in SQL and Python. The second round is an interview with the Director, which may include coding depending on the candidate's performance in the first round.

Manager Notes:

Must be strong in data warehousing
Strong knowledge of core data concepts
Strong in SQL and Python
Must have experience in data modelling
Experience with Spark SQL
Looking for candidates with 10+ years of experience
Because this is contract-to-hire, the hiring manager wants to see longer-term roles, with 2+ year contracts more recently

The Data Warehouse (DW) Data Engineer is responsible for developing batch integrations. The Data Engineer is expected to have deep knowledge of the EDW, data modelling, and integration patterns (ETL, ELT, etc.), and may work with one or a range of tools depending on project deliverables and team resourcing. The Data Engineer will also be expected to understand traditional relational database systems and be able to assist in administering them. Candidates must be interested in working in a collaborative environment, possess great communication skills, have experience working directly with all levels of a business, and be able to work both in a team environment and individually. Responsibilities range from batch application/client integration, aggregating data from multiple sources into a data warehouse, and automating integration solution generation using reusable patterns/scripting, to prototyping integration solutions and security.

ACCOUNTABILITIES:

Develops batch integration solutions for ABC Supply. This includes traditional DW workloads and nightly large scheduled extracts.
Designs and builds data models (star schema, snowflake).
Creates ADF pipelines to bring in new data from various sources.
Creates Databricks notebooks for data transformation.
Documents all solutions as needed using ABC standard documentation.
Plans, reviews, and performs the implementation of database changes for integration/DW work.
Maintains integration documentation and audit tools, including developing/updating the integration dashboard.
Works with the BI team and PO to build required tables and transform data for loading into Snowflake.
Provides support for databases/database servers as a member of the Data Management team.
Works with the project management and business analysis teams to provide estimates and ensure all requirements are documented.
Provides logical layers (database views) for end-user access to data in database systems.
Partners with functional support and help desk teams to ensure communication, collaboration, and compliance with support process standards at ABC.
Performs data management tasks as needed.
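The accountabilities above center on transforming raw source data into star-schema warehouse tables. As a minimal sketch of that pattern (using Python's built-in sqlite3 purely as a stand-in for Databricks/Snowflake, with hypothetical table and column names), a raw staged orders feed can be split into a dimension table and a fact table keyed to it:

```python
import sqlite3

# In-memory database stands in for the warehouse (illustrative only;
# the role itself targets Databricks and Snowflake, not SQLite).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Raw staging feed, as an ADF pipeline might land it.
cur.execute("CREATE TABLE stg_orders (order_id INTEGER, customer_name TEXT, region TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO stg_orders VALUES (?, ?, ?, ?)",
    [(1, "Acme", "Midwest", 120.0), (2, "Acme", "Midwest", 80.0), (3, "Beta", "East", 50.0)],
)

# Star schema: one dimension table plus one fact table referencing it.
cur.execute("CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, customer_name TEXT, region TEXT)")
cur.execute("""
    INSERT INTO dim_customer (customer_name, region)
    SELECT DISTINCT customer_name, region FROM stg_orders
""")
cur.execute("CREATE TABLE fact_orders (order_id INTEGER, customer_key INTEGER, amount REAL)")
cur.execute("""
    INSERT INTO fact_orders
    SELECT s.order_id, d.customer_key, s.amount
    FROM stg_orders s
    JOIN dim_customer d ON d.customer_name = s.customer_name AND d.region = s.region
""")

# A reporting query joins fact to dimension, as BI users would.
rows = cur.execute("""
    SELECT d.customer_name, SUM(f.amount)
    FROM fact_orders f JOIN dim_customer d USING (customer_key)
    GROUP BY d.customer_name ORDER BY d.customer_name
""").fetchall()
print(rows)  # [('Acme', 200.0), ('Beta', 50.0)]
```

The same split (surrogate-keyed dimensions, narrow fact tables) applies whether the transformation runs in a Databricks notebook, Spark SQL, or Snowflake.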

QUALIFICATIONS

A Bachelor's Degree in Computer Science or a related field is required. A high school diploma and/or equivalent combination of education and work experience may be substituted.
A minimum of 7 years of relevant development experience in data warehousing and various ELT or ETL tools.
Preferred: experience with Databricks, Azure, and ADF (Azure Data Factory).
A minimum of 5 years' experience building database tables and models.
Must be able to fluently write complex SQL for DDL and DML operations.
Must have hands-on experience with Python, PySpark, and Spark SQL.
Strong understanding of enterprise integration patterns (EIP) and data warehouse modeling.
Experience with development and data warehouse requirements gathering, analysis, and design.
Possess strong business acumen and consistently demonstrates forward thinking.
Eagerly and proactively investigates new technologies.
Is able to effectively work with ambiguous or incomplete information.
Must have a strong working knowledge of technical infrastructure, protocols and networks.
Must have strong experience across multiple hardware and software environments and be comfortable with heterogeneous systems.
Must be able to routinely work with little or no supervision.
Must be able to effectively and efficiently handle multiple and shifting priorities.
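The qualifications above call for fluency in complex SQL DDL and DML. As a small illustration of the kind of statement involved (again run through Python's sqlite3 with made-up table names, not any actual ABC schema), here is a common warehouse-load pattern: deduplicating a staging table with a window function so only the latest row per key reaches the target table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# DDL: staging and target tables (hypothetical names).
cur.execute("CREATE TABLE stg_customer (customer_id INTEGER, name TEXT, loaded_at TEXT)")
cur.execute("CREATE TABLE customer (customer_id INTEGER PRIMARY KEY, name TEXT)")
cur.executemany(
    "INSERT INTO stg_customer VALUES (?, ?, ?)",
    [(1, "Acme", "2023-01-01"), (1, "Acme Corp", "2023-02-01"), (2, "Beta", "2023-01-15")],
)

# DML: ROW_NUMBER() keeps only the most recently staged row per customer_id
# before the load, so the target table never sees stale duplicates.
cur.execute("""
    INSERT INTO customer (customer_id, name)
    SELECT customer_id, name FROM (
        SELECT customer_id, name,
               ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY loaded_at DESC) AS rn
        FROM stg_customer
    ) WHERE rn = 1
""")
result = cur.execute("SELECT customer_id, name FROM customer ORDER BY customer_id").fetchall()
print(result)  # [(1, 'Acme Corp'), (2, 'Beta')]
```

The identical `ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...)` construct works in Spark SQL and Snowflake, which is why interviews for roles like this often probe it.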

Posted: Mon Feb 06 20:42:00 UTC 2023
