
Senior Azure Data Engineer with Snowflake exp - Remote || ONLY 10+ EXP || CLIENT - HCL || NEED PP NUMBER
Email: [email protected]
Greetings from Shrive Technologies LLC!! 

Shrive Technologies LLC is an IT Development & IT Staffing firm with more than a decade of experience in providing IT Staffing Solutions & Services. Our expertise is in sourcing and deploying highly skilled IT Specialists into mainstream and niche technologies to meet clients' Temporary, Permanent & SOW project needs.

Role: Senior Azure Data Engineer

Location: Remote

Duration: Long term

Mandatory skills:  

Azure (Azure Data Factory, Azure SQL DB, or other Azure tools such as Logic Apps, Databricks workspaces, Key Vault)

Snowflake

SQL DB

Job Description:

Position Summary:

The Senior Data Engineer will be responsible for the design, development, implementation, and support of data initiatives throughout Gallagher, ensuring that an optimal data delivery architecture is consistent across ongoing projects. You will support data analysts, data scientists, and the data needs of multiple teams, systems, and products. Do you find the prospect of optimizing, or even re-designing, our company's integration and data architecture to support our next generation of products and data initiatives exciting? Then we should explore this together.

Essential Duties and Responsibilities

Drive requirements, scope, and technical design of integration workflows to ensure the build is conducted accurately and according to spec. Develop and maintain requirements, design documentation, and test plans.

Seek out, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability.

Coordinate with BI Engineers and the Financial Applications and Oracle HR teams on data management, including schemas, failure conditions, reconciliation, test data setup, etc.

Build the infrastructure required for optimal ETL/ELT pipelines to ingest data from a wide variety of data sources using Microsoft Azure technologies such as Azure Data Factory and Databricks (see the sketch after this list).

Construct and maintain enterprise-level integrations using the Snowflake platform, Azure Synapse, Azure SQL, and SQL Server.

Create data tools for data analytics and data science team members that assist them in building and optimizing our product into an innovative industry leader.

Design analytics tools that utilize the data pipeline to deliver actionable insights into customer acquisition, operational efficiency and other key business performance metrics.

Troubleshoot issues, helping to drive root-cause analysis, and work with infrastructure teams to resolve incidents and arrive at a permanent resolution.

Partner with data and analytics teams to strive for greater functionality in our data systems.

Provide direction and coordination for development and support teams, including globally located resources.

Understand the layout and working of existing integrations that send and receive data between Oracle, Concur, JDE, Corporate Data Platform and other systems.
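
For illustration, a minimal sketch of the kind of Databricks (PySpark) ingestion step referenced above, assuming access to an ADLS Gen2 container; the storage account, container, path, and table names are placeholder values, not part of the client environment.

# Minimal Databricks (PySpark) ingestion sketch: load raw CSV files from an
# ADLS Gen2 container and append them to a Delta table for downstream modeling.
# All names (storage account, container, path, table) are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

source_path = "abfss://raw@examplestorage.dfs.core.windows.net/finance/invoices/"
target_table = "staging.invoices"

df = (
    spark.read
    .format("csv")
    .option("header", "true")
    .option("inferSchema", "true")
    .load(source_path)
)

# Stamp each row with a load timestamp for reconciliation and auditing
df = df.withColumn("load_ts", F.current_timestamp())

# Append into a Delta table that Snowflake/Synapse-facing models can consume
df.write.format("delta").mode("append").saveAsTable(target_table)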

Required:

A relevant technical BS Degree in Information Technology

5+ years writing SQL queries against any RDBMS with query optimization.

5 years of data engineering experience leveraging technologies such as Snowflake, Azure Data Factory, ADLS Gen 2, Logic Apps, Azure Functions, Databricks, Apache Spark, Scala, Synapse, SQL Server

Experience with scripting tools such as PowerShell, Python, Scala, Java, and XML

Understanding of the pros, cons, and best practices of implementing a data lake using Microsoft Azure Data Lake Storage

Experience structuring a data lake for reliability, security, and performance.

Experience implementing ETL for data warehouse and business intelligence solutions.

Ability to read and write effective, modular, dynamic, parameterized, and robust code, and to establish and follow already-established coding standards and ETL frameworks.

Strong analytical, problem solving, and troubleshooting abilities.

Good understanding of unit testing, software change management, and software release management

Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code fundamentals

Experience performing root cause analysis on data and processes to answer specific business questions and identify opportunities for improvement.

Experience working within an agile team.

Excellent communication skills

Please provide the details below for submission, along with a visa copy, photo ID copy, and I-94/Passport #.

Candidate Details

Candidate Full Name  (As per SSN)

Current location (City and State)

Open for relocation to work location (Yes/No)

Contact/Phone #

E-mail Address

Visa Type & Visa Validity (Month & Year) / Work Authorization

Date of Birth (Month & Year Only)

Passport Number

Bachelor's details (University/College & Passing year)

Master's details (University/College & Passing year)

Total IT experience (Years)

Onsite [USA/Canada] Experience (Years)

LinkedIn Profile

Thanks & Regards,

Jeshwini (Jessy)

Sr. Recruiter

Email: 
[email protected]

linkedin.com/in/jeshwini-u-1ab7311b6

Shrive Technologies LLC

1300 West Walnut Hill Lane 155-H, Irving, Texas 75038, United States 

www.shrivetechnologies.com 
