
AWS Data Engineer with Snowflake Job Description at Snowflake, Arizona, USA
Email: [email protected]
From:

Ranjith,

Securekloud

[email protected]

Reply to:   [email protected]

Hi, 

Greetings from SecureKloud!!

Job Title:
AWS Data Engineer with Snowflake

Experience: 14+ Years 

Duration: Long Term   

Location:

Onsite Position (Minnesota)

Candidates local to MN preferred

Requirements

The Finance and Treasury Data Engineering team constructs pipelines that contextualize data and give the entire enterprise easy access to it. As a Data Engineer, you will play a key role in growing and transforming our analytics landscape. You will leverage your ability to design, build, and deploy data solutions that capture, explore, transform, and utilize data to support business intelligence/insights, Machine Learning, and Artificial Intelligence.

Develop, implement, and manage auto-ingestion of high-volume data pipelines to integrate Snowflake with AWS S3 data lakes and mainframe data sources (VSAM, DB2, GDG, COBOL).
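
The Snowpipe auto-ingestion pattern described above can be sketched in a few lines. This is a minimal illustration only: the pipe, table, and stage names are hypothetical, and a real deployment also needs the S3 bucket's event notifications wired to the pipe's SQS queue.

```python
# Sketch: generate the Snowflake DDL for an auto-ingesting pipe over an
# S3-backed stage. Object names (treasury_pipe, TREASURY_RAW, FIN_STAGE)
# are hypothetical placeholders.

def make_pipe_ddl(pipe: str, table: str, stage: str,
                  file_format: str = "TYPE = PARQUET") -> str:
    """Return a CREATE PIPE statement with AUTO_INGEST enabled, so
    Snowpipe loads new S3 files as soon as the bucket emits an event."""
    return (
        f"CREATE PIPE IF NOT EXISTS {pipe} AUTO_INGEST = TRUE AS\n"
        f"  COPY INTO {table}\n"
        f"  FROM @{stage}\n"
        f"  FILE_FORMAT = ({file_format});"
    )

ddl = make_pipe_ddl("treasury_pipe", "TREASURY_RAW", "FIN_STAGE")
print(ddl)
```

The generated statement would then be run once in Snowflake; after that, ingestion is event-driven rather than scheduled.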

Design data integration solutions that create pipelines for pushing or pulling data from Oracle database applications.

Build data flows using AWS Glue ETL or any Snowflake-compatible ETL tool for data transformation. Handle ETL transformations to process and transform data from COBOL and other mainframe sources.
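
One concrete flavor of the COBOL-source transformations mentioned above is decoding packed-decimal (COMP-3) fields before loading them downstream. A minimal sketch, assuming the standard sign conventions (low nibble of the last byte: 0xC or 0xF positive, 0xD negative):

```python
# Sketch of a common mainframe-ETL step: decoding a COBOL COMP-3
# (packed-decimal) field. Field layout and scale are illustrative.

def unpack_comp3(raw: bytes, scale: int = 0) -> float:
    """Decode packed decimal: two digits per byte, with the low nibble
    of the final byte holding the sign (0xD = negative)."""
    digits = []
    for b in raw[:-1]:
        digits.append((b >> 4) & 0xF)
        digits.append(b & 0xF)
    last = raw[-1]
    digits.append((last >> 4) & 0xF)   # last data digit
    sign_nibble = last & 0xF           # sign lives in the low nibble
    value = 0
    for d in digits:
        value = value * 10 + d
    if sign_nibble == 0xD:
        value = -value
    return value / (10 ** scale)

print(unpack_comp3(b"\x12\x3C"))            # → 123.0
print(unpack_comp3(b"\x12\x34\x5C", scale=2))  # → 123.45
```

In practice the field offsets, lengths, and scales come from the COBOL copybook that describes the record layout.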

Manage the data lake using the AWS Lake Formation service for data governance and lineage.

Build and manage data APIs in Python to facilitate data exchange and integration with Snowflake.
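
A data API over Snowflake typically wraps the Snowflake Python connector (snowflake-connector-python). The sketch below is illustrative only: the `ACCOUNTS` table and its columns are hypothetical, and only the pure query builder is meant to run without a live connection.

```python
# Sketch of a data-access helper around the Snowflake Python connector.
# Table and column names are hypothetical.

def build_lookup_query(table: str, columns: list, key_column: str) -> str:
    """Build a parameterized SELECT using %s-style binding, which the
    Snowflake connector's default (pyformat) paramstyle accepts."""
    cols = ", ".join(columns)
    return f"SELECT {cols} FROM {table} WHERE {key_column} = %s"

def fetch_account(conn, account_id: str):
    """Run the lookup through an already-open Snowflake connection."""
    query = build_lookup_query("ACCOUNTS", ["ID", "BALANCE"], "ID")
    with conn.cursor() as cur:
        cur.execute(query, (account_id,))
        return cur.fetchone()

if __name__ == "__main__":
    # Requires: pip install snowflake-connector-python, plus real credentials.
    import snowflake.connector
    conn = snowflake.connector.connect(account="...", user="...", password="...")
    print(fetch_account(conn, "ACCT-001"))
```

Binding parameters instead of string-formatting user input is what keeps such an API safe to expose to other teams.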

Work closely with clients to understand their data needs and design appropriate Snowflake solutions.

Provide expert guidance on the implementation and optimization of Snowflake solutions.

Optimize Snowflake environments for performance, scalability, and cost-efficiency.

Ensure data security, privacy, and compliance within Snowflake solutions.

Conduct training sessions and workshops for clients on Snowflake best practices and usage.

Provide AWS and Snowflake data solutions that align with US Bank's policies for network access and data security by configuring access controls (RBAC), encryption, data protection, and monitoring solutions.

Experience:

7+ years of experience in data engineering, data warehousing, or related roles.

Extensive hands-on experience with Snowflake implementations and optimizations. Experience with real-time data processing, particularly involving AWS S3 data lakes and mainframe data sources (VSAM, DB2, GDG, COBOL).

Experience with performance tuning and query optimization in Snowflake. Handling high-volume data pipelines and ensuring their reliability and efficiency.

Experience with ETL transformations for COBOL data sources, including understanding and converting JCL code to Python-based ETL processes.
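
As an example of the JCL-to-Python conversion called out above, a JCL control card such as `SORT FIELDS=(1,5,CH,A)` (ascending character sort on a 5-byte field starting at column 1) maps to a simple key-slice sort. The record layout here is hypothetical:

```python
# Illustrative only: translating one JCL SORT step to Python.
#   //SYSIN DD *
#     SORT FIELDS=(1,5,CH,A)
# sorts fixed-width records ascending on bytes 1-5 (1-based).

def jcl_sort(records: list, start: int, length: int,
             ascending: bool = True) -> list:
    """Sort fixed-width records on the field at 1-based `start`,
    `length` characters wide, mirroring JCL SORT FIELDS semantics."""
    key = lambda r: r[start - 1:start - 1 + length]
    return sorted(records, key=key, reverse=not ascending)

recs = ["B0002DATA", "A0001DATA", "C0003DATA"]
print(jcl_sort(recs, 1, 5))  # → ['A0001DATA', 'B0002DATA', 'C0003DATA']
```

Multi-key SORT cards extend the same idea with a tuple of slices as the sort key.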

Experience using AWS Glue ETL, AWS Lake Formation or any ETL tool compatible with Snowflake for data transformation.

Hands-on experience with Snowflake features such as Snowpipe, bulk COPY, Tasks, Streams, stored procedures, and UDFs.

Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems.

4+ years of experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF).

Good to have: experience deploying code using CI/CD for AWS and Snowflake solutions, and experience with repositories such as GitLab, GitHub, etc.

Good to have: experience deploying infrastructure as code (IaC) using Terraform or equivalent tools for AWS and Snowflake solutions.

4+ years of experience building and managing APIs using Python/PySpark integrated with Snowflake and the cloud (AWS/Azure).

Knowledge of Snowpark for advanced data processing and analytics within Snowflake.

Experience in Finance and Treasury projects is preferred.

Expertise:

Extensive knowledge of Snowflake and its ecosystem.

Deep understanding of data warehousing principles and best practices.

Advanced SQL skills and database design experience.

Proficiency with ETL tools and data integration processes.

Strong knowledge of cloud platforms, specifically AWS, and their integration with Snowflake.

Expertise in data security, privacy, and compliance requirements.

Solid experience working with AWS S3 data lakes and understanding of data lake architecture (experience integrating S3 data lakes with Snowflake is advantageous).

Basic Qualifications:

Bachelor's degree in Computer Science, Information Technology, or a related field.

Advanced SQL programming skills.

Knowledge of programming languages such as Python or Java.

Solid understanding of data warehousing concepts and practices.

Familiarity with ETL tools and data integration processes.

Strong knowledge of cloud platforms, specifically AWS, and their integration with Snowflake.

Experience with AWS S3 data lakes and Snowflake.

Strong communication and interpersonal skills.

Ability to manage multiple projects and priorities simultaneously.

Preferred Qualifications:

Snowflake certification (e.g., SnowPro Core Certification).

Experience with other data warehousing tools and technologies (e.g., Azure Synapse, Redshift).

Experience with data visualization tools (e.g., Power BI, Tableau).

Familiarity with data governance and data quality frameworks.

Kindly provide the details below and reply with a copy of your visa and DL for the submission process.

Full Name    

Mobile & Phone No    

Email ID    

LinkedIn Link    

Current Location    

Willing to Relocate/Travel (Y/N)    

Education Credentials    

Legal/Visa Status    

Currently on Project (Y/N)    

Total Years of IT Experience    

Total Years of US Experience    

Interview Availability    

Tech Skills    

Any two references  

Name:

Designation:

Email:

Contact Number:

Name:

Designation:

Email:

Contact Number:

Thanks & Regards,   

Ranjith      

Technical Recruiter                                                 

Work: 

214-390-2103 

Email:  

[email protected]

Posted: Wed Sep 04 23:48:00 UTC 2024


