
Snowflake Database Engineer, USC or GC Only at Snowflake, Arizona, USA
Email: [email protected]
From:

Shikha,

KPG99

[email protected]

Reply to:   [email protected]

Hi,

Hope you are doing well.

Please find the job description below and let me know your interest.

Position: Snowflake Database Engineer, USC or GC Only

Location: 100% remote

Duration: 6+ months

MOI (mode of interview): Phone and video

After careful consideration and discussion with the team, we have refined the skills and use cases for the cloud EDW build effort. Below is the proposed update to the required skills:

Top Skills Required:

Minimum 3 years of Snowflake Data Engineering experience, specializing in data ingestion and transformation using Snowflake native tools like "COPY INTO," Snowpipe, and Stored Procedures.

Understanding of data modeling in Snowflake

Knowledge of normalization and denormalization techniques

Familiarity with Snowflake's data format support and structure optimization

Understanding of clustering for query performance improvement

Handling large-scale data ingestion into Snowflake

Experience with Snowpipe for data loading

Experience using COPY INTO command for batch processes

Optimization of file sizes and formats for efficient loading
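
As a rough illustration of the ingestion items above, the sketch below runs a COPY INTO batch load and defines a Snowpipe for continuous loading via the Python connector. All object names, credentials, and file formats (RAW.ORDERS, @RAW.ORDERS_STAGE, etc.) are hypothetical placeholders, not details from this posting.

    # Minimal sketch: batch load with COPY INTO and continuous load with Snowpipe.
    # All object names and credentials below are hypothetical placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="LOAD_WH", database="EDW", schema="RAW",
    )
    cur = conn.cursor()

    # Batch load: copy staged Parquet files into a raw table.
    cur.execute("""
        COPY INTO RAW.ORDERS
        FROM @RAW.ORDERS_STAGE
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
        ON_ERROR = ABORT_STATEMENT
    """)

    # Continuous load: a pipe that runs the same COPY INTO as new files arrive.
    cur.execute("""
        CREATE PIPE IF NOT EXISTS RAW.ORDERS_PIPE AUTO_INGEST = TRUE AS
        COPY INTO RAW.ORDERS
        FROM @RAW.ORDERS_STAGE
        FILE_FORMAT = (TYPE = PARQUET)
    """)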

Implementing ELT processes in Snowflake

Understanding of loading raw data into Snowflake using Snowpipe or bulk loading

Familiarity with transforming data using Snowflake's SQL capabilities and virtual warehouses
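
A minimal sketch of the ELT pattern described above: raw data already landed by Snowpipe or COPY INTO is transformed inside Snowflake on a virtual warehouse and upserted into a curated table with MERGE. Table and column names are hypothetical.

    # Minimal ELT sketch: transform raw rows and upsert them into a curated table.
    # Runs entirely inside Snowflake; all names are hypothetical placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="TRANSFORM_WH", database="EDW",
    )
    conn.cursor().execute("""
        MERGE INTO CURATED.ORDERS AS tgt
        USING (
            SELECT order_id,
                   TRIM(customer_name)         AS customer_name,
                   TRY_TO_DATE(order_date_raw) AS order_date,
                   amount
            FROM RAW.ORDERS
        ) AS src
        ON tgt.order_id = src.order_id
        WHEN MATCHED THEN UPDATE SET
            tgt.customer_name = src.customer_name,
            tgt.order_date    = src.order_date,
            tgt.amount        = src.amount
        WHEN NOT MATCHED THEN INSERT (order_id, customer_name, order_date, amount)
            VALUES (src.order_id, src.customer_name, src.order_date, src.amount)
    """)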

Optimizing SQL queries in Snowflake

Knowledge of using clustering keys for data storage alignment

Efficient SQL writing with WHERE clause filters and JOIN optimizations

Utilization of caches effectively for performance improvement

Implementing data partitioning and clustering in Snowflake

Understanding of automatic clustering and its impact on query performance

Knowledge of partitioning for efficient data management and querying subsets
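
To illustrate the query-optimization and clustering items above, a brief sketch: define a clustering key on a large table so automatic clustering keeps micro-partitions aligned with common filters, then issue a query whose selective WHERE clause and keyed join can benefit from partition pruning. All names are hypothetical.

    # Sketch: clustering key plus a query that can benefit from partition pruning.
    # Table and column names are hypothetical placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="QUERY_WH", database="EDW", schema="CURATED",
    )
    cur = conn.cursor()

    # Cluster a large fact table by the columns most queries filter on;
    # automatic clustering then maintains the layout in the background.
    cur.execute("ALTER TABLE CURATED.ORDERS CLUSTER BY (order_date, region)")

    # A selective filter on the clustering key lets Snowflake prune micro-partitions;
    # joining on keys and selecting only needed columns keeps the scan small.
    cur.execute("""
        SELECT o.order_id, o.amount, c.customer_name
        FROM CURATED.ORDERS o
        JOIN CURATED.CUSTOMERS c ON c.customer_id = o.customer_id
        WHERE o.order_date >= '2024-01-01' AND o.region = 'WEST'
    """)
    for row in cur.fetchmany(10):
        print(row)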

Capability to apply previous experience with challenging data integration projects using Snowflake

Problem-solving skills and experience in handling complex data integration tasks

Adaptability and creativity in overcoming challenges

Ensuring data quality and integrity in Snowflake

Implementing data validation checks during ingestion

Utilizing transactions and Snowflake's features for data quality like constraints and data profiling
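
A minimal sketch of the data-quality items above: simple validation checks run right after ingestion, failing the load if null or duplicate business keys appear. The checks, thresholds, and object names are hypothetical.

    # Sketch: post-load validation checks for null and duplicate business keys.
    # Raises an error so the surrounding pipeline can stop or alert; names are hypothetical.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="LOAD_WH", database="EDW", schema="RAW",
    )
    cur = conn.cursor()

    checks = {
        "null order_id": "SELECT COUNT(*) FROM RAW.ORDERS WHERE order_id IS NULL",
        "duplicate order_id": """
            SELECT COUNT(*) FROM (
                SELECT order_id FROM RAW.ORDERS GROUP BY order_id HAVING COUNT(*) > 1
            )
        """,
    }

    failures = []
    for name, sql in checks.items():
        cur.execute(sql)
        bad_rows = cur.fetchone()[0]
        if bad_rows > 0:
            failures.append(f"{name}: {bad_rows} offending rows")

    if failures:
        raise ValueError("Data quality checks failed: " + "; ".join(failures))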

Managing and monitoring data storage costs in Snowflake

Knowledge of monitoring warehouse usage and adjusting sizes

Familiarity with resource monitors and setting up alerts

Practices for managing storage costs through data archiving or purging
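
A brief sketch of the cost-management items above: a resource monitor with notify and suspend triggers attached to a warehouse, plus a warehouse right-sizing statement. The credit quota, thresholds, and names are hypothetical, and creating resource monitors normally requires the ACCOUNTADMIN role.

    # Sketch: resource monitor with alert/suspend thresholds and a warehouse resize.
    # Quotas, thresholds, and names are hypothetical placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="admin_user", password="***", role="ACCOUNTADMIN",
    )
    cur = conn.cursor()

    cur.execute("""
        CREATE OR REPLACE RESOURCE MONITOR EDW_MONTHLY_MONITOR
        WITH CREDIT_QUOTA = 500 FREQUENCY = MONTHLY START_TIMESTAMP = IMMEDIATELY
        TRIGGERS ON 80 PERCENT DO NOTIFY
                 ON 100 PERCENT DO SUSPEND
    """)
    cur.execute("ALTER WAREHOUSE LOAD_WH SET RESOURCE_MONITOR = EDW_MONTHLY_MONITOR")

    # Right-size the warehouse and let it auto-suspend when idle to limit spend.
    cur.execute("ALTER WAREHOUSE LOAD_WH SET WAREHOUSE_SIZE = 'SMALL' AUTO_SUSPEND = 60")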

Not required but desired: previous experience using Snowflake's Time Travel feature for data recovery and historical data analysis

Understanding of Time Travel for accessing historical data and object restoration

Practical applications in data analysis and recovery scenarios
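
To illustrate the Time Travel items above, a small sketch: query a table as it looked an hour ago, compare against the current state, and restore a dropped table within the retention window. Retention depends on the account's DATA_RETENTION_TIME_IN_DAYS setting; table names are hypothetical.

    # Sketch: Snowflake Time Travel for historical reads and object restoration.
    # Table names are hypothetical; retention depends on the account's settings.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="QUERY_WH", database="EDW", schema="CURATED",
    )
    cur = conn.cursor()

    # Read the table as it existed one hour ago (offset is in seconds).
    cur.execute("SELECT COUNT(*) FROM CURATED.ORDERS AT(OFFSET => -3600)")
    print("row count one hour ago:", cur.fetchone()[0])

    # Compare against the current state to spot unexpected changes.
    cur.execute("SELECT COUNT(*) FROM CURATED.ORDERS")
    print("row count now:", cur.fetchone()[0])

    # Restore a (hypothetically) dropped table while it is still within retention.
    cur.execute("UNDROP TABLE CURATED.ORDERS_BACKUP")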

Handling data security and compliance in Snowflake

Implementation of role-based access control

Utilization of encryption for data at rest and in transit

Regular auditing for compliance with data protection regulations, especially for sensitive data like PHI Data
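
A minimal sketch of the role-based access control item above: a read-only analyst role granted least-privilege access to a curated schema. Snowflake encrypts data at rest and in transit by default, so the sketch focuses on access control; the role, user, database, and schema names are hypothetical.

    # Sketch: role-based access control with least-privilege grants.
    # Role, user, database, and schema names are hypothetical placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="admin_user", password="***", role="SECURITYADMIN",
    )
    cur = conn.cursor()

    statements = [
        "CREATE ROLE IF NOT EXISTS ANALYST_READONLY",
        "GRANT USAGE ON DATABASE EDW TO ROLE ANALYST_READONLY",
        "GRANT USAGE ON SCHEMA EDW.CURATED TO ROLE ANALYST_READONLY",
        "GRANT SELECT ON ALL TABLES IN SCHEMA EDW.CURATED TO ROLE ANALYST_READONLY",
        "GRANT SELECT ON FUTURE TABLES IN SCHEMA EDW.CURATED TO ROLE ANALYST_READONLY",
        "GRANT ROLE ANALYST_READONLY TO USER some_analyst",
    ]
    for stmt in statements:
        cur.execute(stmt)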

Minimum 3 years of AWS experience in data integration and transformation, including proficiency with services such as Glue and DMS, as well as stored procedures. Experience in orchestration and monitoring using tools such as AWS MWAA (Managed Workflows for Apache Airflow), Glue workflows, SNS, and SQS.

Strong proficiency in SQL, particularly SnowSQL and PostgreSQL.

Proficient in Python coding for ETL scripting and automation.

Competence in CI/CD practices, utilizing platforms like GitHub, Bitbucket, Azure DevOps, and/or the AWS Code services (e.g., CodeStar, CodeCommit, CodeBuild, CodeDeploy, CodePipeline).

The appointed Snowflake Data Engineers will receive guidance from Envision's Principal Data Architect and Lead Data Engineer, focusing on several key use cases:

Large File Ingestion: This may entail leveraging CDC (change data capture) to capture changes and loading only incremental data into Snowflake using native COPY INTO batch processes.

Small Incremental Files: This could involve utilizing Glue, Snowflake batching, and/or Snowpipe.

Data Transformation and Enrichment: Implementing data transformation and enrichment processes to prepare raw data for analysis and reporting. This could involve cleansing, aggregating, and joining data from multiple sources before loading it into Snowflake.

Data Quality Monitoring: Implementing data quality monitoring processes to ensure the integrity and accuracy of data within Snowflake. This could include setting up automated checks and alerts for anomalies or discrepancies in the data.

Data Governance and Compliance: Establishing data governance policies and procedures to ensure compliance with regulatory requirements and industry standards. This could include managing data access controls, auditing data usage, and enforcing data privacy regulations.

Performance Tuning and Optimization: Identifying and implementing performance tuning and optimization strategies to improve the efficiency and scalability of Snowflake data pipelines. This could involve optimizing SQL queries, tuning virtual warehouse configurations, and optimizing data storage and partitioning.

Our tech stack is based on AWS and Snowflake native tools, utilizing Airflow for workflow orchestration and Snowflake COPY INTO and Glue for ETL. Our strategy emphasizes maximizing the use of Snowflake's native tools and services within our ecosystem.
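
As a rough sketch of how such a stack can hang together, the snippet below shows an Airflow DAG (as it might run on MWAA) triggering a Snowflake COPY INTO batch load through the Snowflake provider package. The DAG id, connection id, and SQL are hypothetical placeholders, not details of Envision's actual pipelines, and the snippet assumes Airflow 2.x with apache-airflow-providers-snowflake installed.

    # Sketch: an Airflow DAG (e.g. on MWAA) that runs a Snowflake COPY INTO batch load.
    # Assumes Airflow 2.x with the apache-airflow-providers-snowflake package installed
    # and a configured "snowflake_default" connection; all names are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

    with DAG(
        dag_id="edw_orders_batch_load",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@hourly",
        catchup=False,
    ) as dag:
        load_orders = SnowflakeOperator(
            task_id="copy_into_raw_orders",
            snowflake_conn_id="snowflake_default",
            sql="""
                COPY INTO RAW.ORDERS
                FROM @RAW.ORDERS_STAGE
                FILE_FORMAT = (TYPE = PARQUET)
            """,
        )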

The ideal candidates for these roles will have experience working in diverse environments with varying migration and ingestion strategies and native AWS and Snowflake tools.
