Looking for Snowflake Data Engineer - Remote - PP Num Must at Snowflake, Arizona, USA
From: Mounika, Brillius
Email: mounikaj@brillius.com
Reply to: mounikaj@brillius.com

Role: Snowflake Data Engineer

Location: Remote

Top Skills Required:
Minimum 3 years of Snowflake data engineering experience, specializing in data ingestion and transformation using Snowflake-native features such as COPY INTO, Snowpipe, and stored procedures (a stored-procedure sketch follows below).
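
For illustration, a minimal Snowflake Scripting stored procedure of the kind this requirement describes; the schema, table, and procedure names are hypothetical:

    -- Hypothetical procedure: merges newly landed raw rows into a
    -- curated table and reports how many rows were touched.
    CREATE OR REPLACE PROCEDURE merge_new_orders()
    RETURNS VARCHAR
    LANGUAGE SQL
    AS
    $$
    BEGIN
        MERGE INTO curated.orders AS tgt
        USING raw.orders_staged AS src
            ON tgt.order_id = src.order_id
        WHEN MATCHED THEN UPDATE SET tgt.amount = src.amount
        WHEN NOT MATCHED THEN INSERT (order_id, amount)
            VALUES (src.order_id, src.amount);
        RETURN 'Merged ' || SQLROWCOUNT || ' rows.';
    END;
    $$;

It would be invoked with CALL merge_new_orders();, typically on a schedule via a Snowflake task.
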
Understanding of data modeling in Snowflake
Knowledge of normalization and denormalization techniques
Familiarity with Snowflake's data format support and structure optimization
Understanding of clustering for query performance improvement
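
As a concrete sketch of the modeling points above (all names hypothetical): a denormalized fact table that keeps semi-structured attributes in a VARIANT column and declares a clustering key so storage aligns with the most common filter:

    -- Hypothetical denormalized fact table. Semi-structured payloads
    -- stay in a VARIANT column; CLUSTER BY aligns micro-partitions
    -- with the usual date filter for better pruning.
    CREATE OR REPLACE TABLE analytics.fact_events (
        event_id    NUMBER,
        event_date  DATE,
        customer_id NUMBER,
        amount      NUMBER(12,2),
        payload     VARIANT        -- raw JSON, queried as payload:field::TYPE
    )
    CLUSTER BY (event_date);
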
Handling large-scale data ingestion into Snowflake
Experience with Snowpipe for data loading
Experience using COPY INTO command for batch processes
Optimization of file sizes and formats for efficient loading
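
A minimal COPY INTO batch load along the lines described above; the stage, file format, and table names are hypothetical:

    -- Hypothetical batch load. Compressed files in roughly the
    -- 100-250 MB range generally load most efficiently.
    CREATE OR REPLACE FILE FORMAT csv_fmt
        TYPE = CSV
        SKIP_HEADER = 1
        FIELD_OPTIONALLY_ENCLOSED_BY = '"';

    COPY INTO raw.orders
    FROM @my_s3_stage/orders/
    FILE_FORMAT = (FORMAT_NAME = csv_fmt)
    ON_ERROR = ABORT_STATEMENT;
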
Implementing ELT processes in Snowflake
Understanding of loading raw data into Snowflake using Snowpipe or bulk loading
Familiarity with transforming data using Snowflake's SQL capabilities and virtual warehouses
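
One way the ELT flow above can look (pipe, stage, and table names hypothetical): Snowpipe continuously lands raw JSON, and the transform runs inside Snowflake on a virtual warehouse:

    -- Hypothetical continuous load: AUTO_INGEST fires on event
    -- notifications from the external (e.g., S3) stage.
    CREATE OR REPLACE PIPE raw.orders_pipe AUTO_INGEST = TRUE AS
        COPY INTO raw.orders_json          -- table with a single VARIANT column "v"
        FROM @my_s3_stage/orders/
        FILE_FORMAT = (TYPE = JSON);

    -- The "T" of ELT: flatten raw JSON into typed columns using
    -- plain Snowflake SQL.
    INSERT INTO curated.orders (order_id, order_date, amount)
    SELECT
        v:order_id::NUMBER,
        v:order_date::DATE,
        v:amount::NUMBER(12,2)
    FROM raw.orders_json;
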
Optimizing SQL queries in Snowflake
Knowledge of using clustering keys for data storage alignment
Efficient SQL writing with WHERE clause filters and JOIN optimizations
Effective use of Snowflake's result and warehouse caches for performance improvement
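
For instance (hypothetical tables, matching the sketch earlier): a pruning-friendly query filters on the clustering key so Snowflake can skip micro-partitions, and an unchanged repeat of the query is served from the result cache:

    SELECT c.customer_name,
           SUM(e.amount) AS total_amount
    FROM analytics.fact_events e
    JOIN analytics.dim_customer c
        ON c.customer_id = e.customer_id     -- join on the keyed column
    WHERE e.event_date >= '2024-01-01'       -- prunes on the clustering key
    GROUP BY c.customer_name;
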
Implementing data partitioning and clustering in Snowflake
Understanding of automatic clustering and its impact on query performance
Knowledge of partitioning for efficient data management and querying subsets
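
Snowflake partitions data into micro-partitions automatically; a clustering key steers that layout, and automatic clustering maintains it in the background. A sketch (table name hypothetical):

    -- Re-cluster an existing table around the common access pattern.
    ALTER TABLE analytics.fact_events
        CLUSTER BY (event_date, customer_id);

    -- Check how well the data is clustered on those columns.
    SELECT SYSTEM$CLUSTERING_INFORMATION(
        'analytics.fact_events', '(event_date, customer_id)');
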
Ability to draw on previous experience with challenging data-integration projects in Snowflake
Problem-solving skills and experience in handling complex data integration tasks
Adaptability and creativity in overcoming challenges
Ensuring data quality and integrity in Snowflake
Implementing data validation checks during ingestion
Use of transactions and Snowflake's data-quality features, such as constraints and data profiling
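
A sketch of validation during ingestion (stage, format, and table names hypothetical): VALIDATION_MODE dry-runs the load and returns bad rows without committing anything, and NOT NULL is one of the few constraints Snowflake actively enforces on standard tables:

    -- Dry run: report parse errors, load nothing.
    COPY INTO raw.orders
    FROM @my_s3_stage/orders/
    FILE_FORMAT = (FORMAT_NAME = csv_fmt)
    VALIDATION_MODE = RETURN_ERRORS;

    -- Enforced on load; PRIMARY KEY / UNIQUE remain declarative
    -- metadata on standard Snowflake tables.
    ALTER TABLE raw.orders MODIFY COLUMN order_id SET NOT NULL;
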
Managing and monitoring data storage costs in Snowflake
Knowledge of monitoring warehouse usage and adjusting sizes
Familiarity with resource monitors and setting up alerts
Practices for managing storage costs through data archiving or purging
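
A sketch of the cost controls above (names and quota hypothetical): a resource monitor notifies at 80% of a monthly credit quota and suspends at 100%, and the warehouse is right-sized with auto-suspend:

    CREATE OR REPLACE RESOURCE MONITOR etl_monitor
        WITH CREDIT_QUOTA = 100
        FREQUENCY = MONTHLY
        START_TIMESTAMP = IMMEDIATELY
        TRIGGERS ON 80 PERCENT DO NOTIFY
                 ON 100 PERCENT DO SUSPEND;

    ALTER WAREHOUSE etl_wh SET
        WAREHOUSE_SIZE = 'SMALL'
        AUTO_SUSPEND = 60                 -- seconds of inactivity
        RESOURCE_MONITOR = etl_monitor;
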
Desired but not required: previous experience using Snowflake's Time Travel feature for data recovery and historical data analysis
Understanding of Time Travel for accessing historical data and object restoration
Practical applications in data analysis and recovery scenarios
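
A sketch of the Time Travel scenarios above (object names and the query ID are hypothetical placeholders):

    -- Query the table as it looked 30 minutes ago.
    SELECT * FROM curated.orders AT(OFFSET => -60 * 30);

    -- Restore a pre-incident snapshot as a new table.
    CREATE TABLE curated.orders_restored CLONE curated.orders
        BEFORE(STATEMENT => '01a2b3c4-0000-0000-0000-000000000000');

    -- Recover an accidentally dropped object.
    UNDROP TABLE curated.orders;
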
Handling data security and compliance in Snowflake
Implementation of role-based access control
Utilization of encryption for data at rest and in transit
Regular auditing for compliance with data protection regulations, especially for sensitive data such as PHI
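
A sketch of these controls (role, policy, and object names hypothetical; masking policies require Enterprise edition, and Snowflake encrypts data at rest and in transit by default):

    -- Least-privilege, role-based access.
    CREATE ROLE IF NOT EXISTS analyst_role;
    GRANT USAGE ON DATABASE prod_db TO ROLE analyst_role;
    GRANT USAGE ON SCHEMA prod_db.curated TO ROLE analyst_role;
    GRANT SELECT ON ALL TABLES IN SCHEMA prod_db.curated TO ROLE analyst_role;

    -- Mask PHI for all but a privileged role.
    CREATE OR REPLACE MASKING POLICY mask_ssn
        AS (val STRING) RETURNS STRING ->
        CASE WHEN CURRENT_ROLE() = 'PHI_ADMIN' THEN val
             ELSE '***MASKED***' END;

    ALTER TABLE prod_db.curated.patients
        MODIFY COLUMN ssn SET MASKING POLICY mask_ssn;
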
Minimum 3 years of AWS experience in data integration and transformation, including proficiency with services such as Glue and DMS, as well as stored procedures. Experience in orchestration and monitoring using tools such as AWS MWAA (Managed Workflows for Apache Airflow), Glue workflows, SNS, and SQS.
Strong proficiency in SQL, particularly SnowSQL, and PostgreSQL.
Proficiency in Python for ETL scripting and automation.
Competence in CI/CD practices, utilizing platforms like GitHub, Bitbucket, Azure DevOps, and/or the AWS Code family (e.g., CodeStar, CodeCommit, CodeBuild, CodeDeploy, CodePipeline).

Keywords: continuous integration, continuous deployment
https://jobs.nvoids.com/job_details.jsp?id=1191373
Posted: 01:29 AM 07-Mar-24

