Big Data Architect in Denver, Colorado, USA
Email: [email protected]
From: Karthik Abaka, Yochana <[email protected]>
Reply to: [email protected]

Job Title: Big Data Operation Lead
Location: Denver, CO (Onsite)

Job Details:
- 9+ years of relevant experience delivering data solutions on a variety of data warehousing, big data, and cloud data platforms.
- Experience implementing slowly changing dimension (SCD) techniques (Type 1, Type 2, and Type 3).
- Experience with distributed data technologies (e.g., Spark, Kafka) for building efficient, large-scale big data pipelines.
- Strong software engineering background with proficiency in at least one of the following languages: PySpark, Python, Scala, or equivalent.
- Experience working with AWS S3 and the Snowflake cloud data warehouse.
- Experience transforming and integrating data in Redshift/Snowflake.
- Experience handling large, complex data sets (JSON, ORC, Parquet, CSV files) from sources such as AWS S3.
- Good exposure to Snowflake cloud architecture, SnowSQL, and Snowpipe for continuous data ingestion.
- Hands-on experience bulk loading and unloading data into Snowflake tables.
- Experience writing complex SQL scripts using statistical aggregate and analytical functions to support ETL in the Snowflake cloud data warehouse.
- Use of COPY/INSERT, PUT, and GET commands to load data into Snowflake tables from internal and external stages.
- Experience performance-tuning the Snowflake data warehouse with the Query Profiler, caching, and virtual warehouse scaling.
- Extensive ETL experience covering data sourcing, mapping, transformation, conversion, and loading.
- Hands-on experience loading data into Snowflake.
- Knowledge of Redshift (good to have).
- Knowledge of AWS EC2 and Lambda.
- Strong data warehouse experience.
- Python (good to have).
- Strong SQL query skills.
- Kafka (good to have, not required).
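For candidates unfamiliar with the SCD Type 2 technique the posting asks for, the core idea (expire the current row, insert a new versioned row, keep full history) can be sketched in plain Python. The function name, row layout, and sample data below are illustrative assumptions, not part of the role's actual pipeline:

```python
from datetime import date

def scd2_apply(dim_rows, change, today):
    """Apply one source change to a Type 2 dimension (list of dict rows).

    A changed attribute closes out the current row and appends a new
    version, preserving history; an unchanged record is left untouched.
    All names here are hypothetical, for illustration only.
    """
    key, new_val = change["id"], change["value"]
    for row in dim_rows:
        if row["id"] == key and row["is_current"]:
            if row["value"] == new_val:
                return dim_rows            # no attribute change: nothing to do
            row["is_current"] = False      # expire the old version
            row["valid_to"] = today
            break
    dim_rows.append({                      # insert the new current version
        "id": key, "value": new_val,
        "valid_from": today, "valid_to": None, "is_current": True,
    })
    return dim_rows

# Usage: customer 1 moves from "Denver" to "Boulder"; both versions are kept.
dim = [{"id": 1, "value": "Denver",
        "valid_from": date(2023, 1, 1), "valid_to": None, "is_current": True}]
dim = scd2_apply(dim, {"id": 1, "value": "Boulder"}, date(2024, 1, 3))
```

In a Snowflake warehouse the same effect is typically achieved with a MERGE statement over staged data rather than row-by-row Python, but the versioning logic is identical.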
Posted: Wed Jan 03 21:33:00 UTC 2024