Immediate Job Opening || Data Engineer with ETL / Snowflake Technical Lead || Hybrid || Wayne, PA || No H1B || USC, GC, or Any EAD
Email: [email protected]
From: Yogesh Chauhan, Sibitalent ([email protected])
Reply to: [email protected]

Job Title: ETL/Snowflake Technical Lead
Job Location: Hybrid in Wayne, PA
End Client: Confidential
Duration: 6+ Months
Visa: Any visa except H1B
Mode of Interview: Phone/Skype

Job Description:
The Sr. Data Engineer will play the role of ETL/Snowflake Technical Lead. The chosen candidate will lead a team of data engineers in understanding and supporting the business through the design, development, and execution of Extract, Transform, and Load (ELT/ETL), data integration, and data analytics processes for the migration of an Azure Data Lake to a new Snowflake database. The lead data engineer will work closely with various customers, including their immediate project teams, business domain experts, and other technical staff members, and will work daily within a project team environment, taking direction from project management and technical leaders. The role is responsible for the design, development, administration, support, and maintenance of the Snowflake platform, and participates in the full systems life cycle and in cloud data lake / data warehouse design and build, including recommendations on code development, integration with a data marketplace or reuse, and buy-versus-build solutions.

Job Responsibilities:
Technical Leadership: Lead data integration across the enterprise through the design, build, and implementation of large-scale, high-volume, high-performance data pipelines for both on-prem and cloud data lakes and data warehouses. Lead the development and documentation of technical best practices for ELT/ETL activities. Create a new Snowflake database environment, including migration of data from the Azure Data Lake.
Solution Design: Lead the design of technical solutions, including code, scripts, data pipelines, and processes/procedures for integrating data lake and data warehouse solutions in an operative IT environment.
Code Development: Ensure data engineering activities are aligned with scope, schedule, priority, and business objectives. Oversee code development, unit testing, and performance testing activities. Responsible for coding and for technically leading the team to implement the solution.
Testing: Lead validation efforts by verifying the data at the various intermediate stages between source and destination, and assist others in validating that the solution performs as expected. Meet or exceed all operational readiness requirements (e.g., operations engineering, performance, and risk management).

Job Requirements:
- High School Diploma, GED, or foreign equivalent required.
- Bachelor's in Computer Science, Mathematics, or a related field plus 7 years of development experience preferred, or 10 years of comparable work experience required.
- 10 years of experience designing, developing, testing, and implementing Extract, Transform, and Load (ELT/ETL) solutions using enterprise ELT/ETL tools, or 12 years of comparable work experience.
- 10 years of experience developing and implementing data integration, data lake, and data warehouse solutions in on-premises and cloud environments.
- 5 years of experience working with Business Intelligence tools; QuickBase preferred.
- 7 years of experience working with APIs, data as a service, data marketplaces, and data mesh.
- 10 years of experience with various Software Development Life Cycle methods such as Agile, Scrum, Waterfall, etc.
- Proven experience developing and maintaining data pipelines and ETL jobs using Snowflake ETL tools (SnowSQL, Snowpark, etc.).
- Knowledge of AWS cloud services such as S3, EMR, Lambda, Glue, SageMaker, Redshift, and Athena, and/or Snowflake.
- Experienced in data modeling for self-service business intelligence, advanced analytics, and user applications.
- Experience with migrating, architecting, designing, building, and implementing cloud data lakes, data warehouses (cloud/on-prem), data mesh, data as a service, and cloud data marketplaces.
- Ability to communicate complex technical concepts, adjusting the messaging to the audience: business partners, IT peers, external stakeholders, etc.
- Proven ability to design and build technical solutions using applicable technologies; able to demonstrate exceptional data engineering skills.
- Ability to prioritize work by dividing time, attention, and effort between the current project workload and ongoing day-to-day activities.
- Demonstrated strength in adapting to changes in processes, procedures, and priorities.
- Proven ability to establish a high level of trust and confidence in both the business and IT communities.
- Must be team-oriented, with excellent oral and written communication skills.
- Strong analytical and problem-solving skills.
- Good organizational and time-management skills.
- Must be a self-starter who understands existing bottlenecks and comes up with innovative solutions.
- Experience with data model design and writing complex SQL queries, with a good understanding of BI/DWH principles.
- Expertise in Relational Database Management Systems and in Data Mart and Data Warehouse design.
- Expert-level SQL development skills in a multi-tier environment.
- Expertise in flat file formats, XML within PL/SQL, and file format conversion.
- Strong understanding of the SDLC and Agile methodologies.
- Strong understanding of model-driven development.
- Strong understanding of ETL best practices.
- Proven strength in interpreting customer business needs and translating them into application and operational requirements.
- Strong problem-solving and analytical skills, with proven strength in applying root cause analysis.

Thanks & Regards,
Yogesh Chauhan
Technical Recruiter
Mobile: +1 (972) 736-8304
Email: [email protected]
Website: www.sibitalent.com
Office: 101 E. Park Blvd., Suite 600, Plano, TX 75074
[email protected] View all |
Posted: Sat Mar 09 02:45:00 UTC 2024