Looking for GC or USC as Data Engineer at San Antonio, TX; Chicago, IL; New York, NY - onsite
Email: [email protected]
Hi, this is Siddharth from Vajraasys. I hope you are doing well. I have the C2H position below; please reply if you have a suitable candidate.

Job Title: Data Engineer (ETL Talend | Snowflake)
Location: San Antonio, TX; Chicago, IL; New York, NY - onsite
Duration: 12 months, contract to hire
Rate: $60-65/hr on C2C

Position Summary
We are seeking a skilled Data Engineer with expertise in ETL development, particularly using Talend and Snowflake within AWS environments. The ideal candidate has a solid understanding of Property & Casualty (P&C) insurance data structures and can ensure data integrity and resolve data discrepancies throughout the claims and policy lifecycle. This is a hands-on role involving the design, development, and deployment of new ETL pipelines, including setting up CI/CD processes with tools such as Argo. Candidates should bring a blend of development and quality assurance skills, as well as experience working with data lakes.

Key Responsibilities
- Design and develop robust ETL pipelines using Talend and Snowflake to support P&C insurance data needs.
- Architect, build, and maintain data pipelines from scratch, incorporating best practices for data warehousing and data lake implementations.
- Leverage CI/CD practices and assist in setting up the CI/CD environment with Argo.
- Identify, analyze, and resolve complex data issues and discrepancies within P&C data structures.
- Collaborate with cross-functional teams to meet data engineering and warehousing requirements.
- Ensure effective data pipeline orchestration and automate workflows as needed.

Required Skills
- 6-10 years of experience as a Data Warehouse Engineer with a focus on ETL and data engineering.
- P&C Insurance Knowledge: Deep understanding of data structures, claims, and the policy lifecycle in P&C insurance.
- ETL Tools: Proficiency with Talend and experience building data pipelines.
- Data Warehousing: Expertise in Snowflake and advanced SQL for data manipulation.
- AWS Cloud: Familiarity with cloud architecture and services, specifically on AWS.
- Python: Strong scripting and automation capabilities.
- CI/CD: Hands-on experience with Argo or similar CI/CD tools for data engineering.
- Data Orchestration: Experience with data pipelines, orchestration, and monitoring tools.

Desired Attributes
- Development & QA Mindset: Ability to design from both development and quality assurance perspectives.
- Leadership Potential: Ability to take a lead role in project execution and guide teams on best practices.
- Problem-Solving Skills: Strong analytical skills to troubleshoot data and pipeline issues.

Additional Information
- Day-to-Day: Focused on designing and building ETL pipelines, inheriting and enhancing existing processes.
- Challenges: Balancing development and quality assurance perspectives to deliver robust, high-integrity data solutions.

Thanks and Regards,
Siddharth | Recruiter | Vajraasys Limited
E: [email protected]
3515 Plymouth Blvd, Suite 205, Plymouth, MN 55447
Mon Oct 28 20:02:00 UTC 2024