GCP Data Engineer - Richardson, TX (Onsite) - Long term | Richardson, Texas, USA
Email: [email protected]
From: HIGHNI KATRU, DVG Tech Solutions LLC [email protected]
Reply to: [email protected]

Hi,

Greetings from DVG Tech Solutions! We have an immediate opening with my client. If you are looking for a new project, please send me your resume.

Role: GCP Data Engineer
Location: Richardson, TX
Engagement: C2C/W2

Job Description:
We are seeking a highly skilled and motivated ETL/ELT GCP Pipeline Engineer to join our team and contribute to the development of a brand-new product/application. As an ETL/ELT Pipeline Engineer, you will be responsible for designing and building efficient data integration pipelines using GCP native tools or open-source technologies such as Python and PySpark. Expertise in building APIs would be advantageous.

Responsibilities:
- Design and develop robust ETL/ELT pipelines to extract, transform, and load data from various sources into our new product/application.
- Collaborate with cross-functional teams to understand data requirements and translate them into effective pipeline designs.
- Implement data quality checks and ensure the accuracy, completeness, and consistency of data throughout the pipeline.
- Optimize and tune pipeline performance to ensure efficient data processing and delivery.
- Working experience with Airflow DAGs.
- Work with cloud-based technologies, particularly Google Cloud Platform (GCP), and leverage its native tools for data integration.
- Integrate data from diverse sources and formats, including structured, semi-structured, and unstructured data.
- Identify and resolve data integration and transformation issues, ensuring the smooth and reliable flow of data.
- Stay up to date with industry trends and emerging technologies in the ETL/ELT space to continuously improve pipeline efficiency and effectiveness.

Requirements:
- Strong experience designing and building ETL/ELT pipelines using GCP native tools or open-source technologies such as Python and PySpark.
- Familiarity with ETL tools like Ab Initio is a plus, but not required.
- Proven ability to work on a brand-new product/application, demonstrating adaptability and problem-solving skills in a dynamic environment.
- Solid understanding of data integration concepts, data modeling, and database systems.
- Proficiency in SQL and experience working with various databases and data formats.
- Excellent analytical and problem-solving skills, with the ability to troubleshoot and resolve complex data-related issues.
- Strong communication and collaboration skills to work effectively with cross-functional teams and stakeholders.
- Detail-oriented, with a focus on delivering high-quality results within project timelines.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

If you are a motivated and talented ETL/ELT Pipeline Engineer looking to join a dynamic team working on an exciting new product, we would love to hear from you. Apply now and be part of our journey in revolutionizing data integration and analytics.
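For a sense of the kind of Airflow DAG work the role describes, here is a minimal illustrative sketch, assuming a simple daily extract-transform-load flow; the DAG name, task names, and sample data are hypothetical placeholders, not details from this posting.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull raw records from a source system (API, GCS file, database).
    return [{"id": 1, "amount": "42.50"}, {"id": 2, "amount": "17.00"}]


def transform(ti, **context):
    # Placeholder data quality check and type conversion on the extracted rows.
    rows = ti.xcom_pull(task_ids="extract")
    return [{"id": r["id"], "amount": float(r["amount"])} for r in rows if r.get("id") is not None]


def load(ti, **context):
    # Placeholder: write the cleaned rows to the target (e.g. a warehouse table).
    rows = ti.xcom_pull(task_ids="transform")
    print(f"Loading {len(rows)} rows into the target table")


with DAG(
    dag_id="example_etl_pipeline",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task

In practice the Python callables would typically hand heavy transformations off to PySpark jobs or GCP-native services rather than process data in the worker itself.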
Tue Mar 19 22:27:00 UTC 2024