Hot W2 opening for Quality Engineer, Remote Role (Remote, USA)
Email: [email protected]
From: Gobi, PiplNow LLC <[email protected]>
Reply to: [email protected]

Hi,

We have an urgent W2 opening for a Quality Engineer Level 3, Remote Role. Our client is looking to fill this role immediately. If you are interested, please share your updated resume, the filled-in skill matrix, consultant details, and copies of your visa and driver's license ASAP.

Skill Matrix (please list your years of experience for each):
- Overall experience
- Total years of work experience in the US
- As Quality Engineer
- ETL Process Validation
- Databricks
- Data Quality Assurance
- Test Case Development
- Python
- Collaboration
- Automation
- Workflow Management
- GCP
- Issue Tracking and Resolution
- Documentation

Consultant Details:
- Full Name
- Phone
- Rate/hr on W2
- Month & Date of Birth
- Last 4 Digits of SSN
- Available time slots to take an interview
- FTE/Contract/C2H
- Current Location (City, State & ZIP)
- Willing to work Onsite/Remote/Hybrid
- If non-local: open to relocate?
- Availability from date of offer
- Last date of the current project
- Any offers in the pipeline
- LinkedIn Profile
- Highest Education Details
- Visa / Proof of Work Authorization attached
- Driver's License attached
- Total Years of Experience

What You'll Do

We are seeking a detail-oriented ETL Tester with expertise in Python, Databricks, and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for validating ETL processes, ensuring data quality, and supporting data integration initiatives. You will work closely with data engineers, analysts, and stakeholders to ensure that our data pipelines are robust, accurate, and reliable.

- ETL Process Validation: Validate and verify ETL processes implemented in GCP, ensuring data integrity during extraction, transformation, and loading. Develop and execute comprehensive test cases to confirm that data transformations meet business requirements.
- Data Quality Assurance: Conduct data profiling and perform quality checks to identify and resolve discrepancies in datasets.
  Monitor data quality metrics and report on data integrity and quality issues.
- Test Case Development: Create and maintain detailed test plans and test cases based on ETL specifications and business needs. Ensure full coverage of all ETL processes, including data extraction, transformation, and loading.
- Collaboration: Work closely with data engineers, data scientists, and other stakeholders to understand ETL workflows and data flows. Participate in design reviews to provide input on testing strategies and best practices.
- Automation: Use Python to develop automated testing scripts for ETL validation and data quality checks. Leverage Databricks notebooks to test and validate ETL processes efficiently.
- Workflow Management: Use Apache Airflow to schedule, monitor, and manage ETL workflows. Collaborate with teams to troubleshoot and optimize Airflow DAGs related to ETL processes.
- Issue Tracking and Resolution: Identify, document, and track defects and data quality issues throughout the ETL process. Work with engineering teams to diagnose and resolve data-related problems quickly.
- Documentation: Maintain clear and comprehensive documentation of testing processes, test cases, and results. Document data mappings, transformation rules, and data flow diagrams for reference.
- Continuous Improvement: Contribute to the enhancement of ETL testing methodologies and data management practices. Stay current on GCP, Databricks, and industry trends to continuously improve testing strategies.

What You'll Need

- Strong knowledge of Google Cloud Platform, specifically BigQuery, GCS, and Airflow.
- 4+ years of experience with Databricks for data processing and analysis.
- Proficiency in Python for developing testing scripts and automating testing processes.
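To make the Automation and Data Quality Assurance duties concrete, here is a minimal, illustrative sketch of the kind of Python data-quality check this role involves. It is not from the client's codebase; every function name, the row-count and null-rate checks, and the zero-null threshold are hypothetical assumptions chosen for the example.

```python
# Illustrative sketch (hypothetical names): reconcile a target dataset
# against its source after an ETL load, reporting row-count mismatches
# and columns whose null rate exceeds a configurable limit.

def null_rate(rows, column):
    """Fraction of rows whose `column` value is missing (None)."""
    if not rows:
        return 0.0
    return sum(1 for r in rows if r.get(column) is None) / len(rows)

def run_quality_checks(source_rows, target_rows, required_columns,
                       max_null_rate=0.0):
    """Run basic post-load checks; return human-readable failure messages."""
    failures = []
    # Row-count reconciliation: every source row should reach the target.
    if len(source_rows) != len(target_rows):
        failures.append(
            f"row count mismatch: {len(source_rows)} source vs "
            f"{len(target_rows)} target"
        )
    # Null-rate check per required column on the loaded target data.
    for col in required_columns:
        rate = null_rate(target_rows, col)
        if rate > max_null_rate:
            failures.append(f"column {col!r} null rate {rate:.0%} exceeds limit")
    return failures
```

In practice a checker like this would read from BigQuery or a Databricks table rather than in-memory lists, and an Airflow task could call it after each load and fail the DAG run when the returned list is non-empty.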
Sat Nov 09 00:56:00 UTC 2024