Data Engineer in Dallas, TX == Must come for F2F Interview == Locals only (Dallas, Texas, USA)
From: Anand, Archon [email protected]
Reply to: [email protected]

Position: Data Engineer
Location: Dallas, TX
Duration: Long-Term Contract
100% onsite position. Must come for F2F interview. LOCALS ONLY.

Note: We are actively looking for Data Engineering (Python) profiles; testing experience is good to have. We need strong Data Engineer skills.

Work Description
SDET with ETL, Advanced SQL, UNIX/Linux, Python, and cloud data.
- Work with business stakeholders, Business Systems Analysts, and Developers to ensure quality delivery of software.
- Interact with key business functions to confirm data quality policies and governed attributes.
- Follow quality management best practices and processes to bring consistency and completeness to integration service testing.
- Design and manage AWS test environments for data workflows during the development and deployment of data products.
- Assist the team with test estimation and test planning.
- Design and develop reports and dashboards.
- Analyze and evaluate data sources, data volumes, and business rules.
- Be well versed in the data flow and test strategy for cloud/on-prem ETL testing.
- Interpret and analyze data from various source systems to support data integration and data reporting needs.
- Test database applications to validate source-to-destination data movement and transformation (a minimal illustrative sketch follows the Soft Skills section below).
- Work with team leads to prioritize business and information needs.
- Develop and summarize data quality analyses and dashboards.
- Apply knowledge of data modeling and data warehousing concepts, with emphasis on cloud/on-prem ETL.
- Execute testing of data analytics and data integration on time and within budget.
- Troubleshoot and determine the best resolution for data issues and anomalies.
- Perform Functional, Regression, System, Integration, and End-to-End Testing.
- Bring a deep understanding of data architecture and data modeling best practices and guidelines for different data and analytics platforms.

Job Requirements

Must-Have Skills
- Extensive experience with Python scripting and cloud technologies
- Extensive experience with AWS components such as S3, Athena, EMR, Glue, Redshift, Kinesis, and SageMaker
- Experience automating ETL processes with Python, and automation around AWS data and infrastructure
- Extensive experience with SQL and Unix/Linux scripting is a must
- Extensive experience developing/testing cloud/on-prem ETL (Ab Initio, AWS Glue, Informatica, Alteryx)
- Extensive experience with data migration is a must (Teradata to Redshift preferred)
- Experience in large-scale application development testing: cloud/on-prem data warehouse, data lake, and data science
- Experience building data-flow CI/CD pipelines in GitLab
- Extensive experience in the DevOps/DataOps space
- Experience with data science platforms such as SageMaker, Machine Learning Studio, or H2O
- Strong experience with Kafka

Nice-to-Have Skills
- Experience using Jenkins and GitLab
- Experience using both Waterfall and Agile methodologies
- Experience testing storage tools such as S3 and HDFS
- Experience with one or more industry-standard defect or test case management tools

Soft Skills
- Great communication skills (regularly interacts with cross-functional team members)
- Takes ownership to complete tasks on time with minimal supervision
- Guides developers and automation teams when issues arise
- Monitors, reviews, and manages technical operations
- Effective problem-solving, troubleshooting, code debugging, and root cause analysis skills
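For illustration only (not part of the job requirements): a minimal sketch of the source-to-destination validation work described above, written in Python with SQLAlchemy. The connection URLs, dialect choices, and the customer_orders table name are hypothetical placeholders, and a real migration check (e.g., Teradata to Redshift) would typically compare checksums or sampled rows in addition to row counts.

    # Minimal source-to-target row-count validation sketch (illustrative only).
    # Assumes the teradatasqlalchemy and sqlalchemy-redshift dialects are installed.
    import sqlalchemy as sa

    def validate_row_counts(src_url: str, tgt_url: str, table: str) -> bool:
        """Compare row counts between source and target copies of a table."""
        query = sa.text(f"SELECT COUNT(*) FROM {table}")
        with sa.create_engine(src_url).connect() as src, \
             sa.create_engine(tgt_url).connect() as tgt:
            src_count = src.execute(query).scalar()
            tgt_count = tgt.execute(query).scalar()
        if src_count != tgt_count:
            print(f"{table}: source={src_count}, target={tgt_count} -> MISMATCH")
            return False
        print(f"{table}: {src_count} rows match")
        return True

    if __name__ == "__main__":
        # Hypothetical DSNs; real credentials would come from a secrets manager.
        validate_row_counts(
            "teradatasql://user:password@td-host/mydb",
            "redshift+psycopg2://user:password@rs-host:5439/mydb",
            "customer_orders",
        )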
Regards!

Anand
Sr. Technical Recruiter
Email: [email protected]
Wed Sep 20 18:59:00 UTC 2023