Day-one Onsite Requirement: Senior Data Engineer (Python) with Testing experience needed in Dallas, TX, USA (in-person L1 & L2 interviews) |
Email: [email protected] |
From: Pradeep Peddireddy, Adbakx [email protected]
Reply to: [email protected]

Hi,

I hope you are doing great today. Our client is looking for qualified candidates for a Senior Data Engineer (Python) with Testing experience role. The job description is enclosed below for your reference. Kindly forward resumes of qualified candidates in Word format, along with contact details, the completed submission format, and a skill matrix for further processing.

Title: Senior Data Engineer (Python) with Testing experience
Location: Dallas, TX, USA (day-one onsite role; no remote, no hybrid; local consultants only)
Duration: 12+ month contract
Hourly billing rate: $65/hr C2C max
Number of positions: 1
This is a prime-vendor requirement.

Note: Initial OPT candidates are workable, but STEM OPT, CPT, and H1B-transfer candidates are not workable for this role.
Note: Please do not submit a resume if the consultant is not ready to attend both the L1 and L2 interviews in person in Dallas, TX. Let's not waste each other's time.

Must-Have Skills
Note: We are actively looking for Data Engineer (Python) profiles; Testing experience is good to have. Data Engineering skills are the priority.

Job Requirements:
- Minimum 10+ years of experience required for this role.
- Extensive experience as a Data Engineer with Python and cloud technologies (AWS preferred).
- Experience automating ETL processes/pipelines and AWS data and infrastructure with Python.
- Extensive experience with AWS components such as S3, Athena, EMR, Glue, Redshift, Kinesis, and SageMaker.
- Extensive experience with SQL and Unix/Linux scripting.
- Development/testing experience with cloud or on-prem ETL technologies (Ab Initio, AWS Glue, Informatica, Alteryx).
- Experience with data migration from on-prem to cloud is a plus.
- Experience with large-scale application development and testing across cloud/on-prem data warehouse, data lake, and data science environments.
- Extensive experience in the DevOps/DataOps space.
- Experience with data science platforms such as SageMaker, Machine Learning Studio, or H2O is a plus.
Work Description
SDET with Python, AWS, Unix, and ETL.
- Work with business stakeholders, Business Systems Analysts, and developers to ensure delivery of data applications.
- Build automation frameworks using Python.
- Design and manage data workflows using Python during development and deployment of data products.
- Design and develop reports and dashboards.
- Analyze and evaluate data sources, data volumes, and business rules.
- Be well versed in data flow and test strategy for cloud/on-prem ETL testing.
- Interpret and analyze data from various source systems to support data integration and reporting needs.
- Test database applications to validate source-to-destination data movement and transformation.
- Work with team leads to prioritize business and information needs.
- Develop and summarize data quality analyses and dashboards.
- Apply knowledge of data modeling and data warehousing concepts, with emphasis on cloud/on-prem ETL.
- Execute testing of data analytics and data integration on time and within budget.
- Troubleshoot and determine the best resolution for data issues and anomalies.
- Maintain a deep understanding of data architecture and data modeling best practices and guidelines for different data and analytics platforms.

Nice-to-Have Skills
- Experience using Jenkins and GitLab.
- Experience using both Waterfall and Agile methodologies.
- Experience testing storage tools such as S3 and HDFS.
- Experience with one or more industry-standard defect or test-case management tools.

Soft Skills
- Great communication skills (regularly interacts with cross-functional team members).
- Takes ownership to complete tasks on time with minimal supervision.
- Guides developers and automation teams when issues arise.
- Monitors, reviews, and manages technical operations.
- Effective problem-solving, troubleshooting, code-debugging, and root-cause-analysis skills.
Submission Format
- First Name / Middle Name / Last Name
- Contact Number
- Email Address
- Skype ID / Zoom ID
- Available Start Date
- Best time to call during working hours
- Preferred interview time slot
- Work Authorization
- Visa expiration date
- Highest qualification, year of passing, and university
- Comfortable working day one onsite in Dallas, TX, USA (Yes/No):
- Comfortable attending both L1 & L2 interviews in person (Yes/No):
- Last 4 digits of SSN
- Total years of experience
- Total years of U.S. experience
- Current location
- Willing to relocate (Yes/No)
- Willing to travel (Yes/No)
- Pay rate (W2/1099/C2C)
- Profile sourced from (vendor/partner company)
- Is the consultant on their W2/1099/H1B?
- Are you currently working (Yes/No)?
- When did your current project end, or when will it end?
- Any possibility of extension of the project?
- If an interview is scheduled at the same time as another meeting, which one would you prioritize?

Skill Matrix (years of experience for each):
- Data Engineer
- Python
- Cloud technologies (AWS preferred)
- Automating ETL processes
- AWS data & infrastructure
- S3, Athena
- EMR, Glue
- Redshift, Kinesis
- SageMaker
- Unix/Linux
- ETL technologies (Ab Initio, AWS Glue, Informatica, Alteryx)
- Data migration
- Data science platforms
- Data modeling
- Jenkins
- GitLab
- Waterfall and Agile methodologies
- Test-case management tools

Thanks & Regards,
Pradeep Peddireddy
Adbakx
Senior Client Account Manager
Direct Line: 732-802-6899
Board Line: 732-654-7004 Ext 117
Certified Minority Business Enterprise (MBE)
E-Verify Employer |
Wed Mar 06 03:34:00 UTC 2024 |