Fully Remote Opportunity: Data Engineer with Payment Domain (Remote, USA)
Email: [email protected]
Hi, I have a job opportunity for you. If you are looking for a new job, you can call me at 248-306-8434, or you can send me your updated profile at [email protected].

Job Title: Senior Data Engineer with Payment Domain
Location: Remote
Experience: 10+ to 15 years

Key Skills required:
We are looking for Senior Data Engineering candidates based in the US. Expertise in SQL, Python, and ETL flows is a must-have. Medium expertise/working knowledge of Scala is also needed, as our data pipelines are written in both Scala and Hive/Spark SQL, with new pipelines being written in Scala and legacy ones in SQL (which require ongoing maintenance). The Payments Data Platform team is looking to add Data Engineers from a supplier to build scalable Spark data pipelines leveraging the Airflow scheduler/executor framework. Candidates will develop data pipelines and analyze large data sets to identify gaps and inconsistencies in the data.

What the candidate will do:
Work closely with Engineering stakeholders to build and upgrade data pipelines for MICP (Multi-Item Checkout). Candidates will also work on deprecating legacy data pipelines and replacing them with new ones. This includes identifying the downstream pipelines consuming legacy data, modifying the fields in the certified data if needed, and validating data consistency after the migration.

Minimum Requirements:
- 5 years of experience building scalable Spark data pipelines (preferably using Scala)
- 3-5 years of experience in high-level programming languages such as Java, Scala, or Python
- Proficiency in Spark/MapReduce development and expertise with data processing (ETL) technologies to build and deploy production-quality ETL pipelines
- Good understanding of distributed storage and compute (S3, Hive, Spark)
- Experience using an ETL framework (e.g., Airflow, Flume, Oozie) to build and deploy production-quality ETL pipelines
- Demonstrated ability to analyze large data sets to identify gaps and inconsistencies, provide data insights, and advance effective product solutions
- Working knowledge of relational databases and expertise in query authoring (SQL) on large datasets
- Experience with big data technologies such as Hadoop, Spark, and Hive
- Experience working with Git and Jira (or other source control and task management tools)
- Good communication skills that allow smooth collaboration with stakeholders

Warm Regards,
Shivi Sharma
Technical Recruiter
Tekshapers Inc
850 Stephenson Hwy, Suite 205, Troy, MI 48083
D: 248-306-8434 | W: 248.565.4747 Ext. 134
Email: [email protected]
MBE Certified | ISO 9001:2008
Fri Dec 15 21:35:00 UTC 2023