100% Remote || Big Data/Hadoop Engineer with Strong Azure || Long-Term Contract
Email: [email protected]
Azure experience must be recent. H1B and H4 candidates only. Rate: $60/hr on C2C. LinkedIn profile and passport number are required. Please mention visa status when sharing a resume. For a faster response, mention if the candidate is ex-LTI, Mindtree, Virtusa, Cognizant, TCS, Accenture, Infosys, HCL, Wipro, Capgemini, Tech Mahindra, IBM, Atos, Deloitte, Syntel, L&T, Mphasis, Birlasoft, NTT, Hexaware, Genpact, or any implementation partner.

Job Title: Big Data/Hadoop Engineer with Strong Azure
Location: 100% Remote
Type: Long-term contract
Healthcare domain experience is nice to have.

Job Description

Major Responsibilities/Activities:
- Build a highly functional and efficient Big Data platform that brings together data from disparate sources and allows the client to design and run complex algorithms, providing insights into healthcare business operations.
- Build ETL data pipelines in the Azure cloud using Azure Data Factory (ADF) and Databricks with PySpark and Scala.
- Migrate ETL data pipelines from an on-prem Hadoop cluster to the Azure cloud.
- Build data ingestion pipelines in Azure to pull data from SQL Server (an illustrative sketch follows the Preferred Skills list below).
- Perform automated and regression testing.
- Partner with internal business, product, and technical teams to analyze complex requirements and deliver solutions.
- Participate in the development, automation, and maintenance of application code to ensure consistency, quality, reliability, scalability, and system performance.
- Deliver data and software solutions on Agile delivery teams.

Requirements:
- Bachelor's degree in Computer Science or a related discipline
- 6+ years of data engineering in an enterprise environment
- 6+ years of experience writing production code in Python, PySpark, or Scala
- Strong knowledge of the Azure platform; should have worked in Azure ADF, deployed ADF and Databricks code to production, and be able to troubleshoot production issues
- Experience with SQL
- Experience with Big Data technologies in Azure such as Spark, Hive, Sqoop, Databricks, or equivalent components
- Experience working with Git and CI/CD tools
- Proven background in distributed computing, ETL development, and large-scale data processing

Travel: None.

Preferred Skills:
- Healthcare experience
- Proficiency in SQL and query optimization
- Proficiency in Linux and Bash shell scripting
- Experience with Azure ADF, Azure Databricks, Terraform templates, and automated ADF pipelines
- Experience migrating applications from on-prem Hadoop to the cloud
- Experience with SQL Server
- Knowledge of and passion for software development, including software architecture and functional and non-functional aspects
- Background in ETL tools such as Ab Initio or DataStage
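For illustration only (not part of the client's requirements): a minimal PySpark sketch of the kind of ingestion pipeline described above, reading a SQL Server table over JDBC from a Databricks notebook and landing it as Delta in Azure Data Lake Storage. Every name here (server, database, table, partition column, storage path, credentials) is a hypothetical placeholder; a real pipeline would take these as ADF parameters and pull secrets from Azure Key Vault.

from pyspark.sql import SparkSession

# On Databricks a SparkSession already exists as `spark`; getOrCreate() reuses it.
spark = SparkSession.builder.appName("sqlserver_to_adls_ingest").getOrCreate()

# Hypothetical connection details -- in practice supplied via ADF pipeline
# parameters and Key Vault-backed secret scopes, never hard-coded.
jdbc_url = ("jdbc:sqlserver://example-sqlserver.database.windows.net:1433;"
            "databaseName=claims_db")
connection_props = {
    "user": "etl_user",          # placeholder
    "password": "********",      # placeholder
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
}

# Parallel JDBC extract: Spark splits the read into 8 range queries on a
# numeric key instead of pulling the whole table over one connection.
source_df = spark.read.jdbc(
    url=jdbc_url,
    table="dbo.claims",          # hypothetical source table
    column="claim_id",           # hypothetical numeric partition key
    lowerBound=1,
    upperBound=10_000_000,
    numPartitions=8,
    properties=connection_props,
)

# Land the extract as Delta in ADLS Gen2 (the abfss path is a placeholder).
(source_df.write
    .format("delta")
    .mode("overwrite")
    .save("abfss://raw@exampledatalake.dfs.core.windows.net/claims/dbo_claims"))

In an ADF-orchestrated setup, a Databricks notebook activity would run this with the table name and bounds passed in as parameters; the same read/write pattern is what typically replaces Sqoop imports when migrating pipelines off an on-prem Hadoop cluster.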
Yours sincerely,
Ajay Sharma | Sr. Technical Recruiter
Net 2Source Inc.
Fax: (201) 221-8131 | Email: [email protected]
Global HQ Address: 270 Davidson Ave, Suite 704, Somerset, NJ 08873, USA
Web: www.net2source.com
Thu Feb 29 21:22:00 UTC 2024