REMOTE - Azure Big Data Engineer (Remote, USA)
Email: [email protected]
From: Vick Singh, PEOPLE FORCE CONSULTING, INC. ([email protected]) | Reply to: [email protected]

Role: Azure Big Data Engineer
Location: Remote (EST work hours)
Experience: 8+ years required
Note: The client requires a visa copy, photo ID, and bachelor's degree with every resume submission.

Job Description:
Build a highly functional and efficient Big Data platform that brings together data from disparate sources and allows FinThrive to design and run complex algorithms that provide insights into healthcare business operations.
- Build ETL data pipelines in the Azure cloud using Azure ADF and Databricks with PySpark and Scala.
- Migrate ETL data pipelines from an on-prem Hadoop cluster to Azure.
- Build data ingestion pipelines in Azure to pull data from SQL Server (an illustrative sketch follows the lists below).
- Perform automated and regression testing.
- Partner with internal business, product, and technical teams to analyze complex requirements and deliver solutions.
- Participate in the development, automation, and maintenance of application code to ensure consistency, quality, reliability, scalability, and system performance.
- Deliver data and software solutions on Agile delivery teams.

Requirements:
- Bachelor's degree in Computer Science or a related discipline
- 6+ years of data engineering in an enterprise environment
- 6+ years of experience writing production code in Python, PySpark, or Scala
- Strong knowledge of the Azure platform; hands-on experience with Azure ADF, deploying ADF and Databricks code to production, and troubleshooting production issues
- Experience with SQL
- Experience with Big Data technologies in Azure such as Spark, Hive, Sqoop, Databricks, or equivalent components
- Experience working with Git and CI/CD tools
- Proven background in distributed computing, ETL development, and large-scale data processing

Travel: None

Preferred Skills:
- Healthcare experience
- Proficiency in SQL and query optimization
- Proficiency in Linux and Bash shell scripting
- Experience with Azure ADF, Azure Databricks, Terraform templates, and automated ADF pipelines
- Experience migrating applications from an on-prem Hadoop cluster to the cloud
- Experience with SQL Server
- Knowledge of and passion for software development, including software architecture and functional and non-functional aspects
- Background in ETL tools such as Ab Initio or DataStage
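For context only, here is a minimal sketch of the kind of SQL Server ingestion pipeline described above, assuming it runs in an Azure Databricks notebook (where `spark` and `dbutils` are predefined); the server, database, table, and secret-scope names are placeholders, not details from the posting.

# Minimal, hypothetical sketch of a SQL Server -> Databricks ingestion step.
# Assumes an Azure Databricks notebook; all names below are placeholders.
jdbc_url = "jdbc:sqlserver://<server>.database.windows.net:1433;databaseName=<database>"

# Read the source table over JDBC, pulling credentials from a Databricks secret scope.
source_df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.source_table")  # placeholder table name
    .option("user", dbutils.secrets.get("ingest-scope", "sql-user"))
    .option("password", dbutils.secrets.get("ingest-scope", "sql-password"))
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .load()
)

# Land the raw data as a Delta table for downstream ADF/Databricks transformation jobs.
(
    source_df.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("bronze.source_table")
)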
Thu Sep 14 23:07:00 UTC 2023