GCP Data Engineer - Remote - No GC | Remote, USA
Email: [email protected]
From: Abhilash S H, Blue Ocean Ventures ([email protected])
Reply to: [email protected]

Role: GCP Data Engineer
Location: Remote

As a Data Engineer, you will be part of an Agile team building healthcare applications and implementing new features while adhering to best coding and development standards.

We are seeking a highly skilled and experienced Lead Data/Cloud Engineer to join our healthcare interoperability platform team. The ideal candidate will possess a strong background in data engineering, cloud infrastructure (primarily GCP and Azure), ETL processes, and API development. This role requires expertise in managing and optimizing data pipelines and working seamlessly across multiple cloud and on-premises infrastructures. The Lead Data/Cloud Engineer will play a pivotal role in designing and implementing a robust, scalable, and secure data infrastructure to support interoperability solutions in the healthcare domain, interacting with numerous internal and external applications, ensuring compliance with healthcare standards such as CDEX and UDAP, and integrating applications using healthcare protocols such as FHIR, CCDA, and HL7.

Responsibilities:
- Design, build, and maintain scalable data pipelines using StreamSets, Azure Data Factory, and related services on GCP.
- Work seamlessly across a multi-cloud environment, with 50% of infrastructure on GCP, 25% on Azure, and 25% on-premises.
- Manage and optimize API development within the .NET environment and, to a lesser extent, Python.
- Develop and maintain event streaming services using Kafka and StreamSets.
- Administer and work with databases including Snowflake, MongoDB, and FHIR servers (Firely and Azure FHIR).
- Ensure compliance with healthcare data standards such as UDAP, CDEX, FHIR, HL7, and ADT.
- Collaborate with cross-functional teams to gather requirements and deliver tailored data solutions that meet business and healthcare regulatory needs.
- Implement API gateways using tools such as Apigee and DataPower.
- Provide technical leadership and mentorship to junior data engineers.
- Monitor, troubleshoot, and optimize the performance and scalability of data solutions.
- Stay current with the latest trends and technologies in healthcare interoperability and cloud services.

Experience: 7 years

Educational Qualifications: Engineering degree (BE/ME/BTech/MTech/BSc/MSc). Technical certification in multiple technologies is desirable.

Mandatory skills:
- Expertise in working with multiple cloud infrastructures, including 50% on GCP, 25% on Azure, and 25% on-premises setups.
- Strong understanding of container orchestration (Kubernetes, Docker).
- Proven experience with StreamSets, Azure Data Factory, and related ETL tools, including Databricks.
- Proficiency in API development, primarily using .NET, with familiarity with Python.
- API gateway management (e.g., Apigee, DataPower).
- Hands-on experience with streaming technologies such as Kafka and StreamSets.
- Expertise in databases such as Snowflake and MongoDB, and in FHIR servers (Firely, Azure FHIR).
- Understanding of Firely and Azure FHIR services.

Good-to-have skills:
- Familiarity with healthcare payor systems and regulatory standards such as HIPAA.
- Knowledge of programming languages such as .NET and Python, and of standards such as HL7 V2/V3, ADT, FHIR, and CCDA.
- Experience with other cloud services in the GCP and Azure ecosystems.
Regards,

Abhilash S H
Blue Ocean Ventures
5555 Glenridge Connector, Suite 200
Atlanta, GA 30342
Email: [email protected]
https://www.linkedin.com/in/abhilash-s-h-ab07a5231/
Will To Serve. Will To Win. Will To Lead.
Posted: Tue Oct 15 21:51:00 UTC 2024