Urgent requirement for GCP Data Engineer (Remote, USA)
From: Praveen Kumar, Magicforce

Email: [email protected]

Reply to: [email protected]

Job Title: GCP Data Engineer

Location: Remote

Duration: 1+ Year

Job Description:

Mandatory skills:

Expertise in working across multiple cloud infrastructures, with roughly 50% of workloads on GCP, 25% on Azure, and 25% on-premises.

Strong understanding of container orchestration with Kubernetes (K8s) and Docker.

Proven experience with StreamSets, Azure Data Factory, and related ETL tools, including Databricks.

Proficiency in API development, primarily in .NET, and familiarity with Python.

API gateway management (e.g., Apigee, DataPower).

Hands-on experience with streaming technologies like Kafka and StreamSets.

Expertise in databases such as Snowflake and MongoDB, and in FHIR servers such as Firely and Azure FHIR.

Good-to-have skills:

Familiarity with healthcare payor systems and regulatory standards such as HIPAA.

Knowledge of programming languages such as .NET and Python, and of standards such as HL7 V2/V3, ADT, FHIR, and CCDA.

Experience with other cloud services within the GCP and Azure ecosystems.

Responsibilities:

We are seeking a highly skilled and experienced Lead Data/Cloud Engineer to join our healthcare interoperability platform team. The ideal candidate will have a strong background in data engineering, cloud infrastructure (primarily GCP and Azure), ETL processes, and API development. This role requires expertise in managing and optimizing data pipelines and in working seamlessly across multiple cloud and on-premises infrastructures.

The Lead Data/Cloud Engineer will play a pivotal role in designing and implementing a robust, scalable, and secure data infrastructure to support interoperability solutions in the healthcare domain: interacting with numerous internal and external applications, ensuring compliance with healthcare standards such as CDEX and UDAP, and integrating applications using relevant healthcare protocols such as FHIR, CCDA, and HL7.

Design, build, and maintain scalable data pipelines using StreamSets, Azure Data Factory, and, where applicable, related GCP services.

Work seamlessly across a multi-cloud environment, with 50% of infrastructure on GCP, 25% on Azure, and 25% on-premises.

Manage and optimize API development within the .NET environment and, to a lesser extent, Python.

Develop and maintain event streaming services using Kafka and StreamSets.

Administer and work with databases including Snowflake, MongoDB, and FHIR servers (Firely and Azure FHIR).

Ensure compliance with healthcare data standards such as UDAP, CDEX, FHIR, HL7, and ADT.

Collaborate with cross-functional teams to gather requirements and deliver tailored data solutions that meet business and healthcare regulatory needs.

Implement API gateways using tools like Apigee and DataPower.

Provide technical leadership and mentorship to junior data engineers.

Monitor, troubleshoot, and optimize the performance and scalability of data solutions.

Stay updated with the latest trends and technologies in healthcare interoperability and cloud services.

Posted: Tue Oct 15 19:15:00 UTC 2024
