GCP Data Engineer at 100% Remote (Remote, USA)
From: Vinkal Dhaka, Siri Info

Email: [email protected]

Reply to: [email protected]

For H-1B candidates only: the passport number is mandatory for the I-94.

Title: GCP Data Engineer

Location: 100% Remote

Contract/C2C

Job Description:

Qualifications (Required Technical Skills/Experience):

Proficiency in GCP and in-depth knowledge of GCP services, including Dataflow, BigQuery, Cloud Functions, Pub/Sub, and Composer

Strong programming skills in Python

Experience with data modeling, SQL, and EDW design.

Excellent problem-solving and analytical skills.

Strong communication and collaboration skills.

Proficiency in version control systems, particularly Git

Strong understanding of data warehouse concepts and data lakes

Good to have: Azure (Azure SQL, Data Factory, Azure Blob Storage), Terraform, and experience with Continuous Integration/Continuous Delivery (CI/CD) tools and practices
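
As a minimal sketch of the Python, SQL, and BigQuery skills listed above: the snippet below runs a parameterized query through the google-cloud-bigquery client. The project, dataset, table, and column names are hypothetical placeholders, and it assumes Application Default Credentials are configured.

# Minimal sketch: run a parameterized BigQuery query from Python.
# Project, dataset, table, and column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project ID

sql = """
    SELECT member_key, COUNT(*) AS claim_count
    FROM `example-project.example_edw.fact_claims`
    WHERE claim_date >= @start_date
    GROUP BY member_key
    ORDER BY claim_count DESC
    LIMIT 10
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
    ]
)

# query() submits the job; result() blocks until rows are available.
for row in client.query(sql, job_config=job_config).result():
    print(row.member_key, row.claim_count)

Query parameters keep the SQL safe from injection and make the statement reusable across runs.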

Roles and Responsibilities:

Data Engineering, Design and Development:

Collaborate with cross-functional teams to design and implement scalable and reliable systems on Google Cloud Platform, balancing performance, security, and cost-effectiveness.

Build data ingestion pipelines to extract data from various sources (Azure Blob Storage, Azure SQL, flat files, semi-structured sources, AWS S3) into the data warehouse in GCP.

Utilize GCP services to build robust and scalable data solutions.

Design, develop, and maintain data pipelines and implement data architecture on GCP using services such as BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Cloud Composer.

Apply expertise in the tools and technologies used to collect, clean, transform, and model data into useful information.

Leverage GCP capabilities and technologies to migrate existing databases to the cloud.

Collaborate with cross-functional teams to understand data requirements and implement scalable solutions.

Implement and optimize BigQuery tables and complex SQL queries for efficient data retrieval and query performance.
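
As a minimal sketch of the kind of pipeline described above: the following Cloud Composer (Airflow) DAG loads CSV files landed in Cloud Storage into a BigQuery staging table. The bucket, dataset, and table names are hypothetical, and it assumes the apache-airflow-providers-google package available in Composer.

# Minimal sketch: a daily Composer/Airflow DAG that loads CSV files from
# Cloud Storage into a BigQuery staging table. Bucket, dataset, and table
# names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="gcs_to_bigquery_ingest",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_claims = GCSToBigQueryOperator(
        task_id="load_claims_csv",
        bucket="example-landing-bucket",            # hypothetical bucket
        source_objects=["claims/{{ ds }}/*.csv"],   # one folder per run date
        destination_project_dataset_table="example-project.example_edw.stg_claims",
        source_format="CSV",
        skip_leading_rows=1,
        autodetect=True,
        write_disposition="WRITE_TRUNCATE",
    )

A production pipeline would typically add explicit schemas, data-quality checks, and alerting around this load step.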

EDW (Enterprise Data Warehouse) and Data Model Design:

Apply experience with data modeling, data warehousing, and ETL processes.

Work closely with business stakeholders and analysts to design and implement data models for effective data representation and analysis.

Ensure data models meet industry standards and compliance requirements in the healthcare domain.

Contribute to the design and development of the enterprise data warehouse architecture.

Implement best practices for data storage, retrieval, and security within the EDW.
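
As a minimal sketch of the EDW modeling described above: the snippet below creates a partitioned, clustered BigQuery fact table through the Python client. The dataset, table, and column names are hypothetical placeholders rather than an actual schema.

# Minimal sketch: create a partitioned, clustered fact table for a simple
# claims star schema in BigQuery. Dataset, table, and column names are
# placeholders, not a real model.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project ID

ddl = """
CREATE TABLE IF NOT EXISTS `example-project.example_edw.fact_claims` (
    claim_id       STRING  NOT NULL,
    member_key     INT64   NOT NULL,   -- FK to dim_member
    provider_key   INT64   NOT NULL,   -- FK to dim_provider
    claim_date     DATE    NOT NULL,
    billed_amount  NUMERIC,
    paid_amount    NUMERIC
)
PARTITION BY claim_date
CLUSTER BY member_key, provider_key
"""

# DDL runs as a query job; result() waits for the job to complete.
client.query(ddl).result()

Partitioning by claim date and clustering by the most common join/filter keys keeps scans, and therefore cost, proportional to the data actually queried.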

Healthcare Domain Knowledge:

Apply domain-specific knowledge to ensure that data solutions comply with healthcare industry regulations and standards.

Stay updated on industry trends and advancements in healthcare data management.

Collaboration:

Work collaboratively with cross-functional teams, including business teams, analysts, and software engineers, to deliver integrated and effective data solutions.

Participate in code reviews and provide constructive feedback to team members.

Posted: Thu Aug 08 20:24:00 UTC 2024
