
GCP Data Engineer in Dallas, Texas, USA
Email: [email protected]
From:

Sanjeev Kumar Singh,

Tek Inspirations LLC

[email protected]

Reply to:   [email protected]

Job Description -

GCP Data Engineer

Duration: 6 months, with potential extension linked to performance

Location: Dallas, TX preferred, but open to Pleasanton, CA and Phoenix, AZ; hybrid (2-3 days/week in the office)

MOI (Mode of Interview): Skype (technical interview)

Must:

We need someone who can own a project and advise the client during the migration.

It needs to be someone decisive, with experience leading this type of project.

Please look for candidates with around 10-12+ years of overall experience, in addition to the required 7+ years of Google Cloud experience.

10+ years of experience in IT

7+ years of experience with GCP

2+ years of experience with Azure

Mandatory Certifications Required: GCP Professional Architect or GCP Data Engineer.

Project Information (to share during an introductory interview, e.g., project goals, relationship between the client and GFT, location of the team members, etc.): This is for a portfolio company of Cerberus Technology Solutions, with whom we have worked for over 4 years. We don't know the name of the portfolio company yet. They are in the process of migrating their data analytics platform from Snowflake to GCP.

Do candidates need to have specific industry experience? No, but experience in data transformations for retail and e-commerce business use cases will be a plus.

Technical and General Skillset Required:

Required Qualifications

The candidate must be an SME who can advise the client on the best ways to proceed regarding Azure and/or GCP technologies;

7+ years of proven experience developing and deploying data pipelines in GCP or Azure. Must have experience in both: at least 7+ years in one and a minimum of 2-3 years with the other;

5+ years of Snowflake, BigQuery, and/or Databricks experience

5+ years of proven experience building frameworks for data ingestion, processing, and consumption using GCP Dataflow, Cloud Composer, and BigQuery.

4+ years of strong experience with SQL, Python, Java, and API development

2+ years of proven expertise in creating real-time pipelines using Kafka and Pub/Sub.

Experience building high-quality data pipelines with monitoring and observability

2+ years of experience building dashboards and reports with Power BI and/or ThoughtSpot

Preferred Qualifications

Extensive experience in data transformations for retail and e-commerce business use cases will be a plus

Bachelor's or Master's degree in computer engineering, computer science, or a related area.

Knowledge of GitHub Actions for CI/CD

Knowledge of building machine learning models

Tue Sep 19 19:22:00 UTC 2023



