
Job Title: GCP Data Engineer | Location: Remote | Duration: 6 Months | Remote, USA
Email: [email protected]
From: Suresh, VYZE INC ([email protected])

Reply to: [email protected]

Hi, hope you are doing great. Please go through the job description below and send me your consultant's updated resume along with visa status and current location.

Mandatory Certifications Required: GCP Professional Architect or GCP Data Engineer

Job Title: GCP Data Engineer

Location: Remote

Duration: 6+ Months

Client: Cerberus

Visa: GC/USC Only

Target Start Date: Immediate start

Job description:

The client needs someone who can own the project and advise them during the migration.

It needs to be someone decisive, with experience leading this type of project.

Please look for candidates with around 10-12+ years of overall experience, in addition to the required 7+ years of Google Cloud experience.

10+ years of experience in IT

7+ years of experience with GCP cloud

2+ years of experience with Azure

Project Information (to share during an introductory interview, e.g., project goals, relationship between the client and GFT, location of team members, etc.): This is for a portfolio company of Cerberus Technology Solutions, with whom we have worked for over 4 years. We don't know the name of the portfolio company yet. They are in the process of migrating their data analytics platform from Snowflake to GCP.

Do candidates need to have specific industry experience? No, but experience in data transformations for retail and e-commerce business use cases will be a plus.

Technical and General Skillset Required:

Required Qualifications:

The candidate must be an SME who is able to advise the client on the best ways to proceed regarding Azure and/or GCP technologies.

7+ years of proven experience in developing and deploying data pipelines in GCP or Azure. Must have experience in both: at least 7+ years in one and a minimum of 2-3 years with the other.

5+ years of Snowflake, BigQuery and/or Databricks experience

5+ years of proven experience in building frameworks for data ingestion, processing, and consumption using GCP Dataflow, Cloud Composer, and BigQuery.

4+ years of strong experience with SQL, Python, Java, and API development.

2+ years of proven expertise in creating real-time pipelines using Kafka and Pub/Sub.

Building high-quality data pipelines with monitoring and observability.

2+ years of experience building dashboards and reports with Power BI and/or ThoughtSpot.

Thanks and Regards,

Suresh Nayak

Technical Recruiter

Vyze INC

Email: [email protected]

25179 Methley Plum Place, Aldie, VA 20105

www.vyzeinc.com

Disclaimer:
This communication, along with any documents, files or attachments, is intended only for the use of the addressee and may contain confidential information. If you are not the intended recipient, you are hereby notified that any dissemination, distribution or copying of any information contained in or attached to this communication is strictly prohibited. To remove your email address permanently from future mailings, please send REMOVE to
[email protected]
