
Very Urgent: GCP Data Architect in Phoenix, Arizona, USA
Email: [email protected]
From:

Vivek Pandey,

TEK Inspirations

[email protected]

Reply to: [email protected]

Title:  GCP Data Architect

Location: Phoenix, AZ (Day 1 onsite / Hybrid Model)

Duration: 12+ months

Customer: Virtusa / American Express

Visa: USC, GC, H1B

Mandatory Skills:

Extensive experience working with GCP Data-related Services such as Cloud Storage, Dataflow, Dataproc, BigQuery, Bigtable

Very strong experience with Google Composer and Apache Airflow; ability to set up, monitor, and debug a complex environment running a large number of concurrent tasks

Good Exposure to RDBMS / SQL fundamentals

Exposure to Spark, Hive, GCP Data Fusion, GCP Astronomer, Pub/Sub Messaging, Vertex, and the Python Programming Language

Minimum Qualifications:

Bachelor's degree in Engineering or Computer Science (or equivalent), OR a Master's in Computer Applications (or equivalent).

Solid experience with, and understanding of, the considerations for large-scale architecting, solutioning, and operationalization of data warehouses, data lakes, and analytics platforms on GCP is a must.

Create detailed target-state technical, security, data, and operational architecture and design blueprints that incorporate modern data technologies and cloud data services and demonstrate the modernization value proposition.

Minimum of 12 years of designing, building, and operationalizing large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with third-party tools: Spark, Hive, Cloud Dataproc, Cloud Dataflow, Apache Beam / Composer, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Storage, Cloud Functions, and GitHub.

Perform detailed assessments of current-state data platforms and create an appropriate transition path to GCP cloud.

Experience with data lake and data warehouse ETL build and design.

Experience with Google Cloud services such as Streaming + Batch, Cloud Storage, Cloud Dataflow, Dataproc, DFunc, BigQuery, and Bigtable.

Proven ability in one or more of the following programming or scripting languages: Python, JavaScript, Java.

Vivek Pandey

Technical Recruiter

Email: [email protected]

TEK Inspirations LLC: 13573 Tabasco Cat Trail, Frisco, TX 75035

Disclaimer:
If you are not interested in receiving our e-mails, please reply with "REMOVE" in the subject line to [email protected], and include all e-mail addresses to be removed, along with any addresses that might be forwarding these e-mails to you. We are sorry for the inconvenience.




