
GCP Data Architect @ Phoenix, AZ (Onsite) at Phoenix, Arizona, USA
Email: [email protected]
Hi,

Position: GCP Data Architect

Location: Phoenix, AZ (Day 1 Onsite)

Duration: 12+ months

Mode: Contract

Job Description:

Mandatory Skills:

Extensive experience working with GCP data-related services such as Cloud Storage, Dataflow, Dataproc, BigQuery, and Bigtable.

Very strong experience with Google Cloud Composer and Apache Airflow, including the ability to set up, monitor, and debug a complex environment running a large number of concurrent tasks (a minimal DAG sketch follows this list).

Good exposure to RDBMS / SQL fundamentals.

Exposure to Spark, Hive, GCP Data Fusion, Astronomer, Pub/Sub messaging, Vertex AI, and the Python programming language.
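
Because the Composer / Airflow requirement is central to this role, here is a minimal, hypothetical Airflow DAG sketch showing the kind of concurrency controls the posting alludes to; the DAG id, task names, owner, and concurrency limits are illustrative assumptions, not values taken from the job description.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-platform",              # assumed team name
    "retries": 2,                          # retry transient task failures
    "retry_delay": timedelta(minutes=5),
    "execution_timeout": timedelta(hours=1),
}

with DAG(
    dag_id="example_high_concurrency_dag",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
    max_active_runs=1,        # only one DAG run at a time
    max_active_tasks=32,      # cap on tasks running concurrently in this DAG
    default_args=default_args,
) as dag:
    # Fan out many independent tasks; the scheduler and the limits above
    # determine how many actually run at once.
    extract_tasks = [
        BashOperator(
            task_id=f"extract_partition_{i}",
            bash_command=f"echo 'processing partition {i}'",
        )
        for i in range(100)
    ]

    load = BashOperator(
        task_id="load_to_warehouse",   # placeholder downstream step
        bash_command="echo 'load step placeholder'",
    )

    extract_tasks >> load

In Cloud Composer, the same concurrency behavior is typically tuned further through environment-level Airflow configuration and worker sizing.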

Minimum Qualifications:

Bachelor's degree in Engineering or Computer Science (or equivalent), or a Master's in Computer Applications (or equivalent).

Solid experience with, and understanding of, the considerations for large-scale architecting, solutioning, and operationalization of data warehouses, data lakes, and analytics platforms on GCP is a must.

Create detailed target-state technical, security, data, and operational architecture and design blueprints that incorporate modern data technologies and cloud data services and demonstrate the modernization value proposition.

Minimum of 12 years of designing, building, and operationalizing large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with third-party tools - Spark, Hive, Cloud Dataproc, Cloud Dataflow, Apache Beam / Composer, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Storage, Cloud Functions, and GitHub - performing detailed assessments of current-state data platforms and creating an appropriate transition path to GCP (a minimal streaming pipeline sketch follows this list).

Experience with data lake and data warehouse ETL build and design.

Experience with Google Cloud services such as Streaming + Batch, Cloud Storage, Cloud Dataflow, Dataproc, DFunc, BigQuery, and Bigtable.

Proven ability in one or more of the following programming or scripting languages: Python, JavaScript, Java.
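
As an illustration of the Dataflow / Beam / Pub/Sub / BigQuery combination named above, the following is a minimal, hypothetical streaming pipeline sketch; the project, topic, bucket, table, and runner settings are placeholder assumptions rather than details from the posting.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Placeholder option values; a real deployment would supply its own
    # project, region, bucket, and runner.
    options = PipelineOptions(
        streaming=True,                   # Pub/Sub sources require streaming mode
        project="example-project",
        region="us-central1",
        temp_location="gs://example-bucket/tmp",
        runner="DataflowRunner",          # use "DirectRunner" for local testing
    )

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/events"
            )
            | "ParseJson" >> beam.Map(json.loads)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="example-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()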

Regards,

Priyanka

Lead Recruiter

Net 2 Source Inc. | Address: 270 Davidson Ave, Suite 704, Somerset, NJ 08873, USA

