
GCP Data Architect || Remote, USA
Email: [email protected]

From: Naveen, Smart IT Frame ([email protected])

Reply to: [email protected]

Hi,

Greetings from Smart IT Frame. Hope you are doing well!

Smart IT Frame specializes in enabling you with your most critical line of resources. Whether it's for permanent staffing, contract staffing, contract-to-hire, or executive search, we understand the importance of delivering the most suitable talent, on time and within budget. With our core focus on emerging technologies, we have provided global technology workforce solutions in North America, Canada & India. We take pride in delivering specialized talent, superior performance, and seamless execution to meet the challenging business needs of customers worldwide.

Role: GCP Data Architect

Location: Remote

Type: Contract

Shift: Day, 9 AM to 6 PM EST

Required Skills: Data Architect

Roles & Responsibilities:

Guide a team of data engineers in designing, developing, testing, and deploying ETL and data pipelines and high-performance data analytics solutions in GCP.

Experience in building solution architectures, provisioning infrastructure, and delivering secure, reliable data-centric services and applications in GCP.

Implement end-to-end data analytics solutions (from data ingestion through visualization) for large-scale, complex client environments.

Analyze and understand big data and analytics technologies on GCP, and provide leadership to clients and team members in defining architecture components for data and analytics.

Ensure data security by implementing the required encryption and role-based access controls, and maintain compliance.

Guide the team in data optimization, data migration, data movement, and orchestration.

Work with implementation teams from concept to operations, providing deep technical subject-matter expertise for successfully deploying large-scale data solutions in the enterprise, using modern data and analytics technologies on-premises and in the cloud.

Knowledge of Google Cloud services for building data pipelines and leveraging intelligence capabilities.

Minimum 7 years of designing, building, and operationalizing large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with third-party tools: Spark, Cloud Dataproc, Cloud Dataflow, Apache Beam, Apache Airflow, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Functions, etc.

Minimum 2 years of hands-on experience analyzing, re-architecting, and re-platforming on-premises data warehouses to data platforms on GCP using GCP and third-party services.

Minimum 2 years of designing and building production data pipelines from ingestion to consumption within a hybrid big data architecture, using Java, Python, Scala, etc.

Minimum 2 years of architecting and implementing next-generation data and analytics platforms on GCP.

Minimum 2 years of designing and implementing data engineering solutions.


Thanks & Regards  

Navaneetha Krishnan

Senior Technical Recruiter

Smart IT Frame LLC

Direct: +1 201-201-4497 

[email protected]

https://www.linkedin.com/in/naveen-krishna-840a1619b/

www.smartitframe.com

Keywords: information technology
Posted: Tue Nov 07 2023
