
Must have: Certified GCP Data Engineer with 10-12+ years at Phoenix, AZ (Hybrid) - USC/H1B only
Email: [email protected]
Hello Connections,

Please find the job position below and share suitable profiles as soon as possible.

Role: Certified GCP Data Engineer with 10-12+ years

Location: Phoenix, AZ (Day 1 onsite; 3 days in office, 2 days WFH)

Duration: Long Term Contract

Visa: USC and H1B only

Note: Passport number is a must for H1B candidates, and a LinkedIn URL is required for all candidates.

Must Skills: Data Engineering, GCP, Data Governance, Metadata Management, DevOps, CI/CD, Python, Scala

Strong experience with GCP (not AWS/Azure)

Must be certified in GCP

Job Summary & Principal Duties:

Solid experience with, and understanding of, the considerations for large-scale solutioning and operationalization of data warehouses, data lakes, and analytics platforms on GCP is a must.

Monitor the Data Lake and Warehouse to ensure that the appropriate support teams are engaged at the right times.

Design, build, and test scalable data ingestion pipelines; perform end-to-end automation of the ETL process for the various datasets being ingested.

Participate in peer reviews and provide feedback to engineers, keeping development best practices and business and technical requirements in view.

Determine the best way to extract application telemetry data, structure it, and send it to the proper tool for reporting (Kafka, Splunk).

Work with business and cross-functional teams to gather and document requirements to meet business needs.

Provide support as required to ensure the availability and performance of ETL/ELT jobs.

Provide technical assistance and cross training to business and internal team members.

Collaborate with business partners for continuous improvement opportunities.

Requirements

JOB SPECIFICATIONS:

Education: Bachelor's degree with 10 to 12+ years of experience

Experience, Skills & Qualifications:

6+ years of experience in Data Engineering with an emphasis on Data Warehousing and Data Analytics.

4+ years of experience with one of the leading public clouds.

4+ years of experience designing and building scalable data pipelines that handle extraction, transformation, and loading.

4+ years of experience with Python and Scala, with working knowledge of Notebooks.

2+ years of hands-on experience on GCP Cloud data implementation projects (Dataflow, Dataproc, Cloud Composer, BigQuery, Cloud Storage, GKE, Airflow, etc.).

At least 2 years of experience in Data Governance and Metadata Management.

Ability to work independently, solve problems, and update stakeholders.

Analyze, design, develop, and deploy solutions per business requirements.

Strong understanding of relational and dimensional data modeling.

Experience in DevOps and CI/CD related technologies.

Excellent written and verbal communication skills, including experience in technical documentation and the ability to communicate with senior business managers and executives.

Thanks & Regards

Reena Felix

Recruitment Manager

Email: [email protected]

EliteTech Talent LLC

30 N Gould St, Sheridan, Wyoming 82801

Website / LinkedIn

Tue Jul 16 03:06:00 UTC 2024
