
Urgent Requirement - Senior GCP Data Engineer with Hive, SQL, Spark, Scala - Sunnyvale, CA (onsite 2-3 days) at Sunnyvale, California, USA
Email: [email protected]
From: Ben, Spar Info Sys ([email protected])
Reply to: [email protected]

Hi there,

Please share a suitable profile for this role along with visa status, current location, and expected pay rate.

Role - Senior GCP Data Engineer with Hive, SQL, Spark, Scala

Location - Sunnyvale, CA-(onsite 2-3 days)

Duration: Long Term (12+ months)

Open to C2C/W2

Must-Have Skills

Data Engineering - 11 years of experience
Spark - 4-8+ years of experience
Scala - 4-8+ years of experience
GCP - 2-5+ years of experience
Hive - 8+ years of experience
SQL - 8+ years of experience
ETL Process / Data Pipelining - 8+ years of experience

Job Description: 

Client is looking for a highly energetic and collaborative Senior Data Engineer (10+ yrs) for a 12-month engagement.

Responsibilities: 

As a Senior Data Engineer, you will design and develop big data applications using the latest open-source technologies.
Experience working in an offshore, managed-outcome delivery model is desired.
Develop logical and physical data models for big data platforms.
Automate workflows using Apache Airflow. 
Create data pipelines using Apache Hive, Apache Spark, Scala, and Apache Kafka.
Provide ongoing maintenance and enhancements to existing systems and participate in rotational on-call support.
Learn our business domain and technology infrastructure quickly and share your knowledge freely and actively with others in the team.
Mentor junior engineers on the team.
Lead daily standups and design reviews.
Groom and prioritize the backlog using JIRA.
Act as the point of contact for your assigned business domain.

Requirements:
8+ years of hands-on experience with developing data warehouse solutions and data products.
4+ years of hands-on experience developing a distributed data processing platform with Hadoop, Hive, Scala, and Airflow or another workflow orchestration solution.
4+ years of experience with GCP: GCS, Dataproc, and BigQuery.
2+ years of hands-on experience in modeling (Erwin) and designing schemas for data lakes or RDBMS platforms.
Experience with programming languages: Python, Java, Scala, etc.
Experience with scripting languages: Perl, Shell, etc.
Practical experience working with, processing, and managing large data sets (multi-TB/PB scale).
Exposure to test-driven development and automated testing frameworks.
Background in Scrum/Agile development methodologies.
Capable of delivering on multiple competing priorities with little supervision.
Excellent verbal and written communication skills.
Bachelor's Degree in computer science or equivalent experience.

The most successful candidates will also have experience in the following:
Gitflow
Atlassian products - Bitbucket, JIRA, Confluence, etc.
Continuous integration tools such as Bamboo, Jenkins, or TFS

Thu Nov 30 01:34:00 UTC 2023


