
Requirement for Big Data Engineer with PySpark and Hive - 2 positions - Phoenix, AZ - Onsite from Day 1
Email: [email protected]
From:

SAPNA,

ITECS

[email protected]

Reply to:   [email protected]

Position Title: Big Data Engineer with PySpark and Hive (2 roles)

Start Date: ASAP

Location: Phoenix, AZ (onsite from Day 1)

Skills Required:

Technical Skills:

Big Data Engineering
PySpark, Hive, ETL automation, data pipeline optimization.

Must have 7+ years of experience (preferably more) working with these technologies, with a detailed understanding of how PySpark and Hive are used to optimize SQL queries and data pipelines.

Entry-level candidates will not be considered for these roles.

Other Key Skills:

Must be able to work autonomously. 

Able to collect requirements from business stakeholders and drive solution design and implementation.

May lead other team members.

IMPORTANT:
Please include a pre-screened assessment record for the respective skill set when submitting a profile.

Keywords: Arizona
[email protected]
View all
Tue Jul 18 13:06:00 UTC 2023


