
Immediate Job opportunity for Big Data Engineer (GCP, Spark, Java) Phoenix, AZ (Open to Remote with willingness to relocate) at Phoenix, Arizona, USA
Email: [email protected]
From: Ravikanth, Roha Technologies LLC ([email protected])

Reply to:   [email protected]

Dear Partners,

I hope this email finds you well.

My name is Ravikanth Naraharisetty, and I am reaching out to share an exciting job opportunity at RohaTech. We are currently looking for a skilled Big Data Engineer with expertise in GCP, Hadoop, Spark, and Java to join our team in Phoenix, AZ.

About Us:

RohaTech LLC is a leading recruitment company specializing in technical and functional roles. We are at the forefront of innovation, constantly striving to improve our technology stack and provide exceptional solutions to our clients.

Job Title: Big Data Engineer (GCP, Spark, Java)

Location: Phoenix, AZ (Open to Remote with willingness to relocate)

Employment Type: Contract

Job Summary:

We are seeking a skilled and experienced Big Data Engineer with expertise in Google Cloud Platform (GCP), Apache Spark, and Java to join our dynamic team. The ideal candidate will have hands-on experience in building, managing, and optimizing large-scale data processing systems using GCP services and Spark, combined with strong Java programming skills. You will be responsible for designing and implementing data pipelines, ensuring the scalability, reliability, and performance of data processing workflows.

Key Responsibilities:

Design, develop, and maintain large-scale data processing systems using Apache Spark on GCP.

Build and optimize data pipelines to process structured and unstructured data from various sources.

Develop and implement data solutions that are scalable, resilient, and secure.

Collaborate with data scientists, data analysts, and other engineers to understand data requirements and translate them into technical solutions.

Optimize and troubleshoot complex data processing workflows for performance and reliability.

Write efficient and maintainable Java code for data processing and integration tasks.

Implement best practices for data governance, data security, and data quality.

Work with GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage to manage and process data.

Monitor, debug, and optimize Spark jobs and GCP-based data solutions to ensure they meet business requirements.

Stay updated with the latest industry trends and technologies in Big Data and Cloud Computing.

Required Qualifications:

Bachelor's degree in Computer Science, Information Technology, or a related field.

5+ years of hands-on experience in Big Data technologies, particularly with Apache Spark.

3+ years of experience with Google Cloud Platform (GCP), including services like BigQuery, Dataflow, Pub/Sub, and Cloud Storage.

Strong proficiency in Java programming with experience in building data processing applications.

Proven experience in designing and implementing ETL/ELT pipelines in a big data environment.

Solid understanding of distributed computing principles and data modeling techniques.

Familiarity with data warehousing concepts and tools.

Strong problem-solving skills and ability to troubleshoot complex data processing issues.

Experience with version control systems like Git.

Excellent communication skills and ability to work in a collaborative environment.

Preferred Qualifications:

Master's degree in Computer Science, Information Technology, or a related field.

Experience with other programming languages such as Python or Scala.

Knowledge of other big data tools and technologies, such as Hadoop, Kafka, or Flink.

Experience with CI/CD pipelines and containerization technologies like Docker and Kubernetes.

Certifications in Google Cloud Platform (GCP) or related technologies.

This is a friendly reminder to include the following documents with your submission:
Visa
Driver's License
Passport Proofs

For a successful screening, it is mandatory to provide all the requested details below. Please avoid submitting incomplete or junk profiles.

We appreciate your help in this matter!

Submission Details

Full Name:-  

Phone Number:-  

Email:-                            

Linkedin:-

Current Location:-  

Open to Relocation:-

Passport Number:-  

Visa Status:-

DOB(Including Year):-  

Total Experience:-

US Experience:-

Relevant Experience:-

Complete Education:-

Availability to Join:-

H1 Employer Details

Name:

Email:

Contact : 

Your prompt response to this request is highly appreciated. If you have any additional information or specific requirements, please feel free to include them in your submission.

Thank you for your cooperation, and we look forward to receiving the necessary details.

Best regards, 

Ravikanth Naraharisetty, RohaTech

+91-7036012291 (WhatsApp)

[email protected]

https://www.linkedin.com/in/raviroha/



