Hybrid Req // GCP Data Engineer (Big Data) // Contract at Sunnyvale, CA, USA
Email: [email protected]
Job Description

Role: GCP Data Engineer (Big Data)
Location: Sunnyvale, CA (Hybrid Mode)
Job Type: Contract

Should have good experience in Spark or Scala, GCS (Google Cloud Storage), Dataproc, and BigQuery.

Required skills/experience:
- 5+ years of experience, including 3+ years with Spark or Scala
- 2+ years of Hadoop/Big Data experience using tools such as Hive, Spark, PySpark, Scala, and RDBMS/SQL

Strongly Preferred: GCP, including GCS (Google Cloud Storage), Dataproc, and BigQuery

Full Job Description:
Designs, develops, and implements Hadoop-ecosystem-based applications to support business requirements. Follows approved life-cycle methodologies, creates design documents, and performs program coding and testing. Resolves technical issues through debugging, research, and investigation.

Experience/Skills Required:
- 5 years of experience in computer programming, software development, or a related field
- 3+ years of solid Scala or Spark, and 2+ years of experience in the design, implementation, and support of big data solutions in Hadoop using Hive, Spark, Scala, and SQL
- Hands-on experience with Unix, Teradata, and other relational databases
- Experience with @Scale a plus
- Strong communication and problem-solving skills

Strongly Preferred: GCP, including GCS (Google Cloud Storage), Dataproc, and BigQuery

Regards,
Vishal Pathak
Senior IT Recruiter
Work: 571-507-8631
Email ID: [email protected]
https://www.linkedin.com/in/vishalpathak28091997/
Arkhya Tech Inc. | www.arkhyatech.com