Need GCP Data Engineer // Remote // 10+ Years Exp Only (Remote, USA)
Email: [email protected]
Role: GCP Data Engineer
Location: Bentonville, AR (Remote)
Duration: 12+ Months

Skills required:
- Spark, Scala (50%)
- GCP, Dataproc, GCS, Data Lake (25%)
- Airflow, Hive, BigQuery (25%)

Description: UST Global is looking for a highly energetic and collaborative Senior Data Engineer (10+ years) for a 12-month engagement.

Requirements:
- 8+ years of hands-on experience developing data warehouse solutions and data products.
- 4+ years of hands-on experience developing a distributed data processing platform with Hadoop, Hive, Scala, and Airflow or another workflow orchestration solution.
- 4+ years of experience with GCP, GCS, Dataproc, and BigQuery.
- 2+ years of hands-on experience in modeling (Erwin) and designing schemas for data lakes or RDBMS platforms.
- Experience with programming languages: Python, Java, Scala, etc.
- Experience with scripting languages: Perl, Shell, etc.
- Practice working with, processing, and managing large data sets (multi-TB/PB scale).
- Exposure to test-driven development and automated testing frameworks.
- Background in Scrum/Agile development methodologies.
- Capable of delivering on multiple competing priorities with little supervision.
- Excellent verbal and written communication skills.
- Bachelor's degree in computer science or equivalent experience.

The most successful candidates will also have experience with the following:
- Gitflow
- Atlassian products: Bitbucket, JIRA, Confluence, etc.
- Continuous integration tools such as Bamboo, Jenkins, or TFS

Keywords: information technology, Arkansas
Posted: Mon Sep 25 21:05:00 UTC 2023