GCP Data Engineer with strong Data Fusion (REMOTE) | USA - Remote
Email: [email protected]
From: Sharath Kumar, Predicaz <[email protected]>
Reply to: [email protected]

GCP Data Engineer with strong Data Fusion (REMOTE)
Location: USA - Remote
Mandatory Skill: GCP Data Fusion
Experience Needed: 9-10 years

Key Roles and Responsibilities:
We are looking for a GCP Data Engineer with enterprise-level experience designing, building, and operationalizing large-scale data solutions and applications using one or more GCP data and analytics services in combination with third-party tools: Spark, Hive, Databricks, Cloud Dataproc, Cloud Dataflow, Composer, BigQuery, Cloud Pub/Sub, Cloud Storage, Cloud Functions, and Data Fusion.
- Experience with Google Cloud Platform (especially BigQuery)
- Experience developing scripts for loading data into BigQuery from external data sources
- Strong proficiency in writing SQL and tuning BigQuery SQL
- Expert-level knowledge of BigQuery and experience with BigQuery optimization
- Experience extracting data from a variety of sources, and a desire to expand those skills (working knowledge of SQL and Spark is mandatory)
- Experience with GCP Data Fusion
- Experience working in GCP and Google BigQuery
- Strong SQL knowledge; able to translate complex scenarios into queries
- Strong programming experience in Python
- Architecture experience, including data modeling and mapping
- Design, build, and deploy data pipelines (batch and streaming) in the data lake using Hadoop technology stacks and languages such as Hive, PySpark, Python, Spark, and Spark Streaming
- Design, build, and deploy error handling, data reconciliation, audit log monitoring, and job scheduling using PySpark
- Develop and implement coding best practices using Spark, Python, and PySpark
- Develop data models and structure for the data lake to ensure alignment with the data domain, integration needs, and efficient access to the data
- Strong programming skills in Python, Java, or another major language
Must have:
- SQL
- BQ/SF
- GCP Data Fusion
- Python
- Big Data / Hadoop

Good to have:
- Kafka
- Containerization / Kubernetes
- API

Regards,
Sharath Kumar
[email protected]
Predica Inc.
T: +1-609-473-9146 | A: 26211 Central Park Boulevard, Suite 502, Southfield, MI 48076
W: www.predicaz.com
Wed Jan 04 22:00:00 UTC 2023