GCP Data Engineer at Remote, USA
Email: [email protected]
Please share GCP Data Engineer profiles only.

Role: GCP Data Engineer

Job Description:

Required Qualifications:
- Experience writing Python and PySpark scripts.
- Knowledge of Apache Beam, Apache Spark, Pub/Sub or Kafka, and IAM.
- Basic understanding of cloud networking and infrastructure.
- Hands-on experience building streaming and batch pipelines.
- Experience with Cloud Functions or Cloud Run.
- Working knowledge of CI/CD tools such as Jenkins and Cloud Build, and a code management tool such as Git.
- Excellent knowledge of SQL, including its variations for popular cloud databases such as BigQuery, Cloud SQL, and Spanner.
- Experience with relational databases such as MySQL, Oracle, and PostgreSQL.
- Development experience building ETL pipelines using cloud tools such as Dataflow and Lambda.
- Experience tuning SQL queries to maximize performance.
- Working knowledge of implementing data quality checks.
- Experience with Airflow/Composer or Tidal to orchestrate data pipelines.
- Excellent critical reasoning, problem-solving, and teamwork skills.
- Solid written and verbal communication skills; able to articulate complex solutions to technical and non-technical personnel.
- Experience working for clients in the healthcare space.

Must-Have Experience:
- Experienced in the healthcare domain.
- Hands-on with Google Cloud services for data engineering: Dataflow, Dataproc, BigQuery, Composer, Pub/Sub.
- Managed end-to-end data pipelines, from extraction at the source through landing on the Google platform and transformation.
- Implemented data management best practices: data quality and capture of metadata.
- Experience with DevOps and GKE patterns.
Fri Mar 08 00:59:00 UTC 2024