GCP Data Engineer with Python and Batch Processing at Remote, USA
Email: [email protected]
From: Vandana, CCIT <[email protected]>
Reply to: [email protected]

Hi,

Hope you are doing great! Please review the job description below and let me know your interest, along with your updated resume.

GCP Data Engineer with Python and Batch Processing
Location: Remote (San Francisco, CA); an on-site visit to San Francisco, CA is required once per quarter
Duration: Contract/Full-time

Job Description:
4+ years of professional experience with stream/batch processing systems at scale.
Strong programming skills in Java and Python.
Experience with public cloud is a must; experience with GCP and GCP managed services is a strong plus.
i. Experience with messaging/stream processing systems on the cloud such as Pub/Sub, Kafka, Kinesis, Dataflow, Flink, etc., and/or
ii. Experience with batch processing systems such as Hadoop, Pig, Hive, or Spark. Experience with Dataproc is a strong plus.
Knowledge of DevOps principles and tools (e.g., CI/CD, IaC/Terraform).
Strong understanding of containerization technologies (e.g., Docker, Kubernetes).
Strong problem-solving and critical-thinking skills.
Strong written/verbal communication skills, with the ability to thrive in a remote work environment.
(For senior leads/architects) Ability to explore new areas and problems, design and architect scalable stream/batch processing solutions at scale, and technically lead a team of engineers on a project/component.

Thanks & Regards,
Vandana | [email protected]
Technical Recruiter
Mon Apr 15 23:32:00 UTC 2024