Urgent need: GCP Data Engineer // Remote // 12+ Years Exp Only (Remote, USA)
Email: [email protected]
GCP Data Engineer
Location: Remote (candidates around Grand Rapids, MI will have an advantage)
Duration: 6+ months
Primary skills: GCP, BigQuery, GCP data engineering stack

JD:
- 10 to 15 years of proven experience in modern cloud data engineering, broad exposure to the wider data landscape, and solid software engineering experience.
- Prior experience architecting and building successful self-service, enterprise-scale data platforms in a greenfield environment with a microservices-based architecture.
- Proficiency in building end-to-end data platforms and data services in GCP is a must.
- Proficiency in tools and technologies: BigQuery, Cloud Functions, Cloud Run, Dataform, Dataflow, Dataproc, SQL, Python, Airflow, Pub/Sub.
- Experience with microservices architectures: Kubernetes, Docker. Our microservices are built on the TypeScript, NestJS, Node.js stack; candidates with this experience are preferred.
- Proficiency in architecting, designing, and developing batch and real-time streaming infrastructure and workloads.
- Solid experience architecting and implementing metadata management, including data catalogs, data lineage, data quality, and data observability for big data workflows.
- Hands-on experience with the GCP ecosystem and data lakehouse architectures.
- Strong understanding of data modeling, data architecture, and data governance principles.
- Excellent experience with DataOps principles and test automation.
- Excellent experience with observability tooling: Grafana, Datadog.
- Experience building semantic layers for data platforms.
- Experience with Data Mesh architecture.
- Experience building scalable IoT architectures.

Keywords: information technology, Michigan
Tue Jun 11 22:22:00 UTC 2024