Senior GCP Data Engineer: Remote: 13+ Years only | Remote, USA
Email: [email protected]
Senior Data Engineer - 10 to 15 years of proven experience in modern cloud data engineering, broad exposure to the wider data landscape, and solid software engineering experience.

- Prior experience architecting and building successful self-service, enterprise-scale data platforms in a greenfield environment with a microservices-based architecture.
- Proficiency in building end-to-end data platforms and data services in GCP is a must.
- Proficiency in tools and technologies: BigQuery, Cloud Functions, Cloud Run, Dataform, Dataflow, Dataproc, SQL, Python, Airflow, Pub/Sub.
- Experience with microservices architectures (Kubernetes, Docker). Our microservices are built on a TypeScript, NestJS, Node.js stack; candidates with this experience are preferred.
- Experience building semantic layers.
- Experience architecting, designing, and developing batch and real-time streaming infrastructure and workloads.
- Solid experience architecting and implementing metadata management, including data catalogues, data lineage, data quality, and data observability for big data workflows.
- Hands-on experience with the GCP ecosystem and data lakehouse architectures.
- Strong understanding of data modeling, data architecture, and data governance principles.
- Excellent experience with DataOps principles and test automation.
- Excellent experience with observability tooling: Grafana, Datadog.

Thanks & Regards,
Trayambkeshwer Dwivedi (Trayam), Sr. Technical Recruiter
LinkedIn: linkedin.com/in/trayambkeshwar-dwivedi-792283218
Thu Jun 13 00:53:00 UTC 2024