Need - Data Fusion with GCP - San Jose, CA (San Jose, California, USA)
Email: [email protected]
From: pradeep, shrive technologies [email protected]
Reply to: [email protected]
Role: Data Fusion with GCP
Experience: 10+ years
Location: San Jose, CA

Job Description:
- Act as a subject matter expert in data engineering and GCP data technologies.
- Work with client teams to design and implement modern, scalable data solutions using new and emerging technologies from the Google Cloud Platform.
- Work with Agile and DevOps techniques and implementation approaches in the delivery.
- Showcase your GCP data engineering experience when communicating with clients about their requirements, turning these into technical data solutions.
- Build and deliver data solutions using GCP products and offerings.

Skills:
- Hands-on, deep experience working with Google data products (e.g. BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep, etc.).
- Experience in Spark (Scala/Python/Java) and Kafka.
- Experience in MDM, metadata management, data quality, and data lineage tools.
- End-to-end data engineering and lifecycle management (including non-functional requirements and operations).
- Regulatory and compliance work in data management.
- End-to-end solution design skills: prototyping, usability testing, and data visualization literacy.
- Experience with SQL and NoSQL modern data stores.

Keywords: artificial intelligence, computer associates
Wed Dec 07 01:40:00 UTC 2022 |