Requirement // GCP Data Architect // C2H // 100% Remote, USA
Email: [email protected]
Position: GCP Data Architect
Location: Remote
Position Type: Contract to Hire

Job Description
The primary role of the Data Architect is to support and expand the Data & Analytics platforms that process, store, and organize the data critical to the data and analytics team. This role requires expertise in data movement and database technologies that support data analytics and Machine Learning (ML) / Artificial Intelligence (AI) capabilities in the cloud. This person can identify performance bottlenecks in complex enterprise applications and identify tuning opportunities.

Responsibilities
- Possess deep functional and technical understanding of Machine Learning technologies (Google Cloud Platform) and leverage them across the client's large, complex, and diverse landscape.
- Analyze performance and scalability characteristics to identify bottlenecks in large-scale distributed systems.
- Perform root cause analysis of performance issues identified by internal testing and by customers, and suggest corrective actions.
- Business acumen (a strong understanding of how the business operates and how to harness data and analytics to meet business needs); expert at designing as well as developing all layers of an application and platform.
- Lead the development of technology transitions or architecture evolutions by creating foundational examples of working solutions, and coach teams on how to build on those examples.
- Engage early in project efforts to analyze current solutions, provide solution options and recommendations, understand business process impact, and provide accurate estimates.
- Design and architect applications across multiple domains including Data Engineering, Data Science, Data Visualization, and App Development.
- Demonstrate deep understanding of business processes and technology building blocks.
- Lead collaboration with project teams and with application, data, integration, infrastructure, and security enterprise architects to develop and deploy comprehensive solution architectures.
- Ensure projects are delivered in line with the design and roadmap and to the defined standards and best practices.
- Act as a Data Architect who can design end-to-end data-driven solutions, influencing adoption by working with IT, Business, and Architecture groups.
- Create designs that consider data sources and dependencies, the development landscape, and the deployment landscape.
- Generate ideas and suggestions for process and technical improvements to the platforms and processes supported by the team.
- Architect the GCP data platform to meet the non-functional requirements of the consumption layer and of the solutions built on top of the data platform.
- Proactively identify potential performance and availability challenges, implement recommendations, and ensure that the system's capacity and availability exceed requirements while the platform achieves business results.
- Partner with solution architects, data engineers, platform engineers, and other team roles to assess the platform's needs, help design new capabilities, establish architectural roadmaps, design and run tests/proofs of concept, help troubleshoot problems, identify risks, and make recommendations.
- Communicate comfortably across all levels of the department and create artifacts that clearly deliver the message.

Qualifications
- 10+ years of experience with large-scale implementation programs preferred.
- Advanced experience with data platforms including GCP, Teradata, and SAP HANA.
- Expert in SQL, Data Engineering, data pipelines, Data Visualization, and Data Science.
- Experience with cloud computing; knowledge of Google Cloud Platform (GCP) and the Amazon Web Services (AWS) platform is preferred.
- Proven ability to manage multiple tasks, respond quickly to emergent problems, and focus both on long-range projects and on the immediate tasks required to maintain system functionality.
- Experience designing, developing, and deploying Machine Learning models.
- Demonstrated expertise in database design and modeling.
- Expert knowledge of BI reporting and data discovery tools.
- Experience with business-critical applications.
- Delivery of related information software solutions such as data warehouses and integration platforms.
- Agile development skills and experience.
- BigQuery (BQ) expert, including query performance tuning techniques.
- Familiarity with performance monitoring tools.
- Familiarity with integration patterns for the consumption layer.
- Real-time data streaming tools such as Kafka.
- Understanding of Tableau, Looker, AtScale, and LookML.
- Capacity modeling/planning processes and tools.
- Understanding of high availability and resiliency planning.
- Excellent verbal and written communication skills; bias toward action; proactive and self-driven; able to work independently.
- Architecting high-performance application data services against data stores with very large amounts of data.

Thanks & Regards,
Ashirwad Chauhan
Technical IT Recruiter
Office: 732-807-8336
[email protected]

Keywords: artificial intelligence, machine learning, business intelligence, information technology
Posted: Mon Aug 28 21:48:00 UTC 2023