GCP Data Architect - Santa Clara, California, USA
Email: [email protected]
From: Rathna, HL Solutions ([email protected])
Reply to: [email protected]

Title: GCP Data Architect
Location: Santa Clara, CA (candidates must be available for a face-to-face interview)
Mode: Hybrid (3 days per week in the office)

Job Summary:
The GCP Data Architect will be responsible for architecting, designing, and implementing end-to-end cloud data solutions on GCP, covering data ingestion, storage, processing, and analytics. You will design and manage data workflows using Apache Airflow to automate and orchestrate complex data pipelines, and you will collaborate with cross-functional teams to ensure data solutions are scalable, secure, and meet business requirements.

Key Responsibilities:

Design & Architecture:
- Design cloud-native, scalable, and cost-effective data architectures on Google Cloud Platform (GCP) leveraging services such as BigQuery, Cloud Storage, Pub/Sub, Dataflow, and Dataproc.
- Lead the end-to-end design of data pipelines, storage strategies, and processing frameworks.
- Develop and maintain data models, including logical and physical data models for large-scale data solutions.

Data Pipeline Orchestration with Apache Airflow:
- Architect, implement, and maintain data workflows using Apache Airflow for orchestrating complex ETL/ELT processes and ensuring seamless data pipeline automation (a minimal illustrative sketch follows the responsibilities list).
- Design dynamic DAGs (Directed Acyclic Graphs) in Airflow for scheduling and managing tasks across different data platforms (GCP, on-premises, and hybrid environments).
- Integrate Apache Airflow with GCP services such as BigQuery, Cloud Storage, Dataflow, and Pub/Sub for scalable pipeline execution and monitoring.

Cloud Data Solutions:
- Architect and implement data solutions for data ingestion, data lakes, data warehouses, and real-time data processing using GCP tools.
- Design and optimize ETL/ELT pipelines using tools such as Dataflow, Apache Beam, and custom solutions.

Data Governance & Security:
- Ensure data security, privacy, and compliance (GDPR, HIPAA, etc.) in the architecture design.
- Develop data governance frameworks to manage data quality, lineage, and metadata.

Collaboration & Leadership:
- Work with stakeholders, data engineers, data scientists, and other technical teams to define data requirements and solutions.
- Provide technical leadership and mentorship to junior data engineers and architects.

Optimization & Performance:
- Monitor and optimize cloud infrastructure performance, scalability, and cost management.
- Implement best practices for data processing and storage to ensure low-latency, high-performance data operations.

Innovation & Research:
- Stay up to date with the latest GCP offerings, data technologies, and industry best practices.
- Drive innovation and propose new tools, technologies, and methodologies for optimizing the data architecture.
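Purely as an illustration of the Airflow-on-GCP orchestration described above, here is a minimal sketch of a daily DAG that loads a file from Cloud Storage into a BigQuery staging table and then runs a transformation query. Every project, bucket, dataset, and table name below is a placeholder for illustration, not a detail from this posting.

```python
# Illustrative sketch only: a daily DAG that lands a CSV from Cloud Storage in
# BigQuery, then runs an ELT transformation. All resource names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="example_gcs_to_bigquery_elt",
    schedule="@daily",               # Airflow 2.4+; older versions use schedule_interval
    start_date=datetime(2024, 1, 1),
    catchup=False,
    tags=["example"],
) as dag:
    # Load the day's raw files from a GCS bucket into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_events",
        bucket="example-raw-bucket",                              # placeholder bucket
        source_objects=["events/{{ ds }}/*.csv"],
        destination_project_dataset_table="example-project.staging.events",
        source_format="CSV",
        autodetect=True,
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform the staged data into a reporting table with a BigQuery SQL job.
    transform = BigQueryInsertJobOperator(
        task_id="transform_events",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE `example-project.analytics.daily_events` AS "
                    "SELECT event_date, COUNT(*) AS event_count "
                    "FROM `example-project.staging.events` "
                    "GROUP BY event_date"
                ),
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```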
Required Qualifications:

Experience:
- 10+ years of experience in data architecture, data engineering, or related roles.
- 5+ years of hands-on experience designing and implementing cloud-based data solutions on GCP.
- 2+ years of experience as a GCP Architect with a proven track record of architecting large-scale, cloud-native data solutions.
- Proven experience with Apache Airflow for workflow orchestration, scheduling, and automating data pipelines.
- Expertise in GCP services such as BigQuery, Cloud Storage, Pub/Sub, Dataflow, Dataproc, and Google Kubernetes Engine (GKE).

Skills:
- Strong understanding of data modeling, data warehousing, and data lakes.
- In-depth knowledge of data pipelines, ETL/ELT processes, and real-time data processing.
- Proficiency in SQL, Python, and other programming/scripting languages for data engineering.
- Experience with BigQuery performance tuning and optimization techniques.
- Familiarity with tools such as Apache Beam, Apache Kafka, and cloud-native orchestration tools like Apache Airflow.

Cloud Technologies & Architecture:
- Strong understanding of cloud-native architecture, serverless computing, and microservices in GCP.
- Knowledge of data security, encryption, and privacy best practices in the cloud.

Certifications:
- Google Cloud Professional Data Engineer certification or equivalent is a plus.
- Additional certifications in data architecture, cloud computing, or related fields are a plus.

Rathna
HL Solutions LLC
Mail: [email protected]
Direct: 609-212-1238
6136 Frisco Square Blvd, Suite 400, Frisco, TX 75034
www.hlsolutionsusa.com
Posted: Fri Dec 06 19:42:00 UTC 2024