Cloud Data Architect with Databricks Experience - Remote (USA)
Email: [email protected]
From: Durga Prasad, Spar Information Systems [email protected]
Reply to: [email protected]
Job title: Cloud Data Architect
Location: Remote
Duration: 12+ Months

Job Description:
We are seeking a highly skilled and experienced Senior Cloud Data Architect to lead the design, development, and deployment of scalable and secure cloud-based data solutions. The ideal candidate will have extensive experience in cloud platforms, data architecture, and analytics, along with excellent problem-solving skills and a strong background in designing and implementing robust data systems.

Responsibilities (as a member of the EDA architecture team):
Cloud Data Architecture: Design, develop, and implement scalable and secure cloud-based data architecture solutions on platforms such as Azure and AWS, using distributed computing on Databricks.
Data Modeling: Create and maintain data models, schemas, and structures to ensure efficient storage, retrieval, and analysis of data.
ELT Processes: Develop robust ELT (Extract, Load, Transform) processes to integrate data from various sources into the DPaaS multi-cloud data platform.
Data Security: Implement data security best practices, encryption, and access control measures to protect sensitive information.
Performance Optimization: Optimize data storage, retrieval, and processing for maximum performance and efficiency.
Collaboration: Collaborate with cross-functional teams, including data engineers, analysts, and business stakeholders, to gather requirements and deliver high-quality solutions.
Documentation: Create detailed technical documentation, including architecture diagrams, data flow charts, and system specifications.

Skills and Qualifications:
- Bachelor's or Master's degree in Computer Science required.
- Proven experience as a Cloud Data Architect designing and implementing large-scale data solutions on cloud platforms.
- Expertise in cloud technologies such as Databricks on Azure or AWS.
- Strong proficiency in data modeling, ETL processes, and data warehousing concepts.
- Experience with big data technologies (e.g., Hadoop, Spark) is required.
- Excellent programming skills in languages such as Python, Java, or Scala.
- Familiarity with data visualization tools and techniques.
- Strong problem-solving abilities and attention to detail.
- Excellent communication and interpersonal skills.
- Relevant certifications in cloud platforms and data technologies are a plus.
Posted: Thu Oct 26 00:31:00 UTC 2023