Data Architect | Remote, USA
Email: [email protected]
From: Vinod Katkam, Agile Enterprise Solutions, [email protected]
Reply to: [email protected]
Tech Mahindra

Description

Position :- Data Architect
Location :- Remote

Minimum 3 recent projects as an Architect (or 6 years of experience as an architect) required.

Deliverables:
High Level Designs and Business Rule identification
Detail Designs and Business Rule identification
Data Exploration, Discovery, and Profiling
High Level Data Exploration, Discovery, and Profiling
Conceptual and Functional Data Detail Flow Designs
Detail Data Design, Organization, and Solutioning
Erwin Data Model Creation
Data Architecture Metadata
Source to Target Mappings
High Level Source to Target Mappings

Requirements:
Minimum 6 years in a Data Architect role supporting warehouse and big data platforms/environments.
Minimum 2 years of experience in a Sr. Data Architect role, including with cloud data platforms.
Highly experienced in understanding data relationships, with strong data exploration skills using SQL.
Deep understanding of data flows, data taxonomy and organization, data lineage, and data virtualization.
Strong Data Management and Data Governance knowledge and implementation experience.
Experience implementing data security frameworks and enterprise data architecture best practices.
Experience creating high-level and detailed data architecture and design documentation.
Business Analyst mindset/aptitude to understand domain data requirements for design deliverables.
Data Warehouse data modeling using Erwin: Data Flow, ER Diagram, Conceptual, Logical, and Physical.
Experience architecting end-to-end data solutions for both batch and real-time designs.
Metadata and documentation management required for Erwin modeling and data cataloging.
Strong data design experience and knowledge applicable to big data architectures and cloud data lake environments (i.e., very strong data analysis and design experience on Teradata and GCP platforms for curation and analytics use cases).
Experience working collaboratively with clients, developers, and architecture teams to understand requirements and design and implement enterprise-level data solutions.
GCP cloud data (including BigQuery) experience is required.
Teradata implementation experience is strongly preferred.
Experience with Data Fabric and Data Mesh architecture is preferred.
Experience with Starburst (data virtualization) and IBM Cloud Pak (data governance) is preferred.
Excellent verbal and written communication skills are a must.
Not looking for data engineers.
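As a brief illustration of the kind of SQL-based data exploration and profiling work listed above, here is a minimal sketch in BigQuery standard SQL; the project, dataset, table, and column names are hypothetical and not taken from the posting.

-- Hypothetical profiling query for an assumed table `my_project.curated.customer`:
-- row count, null rates, distinct counts, and value range for selected columns.
SELECT
  COUNT(*)                      AS row_count,
  COUNTIF(customer_id IS NULL)  AS null_customer_id,
  COUNT(DISTINCT customer_id)   AS distinct_customer_id,
  COUNTIF(email IS NULL)        AS null_email,
  COUNT(DISTINCT email)         AS distinct_email,
  MIN(created_at)               AS earliest_created_at,
  MAX(created_at)               AS latest_created_at
FROM `my_project.curated.customer`;

Results of this kind of query typically feed the Data Exploration, Discovery, and Profiling and Source to Target Mapping deliverables named above.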
Posted: Mon Feb 26 18:47:00 UTC 2024