Urgent Need: Enterprise Data Modeler in Richmond, VA (Richmond, Virginia, USA)
Email: [email protected]
From: Gaurav Chaudhary, Source Infotech [email protected] (reply to: [email protected])

Hello,

Hope you are doing well. I have a position for an Enterprise Data Modeler in Richmond, VA with our client. Kindly review the job description below and see if it might be an option worth considering.

Client: The Virginia Department of Transportation (VDOT)
Job Title: Enterprise Data Modeler
Contract Length: 12+ months
Work Location: 1401 East Broad Street, Richmond, Virginia 23219 - Hybrid (2-3 days onsite in Richmond, VA); local candidates needed
Interview: Video interview
Visa: GC/USC/H4/GC EAD

The Virginia Department of Transportation (VDOT) Information Technology Division is seeking a senior Data Modeler to develop data models for data assets and to support the implementation of a cloud-based data management platform for the agency. The Enterprise Data Modeler provides expert support across the enterprise information framework, analyzes and translates business needs into long-term solution data models by evaluating existing systems, and works with business and data architects to create conceptual data models and data flows. The role develops best practices for data asset development, ensures consistency within the system, and reviews modifications to existing cross-compatible systems. It also optimizes data systems, evaluates implemented systems for variances, discrepancies, and efficiency, and maintains logical and physical data models along with accurate metadata.

Responsibilities:
- Create conceptual data models to identify key business entities, visualize their relationships, and define concepts and rules.
- Translate business needs into data models.
- Build logical and physical data models for the client hierarchy.
- Document data designs for the team.
- Populate use case data into the physical model using a mix of SQL, stored functions, and stored procedures.
- Analyze performance issues and mitigate them through constant evaluation.
- Knowledge of data modeling tools such as ER/Studio, ERwin, or similar.
- Familiarity with data warehousing concepts, star schema, snowflake schema, and data normalization techniques.
- Understanding of data security, encryption, and compliance requirements in the cloud.
- Proficiency in SQL, including writing complex queries, stored procedures, and functions.
- Experience with NoSQL databases and an understanding of their data modeling techniques.
- Present and communicate modeling results and recommendations to internal stakeholders and development teams, and explain features that may affect the physical data model.
- Ability to work closely with data architects, data engineers, and other stakeholders to gather requirements and translate them into data models.
- Ensure and enforce a governance process to oversee implementation activities and ensure alignment with the defined architecture.
- Perform data profiling and analysis activities that help establish, modify, and maintain data models.
- Develop canonical models and Data-as-a-Service models; knowledge of SOA to support integrations.
- Analyze data-related system integration challenges and propose appropriate solutions with a strategic approach.
- Strong professional experience with the Inmon and Kimball methodologies, with the ability to demonstrate good practical use case examples from prior work experience.
- Strong understanding of the Data Vault methodology and the ability to demonstrate its effectiveness against Inmon and Kimball.
- Strong understanding of the Azure platform and working knowledge of Microsoft Azure Data Lake Storage (ADLS) Gen 2 and zone-based architecture.
- Experience in designing partitioning, folder structures, and file naming conventions optimized for a data lake.
- Knowledge of data formats for data lakes (Parquet, ORC, Avro, XML, and JSON).
- Working knowledge of design patterns for data lakes.
- Knowledge of integrating a data lake with Azure Synapse for analytics and reporting.
- Perform data profiling and analysis to maintain data models.
- Develop and support the usage of the MDM toolkit.
- Integrate source systems into the MDM solution.
- Implement business rules for data reconciliation and deduplication.
- Enforce data models and naming standards across deliverables.
- Establish processes for governing the identification, collection, and use of corporate metadata; take steps to ensure metadata accuracy and validity.
- Establish methods and procedures for tracking data quality, completeness, data redundancy, and improvement.
- Conduct data capacity planning, life cycle, duration, and usage requirement analyses, feasibility studies, and other tasks.
- Create strategies and plans for data security, backup, disaster recovery, business continuity, and archiving.
- Ensure that data strategies and architectures are in regulatory compliance.
- Experience with big data platforms and tools such as Azure Databricks, Azure HDInsight, or similar.
- Familiarity with big data technologies such as Hadoop, Spark, and Hive.
- Willingness to stay updated with the latest trends and advancements in cloud data modeling and Azure services.
- Good knowledge of applicable data privacy practices and laws.
- Strong written and oral communication skills.
- Strong presentation and interpersonal skills, with the ability to present ideas in user-friendly language.
- Strong analytical and problem-solving skills.
- Excellent communication skills to collaborate with cross-functional teams.
- Attention to detail and the ability to work independently.
- Experience with Azure data integration tools such as Azure Data Factory, Azure Stream Analytics, and Azure Logic Apps.
- Knowledge of ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes.
- Familiarity with Azure DevOps and CI/CD pipelines for data solutions.
- Familiarity with Azure security features such as Azure Key Vault, Azure Policy, and Azure Blueprints.
- Ability to create comprehensive documentation for data models, data dictionaries, and design specifications.
- Familiarity with Azure SDKs and APIs for automation and integration.
- Familiarity with Postman and an understanding of using it to work with APIs.
- Experience in writing queries (SQL, Python, R, Scala) as needed, and experience with various data technologies such as Azure Synapse or SQL Server, Snowflake, and Databricks.

Gaurav Chaudhary
[email protected]
+1 609 991 9440 EXT 162
Thu May 23 03:19:00 UTC 2024