
Enterprise Data Modeler: Richmond, VA (Hybrid): USC GC
From: Vivek Paliwal, kpg99

Email: [email protected]

Reply to: [email protected]

MENTION VISA AND LOCATION

ONLY USC OR GC

Role     : Enterprise Data Modeler

Location : Richmond, VA (Hybrid; locals only)

Duration : Long term

Visa     : USC, GC

Job Description:                

Must have an active and up-to-date LinkedIn profile.

Visa must be valid through at least 2025.

Recent work experience with enterprise clients.

MUST HAVE STRONG COMMUNICATION SKILLS

AZURE

SQL

DATA LAKE

Responsibilities:

Create a conceptual data model to identify key business entities, visualize their relationships, and define concepts and rules.

Translate business needs into data models; build logical and physical data models for the client hierarchy; document data designs for the team.

Populate use-case data into the physical model using a mix of SQL, stored functions, and stored procedures.
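
To illustrate this responsibility, here is a minimal sketch of populating a physical model for a client hierarchy with use-case data via plain SQL, using SQLite for portability. All table and column names are illustrative assumptions, not a client standard.

```python
import sqlite3

# Illustrative physical model: a self-referencing client table models the hierarchy.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE client (
        client_id   INTEGER PRIMARY KEY,
        client_name TEXT NOT NULL,
        parent_id   INTEGER REFERENCES client(client_id)  -- NULL marks the hierarchy root
    )
""")

# Use-case rows: a parent enterprise with two subsidiaries.
rows = [
    (1, "Acme Holdings", None),
    (2, "Acme East", 1),
    (3, "Acme West", 1),
]
cur.executemany("INSERT INTO client VALUES (?, ?, ?)", rows)

# Verify the hierarchy with a self-join (child -> parent).
cur.execute("""
    SELECT c.client_name, p.client_name
    FROM client c LEFT JOIN client p ON c.parent_id = p.client_id
    ORDER BY c.client_id
""")
hierarchy = cur.fetchall()
print(hierarchy)
# [('Acme Holdings', None), ('Acme East', 'Acme Holdings'), ('Acme West', 'Acme Holdings')]
```

In a production warehouse the same pattern would typically live in a stored procedure rather than application code.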

Analyze performance issues and mitigate them through continuous evaluation.

Knowledge of data modeling tools like ER/Studio, ERwin, or similar.

Familiarity with data warehousing concepts, including star schema, snowflake schema, and data normalization techniques.

Understanding of data security, encryption, and compliance requirements in the cloud.

Proficiency in SQL, including writing complex queries, stored procedures, and functions.

Experience with NoSQL databases and understanding of their data modeling techniques.

Present and communicate modeling results and recommendations to internal stakeholders and development teams, and explain features that may affect the physical data model.

Ability to work closely with data architects, data engineers, and other stakeholders to gather requirements and translate them into data models.

Ensure and enforce a governance process to oversee implementation activities and ensure alignment to the defined architecture.

Perform data profiling and analysis activities that help establish, modify, and maintain the data model.

Develop canonical models and Data-as-a-Service models; knowledge of SOA to support integrations.

Analyze data-related system integration challenges and propose appropriate solutions with a strategic approach.

Strong professional experience with the Inmon and Kimball methodologies, with the ability to demonstrate practical use-case examples from prior work.

Strong understanding of the Data Vault methodology and the ability to demonstrate its effectiveness relative to Inmon and Kimball.

Strong understanding of the Azure platform and working knowledge of Azure Data Lake Storage (ADLS) Gen2 and zone-based architecture.

Experience designing partitioning, folder structures, and file-naming conventions optimized for a data lake.
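
As a concrete illustration of such a convention, the sketch below builds a zone-based, date-partitioned lake path. The zone names, ordering, and date partitioning are assumptions for illustration, not any client's actual standard.

```python
from datetime import date

# Hypothetical zone layout for an ADLS Gen2 zone-based architecture.
ZONES = ("raw", "curated", "enriched")

def lake_path(zone: str, source: str, dataset: str, load_date: date,
              file_seq: int, fmt: str = "parquet") -> str:
    """Build a partitioned path like zone/source/dataset/year=.../month=.../day=.../file."""
    if zone not in ZONES:
        raise ValueError(f"unknown zone: {zone}")
    partition = f"year={load_date:%Y}/month={load_date:%m}/day={load_date:%d}"
    filename = f"{dataset}_{load_date:%Y%m%d}_{file_seq:04d}.{fmt}"
    return f"{zone}/{source}/{dataset}/{partition}/{filename}"

print(lake_path("raw", "crm", "customers", date(2024, 5, 22), 1))
# raw/crm/customers/year=2024/month=05/day=22/customers_20240522_0001.parquet
```

Keeping the `key=value` partition style makes the folders directly usable as partition columns by engines such as Synapse or Spark.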

Knowledge of data formats for the data lake (Parquet, ORC, Avro, XML, and JSON).

Working knowledge of design patterns for data lakes.

Knowledge of integrating the data lake with Azure Synapse for analytics and reporting.

Perform data profiling and analysis to maintain data models. Develop and support usage of the MDM toolkit. Integrate source systems into the MDM solution. Implement business rules for data reconciliation and deduplication. Enforce data models and naming standards across deliverables.
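
A deduplication business rule of the kind described here can be sketched as follows. The match key (normalized name plus postal code) and the survivorship rule (most recently updated record wins) are hypothetical examples, as are all field names.

```python
def normalize(name: str) -> str:
    """Normalize a name for matching: lowercase, collapse whitespace."""
    return " ".join(name.lower().split())

def deduplicate(records):
    """Keep one survivor per (normalized name, postal code) match key;
    the survivor is the record with the latest 'updated' date."""
    survivors = {}
    for rec in records:
        key = (normalize(rec["name"]), rec["postal_code"])
        best = survivors.get(key)
        if best is None or rec["updated"] > best["updated"]:
            survivors[key] = rec
    return list(survivors.values())

records = [
    {"id": 1, "name": "Acme  Corp", "postal_code": "23220", "updated": "2024-01-10"},
    {"id": 2, "name": "acme corp",  "postal_code": "23220", "updated": "2024-03-05"},
    {"id": 3, "name": "Beta LLC",   "postal_code": "23221", "updated": "2024-02-01"},
]
print([r["id"] for r in deduplicate(records)])
# [2, 3]
```

Real MDM toolkits generalize this with configurable match rules and fuzzy scoring, but the key/survivorship structure is the same.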

Establish processes for governing the identification, collection, and use of corporate metadata; take steps to ensure metadata accuracy and validity.

Establish methods and procedures for tracking data quality, completeness, data redundancy, and improvement.

Conduct data capacity planning, life-cycle and usage-requirement analysis, feasibility studies, and related tasks.

Create strategies and plans for data security, backup, disaster recovery, business continuity, and archiving. Ensure that data strategies and architectures are in regulatory compliance.

Experience with big data platforms and tools like Azure Databricks, Azure HDInsight, or similar.

Familiarity with big data technologies like Hadoop, Spark, and Hive.

Willingness to stay updated with the latest trends and advancements in cloud data modeling and Azure services.

Good knowledge of applicable data privacy practices and laws.

Strong written and oral communication skills. Strong presentation and interpersonal skills and Ability to present ideas in user-friendly language.

Strong analytical and problem-solving skills.

Excellent communication skills to collaborate with cross-functional teams.

Attention to detail and the ability to work independently.

Experience with Azure data integration tools like Azure Data Factory, Azure Stream Analytics, and Azure Logic Apps.

Knowledge of ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes.

Familiarity with Azure DevOps and CI/CD pipelines for data solutions.

Familiarity with Azure security features like Azure Key Vault, Azure Policy, and Azure Blueprints.

Ability to create comprehensive documentation for data models, data dictionaries, and design specifications.

Familiarity with Azure SDKs and APIs for automation and integration.

Familiarity with Postman and an understanding of using it to work with APIs.

Experience writing queries (SQL, Python, R, Scala) as needed, and experience with data technologies such as Azure Synapse, SQL Server, Snowflake, and Databricks.

Posted: Wed May 22 22:03:00 UTC 2024
