
Senior Database Architect - Columbus, Ohio, USA - Hybrid Onsite
Email: [email protected]
From:

Abhishek Garg,

USG Inc.

[email protected]

Reply to:   [email protected]

Hello,

Hope you are doing great.

I would appreciate it if you could go through the description below and let me know if you are interested in discussing this role further. If so, please let me know a good time and number to connect, and include an updated copy of your resume in your reply.

Senior Database Architect

End Date: 06/30/2024

Work Location: 50 West Town Street, Columbus, Ohio 43215 - Hybrid (onsite Tuesday and Thursday)

Description:

The Senior Database Architect will be responsible for Medicaid Enterprise Data Warehouse (EDW) design, development, implementation, migration, maintenance, and operational activities. The candidate will work closely with the Data Governance and Analytics team and will be one of the key technical resources for various Enterprise Data Warehouse projects, building critical data marts and ingesting data into the Big Data platform for analytics and exchange with State and Medicaid partners. This position is a member of Medicaid ITS and OST and works closely with the Business Intelligence & Data Analytics team.

Responsibilities:

Participate in team activities, design discussions, stand-up meetings, and planning reviews with the team.

Perform data analysis, data profiling, data cleansing, and data quality analysis in various layers using database queries in both Oracle and Big Data platforms.
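
To illustrate the kind of data quality check this involves, below is a minimal PySpark sketch; the table, database, and column names are hypothetical and not taken from this posting. It profiles total rows, per-column null rates, and duplicate business keys.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical Medicaid data mart table; all names are illustrative only.
spark = SparkSession.builder.enableHiveSupport().getOrCreate()
claims = spark.table("medicaid_mart.claims")

# Basic profiling pass: total row count and the null rate of every column.
total = claims.count()
claims.select(
    [(F.count(F.when(F.col(c).isNull(), c)) / total).alias(c) for c in claims.columns]
).show()

# Data quality check: duplicate rows on the assumed business key.
claims.groupBy("claim_id").count().filter("count > 1").show()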

Elicit, analyze, and document functional and non-functional requirements.

Document business requirements, meeting minutes, and key decisions/actions.

Lead client meetings and sessions with data-driven analysis to clarify requirements and design decisions.

Perform data gap and impact analysis for new data additions and changes to existing data driven by new business requirements and enhancements.

Follow the organization's design standards document; create data mapping specification documents, pseudocode for the development team(s), and design documents.

Create logical and physical data models.

Review and understand existing business logic used in Hadoop ETL platforms to verify it against business user needs.

Review PySpark programs that are used to ingest historical and incremental data.
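
For reference, a minimal PySpark ingestion sketch is shown below; the paths, database and table names, and the load_date column are assumptions for illustration, not details from this posting.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.enableHiveSupport().getOrCreate()
spark.sql("CREATE DATABASE IF NOT EXISTS medicaid_edw")

# Historical (one-time) load: read the full extract and overwrite the target table.
historical = spark.read.parquet("/data/landing/claims/full/")
historical.write.mode("overwrite").saveAsTable("medicaid_edw.claims")

# Incremental load: pick up only records newer than the last loaded date
# and append them to the same target table.
last_loaded = spark.table("medicaid_edw.claims").agg(F.max("load_date")).first()[0]
incremental = (
    spark.read.parquet("/data/landing/claims/delta/")
         .filter(F.col("load_date") > F.lit(last_loaded))
)
incremental.write.mode("append").saveAsTable("medicaid_edw.claims")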

Review Sqoop scripts that ingest historical data from Module Vendors' databases into Hadoop IOP, as well as Hive table and Impala view creation scripts for multiple data mart tables.
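
For context, the sketch below shows what such table and view creation scripts typically look like, expressed here as Hive DDL run through PySpark. The database, table, and column names are hypothetical; the upstream Sqoop import that lands the files is omitted, and in practice the view DDL would be run through impala-shell (followed by INVALIDATE METADATA) so Impala picks it up.

from pyspark.sql import SparkSession

spark = SparkSession.builder.enableHiveSupport().getOrCreate()
spark.sql("CREATE DATABASE IF NOT EXISTS medicaid_stage")
spark.sql("CREATE DATABASE IF NOT EXISTS medicaid_mart")

# Hive external table over data landed in HDFS (e.g., by a Sqoop import).
spark.sql("""
    CREATE EXTERNAL TABLE IF NOT EXISTS medicaid_stage.provider (
        provider_id STRING,
        provider_name STRING,
        enrollment_date DATE
    )
    STORED AS PARQUET
    LOCATION '/data/stage/provider/'
""")

# Data mart view exposing a cleaned subset of the staged data.
spark.sql("""
    CREATE VIEW IF NOT EXISTS medicaid_mart.v_active_provider AS
    SELECT provider_id, provider_name
    FROM medicaid_stage.provider
    WHERE enrollment_date IS NOT NULL
""")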

Assist the Business Analyst in creating the test plan, designing test scenarios, SQL scripts (Hadoop), and test or mock-up data, and executing the test scripts.

Validate and record test results, and log and research defects.

Analyze production data issues, report problems, and find solutions to fix them.

Create incidents and tickets to fix production issues, and create Support Requests to deploy the development team's code to the UAT environment.

Participate in meetings to continuously upgrade functional and technical expertise.

Establish priorities & follow through on projects, paying close attention to detail with minimal supervision.

Create and present project plan, project status and other dashboards as necessary.

Perform other duties as assigned.

Required Skill Sets:

8+ years of data analysis/architecture experience in Waterfall and Agile methodologies across various domains (Healthcare preferred) in a data warehouse environment.

Good knowledge of relational databases, the Hadoop big data platform and tools, and data vault and dimensional model design.

Strong SQL experience (Oracle, Hive, and Impala preferred) creating DDL and DML statements (minimum of 8-9 years of experience).

Experience in analysis, design, development, support, and enhancements in a data warehouse environment with Cloudera Big Data technologies (minimum of 8-9 years of experience in data analysis, data profiling, data modeling, data cleansing, and data quality analysis in various layers using database queries in both Oracle and Big Data platforms).

Experience (minimum of 8-9 years) working with the Erwin data modeling tool, Hive/Impala queries, Unix commands, and shell scripting.

Experience migrating data from a relational database (Oracle preferred) to the Hadoop big data platform is a plus.

Experience eliciting, analyzing and documenting functional and non-functional requirements.

Ability to document business, functional and non-functional requirements, meeting minutes, and key decisions/actions.

Experience in identifying data anomalies.

Experience building data sets and familiarity with PHI and PII data.

Ability to establish priorities & follow through on projects, paying close attention to detail with minimal supervision.

Effective communication, presentation, & organizational skills.

Good experience in working with Visio, Excel, PowerPoint, Word, etc.

Effective team player in a fast-paced, quick-delivery environment.

Required Education: BS/BA degree or combination of education & experience.

Regards,

Abhishek Garg

[email protected]
