
Hiring Now :: Senior Database Architect :: Columbus, Ohio (hybrid) at Columbus, Ohio, USA
Email: [email protected]
From:

Suraj Barik,

Vyze Inc

[email protected]

Reply to:   [email protected]

Title: Senior Database Architect

Location: Columbus, Ohio (hybrid)

Duration: 6+ Months

Visa: USC, GC, GC-EAD, H4-EAD

MOI: Video

Note: Must be in the EST time zone and onsite on Tuesday and Thursday.

About The Job:

The Senior Database Architect will be responsible for Enterprise Data Warehouse (EDW) design, development, implementation, migration, maintenance, and operational activities. The candidate will work closely with the Data Governance and Analytics team and will be one of the key technical resources for various Enterprise Data Warehouse projects, building critical data marts and ingesting data into the Big Data platform for data analytics.

Participate in team activities, design discussions, stand-up meetings, and planning reviews with the team.

Perform data analysis, data profiling, data cleansing, and data quality analysis in various layers using database queries in both Oracle and Big Data platforms.
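
For illustration, a data-profiling query of this kind might look like the following PySpark sketch; the table and column names (edw_stage.claims_base, member_id) are hypothetical and not part of this posting.

    # Minimal data-profiling sketch in PySpark SQL; names are hypothetical.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.enableHiveSupport().getOrCreate()

    # Row count, null count, and distinct count for one key column of a staging table.
    profile = spark.sql("""
        SELECT
            COUNT(*)                                           AS row_count,
            SUM(CASE WHEN member_id IS NULL THEN 1 ELSE 0 END) AS null_member_id,
            COUNT(DISTINCT member_id)                          AS distinct_member_id
        FROM edw_stage.claims_base
    """)
    profile.show()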

Elicit, analyze, and document functional and non-functional requirements.

Document business requirements, meeting minutes, and key decisions/actions.

Lead client meetings and sessions with data-driven analysis to clarify requirements and design decisions.

Perform data gap and impact analysis for new data additions and changes to existing data arising from new business requirements and enhancements.

Follow the organization's design standards document; create data mapping specification documents, pseudocode for the development team(s), and design documents.

Create logical and physical data models.

Review and understand existing business logic used in Hadoop ETL platforms to verify it against business user needs.

Review PySpark programs that are used to ingest historical and incremental data.
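
As a rough sketch of what such an ingestion job can look like (assuming a Spark session with Hive support; the paths, table names, and watermark column below are hypothetical):

    # Hypothetical PySpark sketch: historical (full) load followed by an incremental load.
    from pyspark.sql import SparkSession, functions as F

    spark = (
        SparkSession.builder
        .appName("edw-ingest")        # hypothetical job name
        .enableHiveSupport()
        .getOrCreate()
    )

    # Historical load: read the full landing extract once and write the base Hive table.
    historical = spark.read.parquet("/data/landing/claims/full/")   # hypothetical path
    historical.write.mode("overwrite").saveAsTable("edw_stage.claims_base")

    # Incremental load: pick up only rows newer than the current high-water mark.
    high_water = (
        spark.table("edw_stage.claims_base")
        .agg(F.max("updated_ts").alias("hw"))                       # hypothetical watermark column
        .collect()[0]["hw"]
    )
    incremental = (
        spark.read.parquet("/data/landing/claims/incremental/")     # hypothetical path
        .where(F.col("updated_ts") > F.lit(high_water))
    )
    incremental.write.mode("append").saveAsTable("edw_stage.claims_base")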

Review Sqoop scripts that ingest historical data from Module Vendor databases into Hadoop IOP, as well as the Hive table and Impala view creation scripts for multiple data mart tables.
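
A rough sketch of the Hive table and data-mart view creation that typically follows such a Sqoop import, expressed here in Spark SQL (database, table, and column names are hypothetical; in practice the view would also be created or refreshed on the Impala side):

    # Hypothetical sketch: external Hive table over a Sqoop landing directory,
    # plus a data-mart view on top of it. Names are illustrative only.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.enableHiveSupport().getOrCreate()

    spark.sql("""
        CREATE EXTERNAL TABLE IF NOT EXISTS edw_raw.vendor_members (
            member_id   BIGINT,
            first_name  STRING,
            last_name   STRING,
            updated_ts  TIMESTAMP
        )
        STORED AS PARQUET
        LOCATION '/data/raw/vendor_members/'
    """)

    spark.sql("""
        CREATE OR REPLACE VIEW edw_mart.member_dim AS
        SELECT member_id, first_name, last_name
        FROM edw_raw.vendor_members
    """)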

Assist the Business Analyst in creating test plans, designing test scenarios, preparing SQL scripts (Hadoop) and test or mock-up data, and executing the test scripts.

Validate and record test results; log and research defects.

Analyze production data issues, report problems, and find solutions to fix any issues.

Create incidents and tickets to fix production issues, and create Support Requests to deploy the development team's code to the UAT environment.

Participate in meetings to continuously upgrade functional and technical expertise.

Establish priorities & follow through on projects, paying close attention to detail with minimal supervision.

Create and present project plan, project status and other dashboards as necessary.

Required Skills:

8 years of experience in analysis, design, development, support, and enhancements in a data warehouse environment with Cloudera Big Data technologies.

8 years of data analysis, data profiling, data modeling, data cleansing, and data quality analysis in various layers using database queries in both Oracle and Big Data platforms.

8 years of experience working with the Erwin data modeling tool, Hive/Impala queries, Unix commands, and shell scripting.

8+ years of data analysis/architecture experience with Waterfall and Agile methodologies in various domains (healthcare preferred) in a data warehouse environment.

Good knowledge of relational databases, the Hadoop big data platform and tools, and data vault and dimensional model design.

Strong SQL experience (Oracle, Hive, and Impala preferred) in creating DDLs and DMLs in Oracle, Hive, and Impala (minimum of 8-9 years of experience).

Experience in analysis, design, development, support, and enhancements in a data warehouse environment with Cloudera Big Data technologies (minimum of 8-9 years).

Experience in data analysis, data profiling, data modeling, data cleansing, and data quality analysis in various layers using database queries in both Oracle and Big Data platforms.

Experience (minimum of 8-9 years) working with the Erwin data modeling tool, Hive/Impala queries, Unix commands, and shell scripting.

Experience migrating data from a relational database (Oracle preferred) to the Hadoop big data platform is a plus.

Experience eliciting, analyzing and documenting functional and non-functional requirements.

Ability to document business, functional and non-functional requirements, meeting minutes, and key decisions/actions.

Experience in identifying data anomalies.

Experience building data sets and familiarity with PHI and PII data.

Ability to establish priorities & follow through on projects, paying close attention to detail with minimal supervision.

Keywords: green card
Fri Dec 08 11:00:00 UTC 2023

