Hybrid Lead Data Engineer (No H1B) - Remote, USA
Email: [email protected]
If the candidate has already worked with Cardinal Health, there will be no interview, just a discussion call and an offer.

Lead Data Engineer: a candidate who has previously worked at Cardinal Health will get the job.

Visa: No H1B

Must have: LinkedIn, SSN, and DL

Location: Columbus, OH (locals preferred; remote)


Reaching out to check if you have an experienced Lead Data Engineer (preferably with Cardinal/EDNA BigQuery experience) available immediately for an onshore role on the CHIME program. I have an urgent need for a Data Engineer who can lead the design and development of BigQuery/AtScale/Tableau deliverables, working with the business and with the junior data engineers on the team. I will be opening a Wand request for the same but wanted to send this over in advance. Please see the JD below.

Responsibilities:

- Lead design and development of solutions per business requirements.
- Provide recommendations and direction on the best approach to ensure business functional and non-functional requirements (data freshness SLAs, response time/performance SLAs, etc.) are met.
- Define tasks for junior data engineers per the product plan and ensure timely completion of tasks by providing support and removing blockers.
- Work closely with the Product Owner to provide ETAs for future deliverables.
- Develop BQ views per business requirements and best practices; perform data mapping with source systems.
- Ensure on-time delivery of project work; solve technical issues and provide quick resolution.
- Advanced SQL programming experience with GCP BigQuery is required; hands-on skills with GCP, BigQuery, and Airflow are needed.
- Ensure quality by conducting code reviews and providing direction to other data engineers.
- Participate in technical platform strategy as tools, products, and business needs evolve.
- Define and execute database and data movement standards, design reviews, pipeline CI/CD processes, and data container policies to ensure high-quality data management.
- Interact with multiple organizations to track project progress, identify risks, communicate risks and status to leadership, and assess potential impacts to the business.
- Ensure platforms and tools meet or exceed data security standards, including internal and external audits.
- Use strong verbal and written communication skills that non-technical business users and end users can understand.

Qualifications

Required:

- 8+ years of experience with data platforms, including GCP BigQuery, AtScale, MySQL, and Airflow.
- Experience working on large-scale data warehouse projects required.
- Expert working knowledge of SQL and Python.
- Strong PL/SQL development experience required.
- Purchasing/Procurement/Inventory domain knowledge would be a plus.
- Understanding of dimensional data modelling.
- Demonstrated expertise in database design and modeling.
- Expert knowledge of BI reporting and data discovery tools.
- Expert knowledge of cloud technologies.
- Experience in Supply Chain Inventory Management will be a plus.

--

Tue Feb 13 19:58:00 UTC 2024


