Data Engineer - Rate: 50/hr on C2C - Michigan / North Dakota, USA
Email: [email protected]
From: Mohd, Global Icon ([email protected])
Reply to: [email protected]

No H-1Bs; only GC (green card holders) and USC (US citizens).

Four open positions, all with the same requirements: Talend or a similar ETL tool, plus SQL Server. Those are the two main things they are looking for. Kafka, Databricks, and Azure are the highest-ranked preferred skills. Knowledge is needed across three areas:

    (1) HR space: dealing with the UKG vendor.

    (1-2) Marketing: customer data, campaign data, identity resolution; experience with Salesforce Marketing Cloud, "Prophecy," and Customer Data Platforms.

    (1) Backfill for the core team: a catch-all role, supply chain one day and maybe something else the next. Someone who is well rounded.

She is remote, so communication is huge for her, both written and verbal.

Other ETL tools outside of Talend will also work: DataStage, Informatica.

Java is the backend behind Talend.

Joins and queries to extract and pull data; writing more complex queries (CTEs); writing performant queries utilizing partitions.
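As a minimal sketch of the CTE pattern mentioned above (the table, column names, and data here are hypothetical, and SQLite stands in for SQL Server since the basic `WITH` syntax is shared):

```python
import sqlite3

# In-memory database with a hypothetical orders table; names are
# illustrative, not from the posting.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'acme', 120.0), (2, 'acme', 80.0), (3, 'globex', 50.0);
""")

# A CTE (common table expression) pre-aggregates per-customer totals,
# then the outer query filters on the aggregate -- the kind of
# "more complex query" the notes describe.
query = """
    WITH totals AS (
        SELECT customer, SUM(amount) AS total
        FROM orders
        GROUP BY customer
    )
    SELECT customer, total FROM totals WHERE total > 100
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('acme', 200.0)]
```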

Start date is case by case; she wants to start them in January if possible.

Looking for someone local to Ann Arbor. On-site 2-3 days/week; Tuesday, Wednesday, and Thursday are when managers are in, but this person could come in as little as 1-2 times per month.

All four will be on her team, and everyone on her current team uses SQL Server and Talend; that is the core of her team.

Need to understand inputs and mapping in ETL. 

Just getting started: she still has all four openings and begins interviewing this week.

Job Description:

The data engineering specialist will primarily focus on developing large-volume data ingestion and transformation. This position is responsible for orchestrating data interfaces into (and out of) our Enterprise Data Warehouse using Talend, SQL, Python, and other data engineering solutions.

GENERAL RESPONSIBILITIES

    Design and develop ETL (Talend) / SQL / Python based processes to perform complex data transformation processes.

    Design, code, and test major data processing features, as well as work jointly with other team members to provide complex software enhancements for the enterprise data storage platforms (RDBMS, Lakehouse, No-SQL platforms)

    Build Data Integration solutions to handle batch / streaming / IoT data on ETL, Big-Data platforms.

    Develop and Deliver changes in the Enterprise Data Warehouse according to Data Warehousing best practices

    Gather requirements and construct documentation to aid in maintenance and code reuse in accordance with team processes and standards

    Monitor scheduled jobs and improve reliability of ongoing processing

    Monitor, measure, and enhance ways to improve system performance

    Multi-task deliverables and manage them efficiently

    Perform other duties as assigned
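At their simplest, the ETL responsibilities above reduce to an extract-transform-load pass; a minimal Python sketch (the input layout, staging table, and field names are illustrative assumptions, with SQLite standing in for the warehouse):

```python
import csv
import io
import sqlite3

# Hypothetical delimited extract; in practice this would be a file or
# feed landed by the ETL tool.
raw = "emp_id,name,hours\n1,Ada,40\n2,Grace,35\n"

# Extract: parse the delimited input into records.
records = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast types and derive a field.
for rec in records:
    rec["hours"] = int(rec["hours"])
    rec["full_time"] = rec["hours"] >= 40

# Load: write into a warehouse staging table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE staging_emp"
    " (emp_id TEXT, name TEXT, hours INTEGER, full_time INTEGER)"
)
conn.executemany(
    "INSERT INTO staging_emp VALUES (:emp_id, :name, :hours, :full_time)",
    records,
)
count = conn.execute("SELECT COUNT(*) FROM staging_emp").fetchone()[0]
print(count)  # 2
```

A real Talend job would generate the equivalent of this pipeline as Java, but the extract/transform/load stages map one-to-one.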

Qualifications

    Understanding and Hands-on experience with key technologies (SQL, ETL, Data modeling, data processing) 

    Strong SQL skills with Relational DBMS technology - SQL Server

    2-4 years of hands-on experience with ETL tools (Talend preferred)

    Good understanding and expertise in relational database concepts and data processing concepts

    Experience in handling multiple data formats (Delimited file, JSON, XML etc.)

    Experience with data lakehouses (Databricks) and cloud technologies (Azure) a plus

    Experience with customer and digital marketing data, including implementing Customer Data Platform and Identity Resolution solutions and integrating with an ESP (Salesforce), a plus

    Hands on experience designing and implementing data ingestion techniques for real time processes (IoT, eCommerce) a plus.

    Development experience in a Big Data environment a plus: Spark, Kafka, message queues

    Experience with shell scripting / Python a plus

    Strong communication skills (oral and written)

    Good analytical and problem solving skills

    Knowledge of CRM, MDM, HR Reporting and Business Intelligence a plus

    Candidate must be thorough, detail-oriented, and a team player

    Able to work on multiple priorities in a deadline-driven environment
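The "multiple data formats" qualification above can be illustrated by normalizing one record arriving as delimited text, JSON, and XML into a single shape (the field names and values here are hypothetical):

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

# The same hypothetical record in the three formats the posting lists.
delimited = "sku|qty\nA100|5\n"
as_json = '{"sku": "A100", "qty": 5}'
as_xml = "<item><sku>A100</sku><qty>5</qty></item>"

# Delimited: parse with an explicit delimiter, then cast types.
row = next(csv.DictReader(io.StringIO(delimited), delimiter="|"))
from_csv = {"sku": row["sku"], "qty": int(row["qty"])}

# JSON: types come through directly.
from_json = json.loads(as_json)

# XML: pull text nodes and cast.
root = ET.fromstring(as_xml)
from_xml = {"sku": root.findtext("sku"), "qty": int(root.findtext("qty"))}

print(from_csv == from_json == from_xml)  # True
```

The point is that each format needs its own parsing and type casting before the records can share one downstream pipeline.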

Additional Information

All your information will be kept confidential according to EEO guidelines.

* Hybrid position requiring weekly on-site work in Ann Arbor office *

    Roles would all be for Data Engineering developers, with at least 2-4 years of experience

    Preferred experience would be with Talend or a similar ETL tool

Keywords: green card
[email protected]
Wed Jan 03 10:24:00 UTC 2024
