
From: Raj, Techgene Solutions LLC ([email protected])

Reply to: [email protected]

Subject: Senior Data Engineer (10+ years) - Candidates must be LOCAL to Troy, MI

I hope you are well. I am Raj with Techgene Solutions LLC. We are hiring a Senior Data Engineer (10+ years), and I think your experience is a great fit for this role. If you're interested in learning more, I'd love to connect with you. Please review the job description below and let me know if you are comfortable with it. I look forward to hearing from you soon. Thank you!

Please share your profile with [email protected], or you can reach me at 972-580-0247 Ext 253.

CHECKPOINT

Full Legal Name:

Contact Number:

Email ID:

Date of Birth (MM/DD):

Last 4 Digits of SSN:

Current Location with Zip Code:

Two LinkedIn references from your two most recent projects:

Work Authorization/Visa Status:

Passport Number:

Corporation Name:

Open to Relocation or Travel:

LinkedIn:

Total Years of IT Experience:

Education Details (University, Year):

US Experience:

Senior Data Engineer (10+ years)

Experience Required:

  - Extensive experience with ETL

  - Business Intelligence (Cognos or Power BI)

  - Data Warehouse experience

Candidate Requirements:

  - Long project history/good tenure

  - Excellent communication skills

  - State-issued ID (not bills) showing local residency

Required Location: Hybrid; candidates must be LOCAL to Troy, MI and onsite 3 days a week.

Duration: 12 months

Interview Required: Video

Any Visas Accepted: No OPT

Sub-contracting: Yes

Candidates must be LOCAL to the Troy, MI area and COMMUTE into the office THREE TIMES A WEEK. NO RELOCATION CONSIDERED.

*** Please send candidates even if they are over the target rate; the client is flexible.

*** Please only send candidates in the Troy, MI area who are open to a hybrid schedule.

*** Please make sure that each submittal includes:

1. Driver's license or state ID

2. Link to the candidate's LinkedIn account

3. The submittal format below

*** Candidate must-haves on the resume and for submittal:

1. How many years working as a Sr. Data Engineer

2. How many years working with ETL

3. How many years working with Business Intelligence

4. How many years working with Data Warehouses

5. Local to Troy, MI (yes/no)

Submission Format:

        Full Name:

        Rate: 

        Location: 

        Availability to Interview: One day's notice

        Availability to Start: 

        Email Address:   

        Phone Number: 

        Visa Status:

        Education - College/Year of graduation:

        Link to LinkedIn:

        Certifications (please list):

Job Description:

Title: Sr. Data Engineer

Location: Flagstar, Troy, MI

Job Summary: The Sr. Data Engineer is responsible for understanding and supporting the business through the design, development, and execution of Extract, Transform, and Load (ELT/ETL), data integration, and data analytics processes across the enterprise.

He/She will stay on top of tech trends, experiment with and learn new technologies, contribute to the growth of the data organization, participate in internal and external technology communities, and mentor other members of the team. The role provides technical leadership at every stage of the data engineering lifecycle, from designing data platforms, data pipelines, and data stores to gathering, importing, wrangling, querying, and analyzing data. The Sr. Data Engineer will work closely with various customers, including their immediate project teams, business domain experts, and other technical staff members; will work daily within a project team environment, taking direction from project management and technical leaders; and will be responsible for the design, development, administration, support, and maintenance of the Snowflake and Oracle platforms. The role participates in the full systems life cycle and in cloud data lake/data warehouse design and build, including recommendations on code development, integration with the data marketplace or reuse, and buy-versus-build solutions.

Job Responsibilities:

Technical Leadership: Lead data integration across the enterprise through the design, build, and implementation of large-scale, high-volume, high-performance data pipelines for both on-prem and cloud data lakes and data warehouses. Lead the development and documentation of technical best practices for ELT/ETL activities. Also oversee a program inception to build a new product if needed.

Solution Design: Lead the design of technical solutions, including code, scripts, data pipelines, and processes/procedures for integrating data lake and data warehouse solutions into an operative IT environment.

Code Development: Ensures data engineering activities are aligned with scope, schedule, priority, and business objectives. Oversees code development and unit and performance testing activities. Responsible for coding and for leading the team to implement the solution.

Testing: Leads validation efforts by verifying the data at the intermediate stages between source and destination and by assisting others in validating that the solution performs as expected. Meets or exceeds all operational readiness requirements (e.g., operations engineering, performance, and risk management). Ensures compliance with applicable federal, state, and local laws and regulations; completes all required compliance training; maintains knowledge of and adheres to Flagstar's internal compliance policies and procedures; and takes responsibility for keeping up to date with changing regulations and policies.

Job Requirements:

High school diploma, GED, or foreign equivalent required.

Bachelor's degree in Computer Science, Mathematics, or a related field plus 7 years of development experience preferred, or 10 years of comparable work experience required.

10 years of experience designing, developing, testing, and implementing Extract, Transform, and Load (ELT/ETL) solutions using enterprise ELT/ETL tools.

15 years of comparable work experience.

10 years of experience developing and implementing data integration, data lake and data warehouse solutions in an on-premise and cloud environment.

5 years of experience working with Business Intelligence tools (IBM Cognos is preferred), Power BI and Alteryx.

7 years of experience working with APIs, data as a service, data marketplace and data mesh.

10 years of experience with various Software Development Life Cycle methods such as Agile, SCRUM, Waterfall, etc.

3 years of experience in a 100+ TB data environment.

Proven experience developing and maintaining data pipelines and ETL jobs using IBM DataStage, Informatica, Matillion, Fivetran, Talend, or dbt.

Knowledge of AWS cloud services such as S3, EMR, Lambda, Glue, SageMaker, Redshift, and Athena, and/or Snowflake.

Experienced in data modeling for self-service business intelligence, advanced analytics, and user applications.

Experience with data science, including AI/ML engineering, ML framework/pipeline builds, and predictive/prescriptive analytics on AWS SageMaker.

Experience with migrating, architecting, designing, building, and implementing cloud data lakes, data warehouses (cloud/on-prem), data mesh, data as a service, and cloud data marketplaces.

Ability to communicate complex technical concepts by adjusting messaging to the audience: business partners, IT peers, external stakeholders, etc.

Proven ability to design and build technical solutions using applicable technologies; ability to demonstrate exceptional data engineering skills. Ability to prioritize work by dividing time, attention, and effort between the current project workload and ongoing day-to-day activities.

Demonstrates strength in adapting to change in processes, procedures and priorities.

Proven ability to establish a high level of trust and confidence in both the business and IT communities.

Strong teamwork and interpersonal skills at all management levels.

Proven ability to manage to a project budget.

Experience applying agile practices to solution delivery.

Must be team-oriented and have excellent oral and written communication skills.

Strong analytic and problem-solving skills.

Good organizational and time-management skills.

Experience in Strategic Thinking and Solutioning.

Must be a self-starter who understands existing bottlenecks and comes up with innovative solutions.

Demonstrated ability to work with key stakeholders outside the project to understand requirements/resolve issues.

Experience with data model design, writing complex SQL queries, etc., and should have a good understanding of BI/DWH principles.

Expertise in Relational Database Management System, Data Mart and Data Warehouse design.

Expert-level SQL development skills in a multi-tier environment.

Expertise in flat file formats, XML within PL/SQL, and file format conversion.

Strong understanding of SDLC and Agile Methodologies.

Strong understanding of model-driven development. Strong understanding of ETL best practices. Proven strength in interpreting customer business needs and translating them into application and operational requirements.

Strong problem-solving and analytical skills, with proven strength in applying root cause analysis.
