
Data Engineer /ETL/Power BI/Data Warehouse at Remote, Remote, USA
Email: [email protected]
Sr Data Engineer /ETL/Power BI/Data Warehouse
Required Location: Hybrid / TROY, MI / 3 days a week. Client: Flagstar Bank
Onsite interview required. No H1-B or CPT.

Candidates must be LOCAL to the TROY, MI area and COMMUTE into the office THREE TIMES A WEEK. NO RELOCATION CONSIDERED.

Banking experience is not required and is only a plus. This position will require a final in-person interview.

*** We need: A senior (10+ years) Data Engineer with extensive experience in ETL, Business Intelligence (Cognos or Power BI), and Data Warehousing. **Candidates must have long projects/good tenure, excellent communication skills, and a state-issued ID (not bills) showing they are local.

*** PLEASE only send candidates in the TROY, MI area who are open to hybrid work.

*** Please make sure that each submittal includes:
1.      Driver's license or State ID
2.      Link to the candidate's LinkedIn profile
3.      Below submittal Format

*** Candidate Must Haves on a resume and for submittal:

1. How many years working as a Sr. Data Engineer
2. How many years working with ETL
3. How many years working with Business Intelligence
4. How many years working with Data Warehouses
5. How many years living local to Troy, MI
*** Please provide all of the below Submittal Format details with each submittal. It is required for the client's management system.
         Full Name:
         Rate: 
         Location: 
         Availability to Interview: One day's notice
         Availability to Start: 
         Email Address:   
         Phone Number: 
         Visa Status:
         Education - College/Year of graduation:
         Link to LinkedIn:
         Certifications (please list):

Job Description:

Title: Sr Data Engineer
Location: Flagstar, Troy, MI
Job Summary: The Sr. Data Engineer is responsible for understanding and supporting the business through the design, development, and execution of Extract, Transform, and Load (ELT/ETL), data integration, and data analytics processes across the enterprise.

He/She will stay on top of tech trends, experiment with and learn new technologies, contribute to the growth of the data organization, participate in internal and external technology communities, and mentor other members of the team. The role provides technical leadership at every stage of the data engineering lifecycle: designing data platforms, data pipelines, and data stores, and gathering, importing, wrangling, querying, and analyzing data. The Sr. Data Engineer will work closely with various customers, including their immediate project teams, business domain experts, and other technical staff members, and will work daily within a project team environment, taking direction from project management and technical leaders. The role is responsible for the design, development, administration, support, and maintenance of the Snowflake and Oracle platforms, and participates in the full systems life cycle and cloud data lake/data warehouse design and build, including recommendation of code development, integration with a data marketplace, and reuse and buy-versus-build solutions.

Job Responsibilities:
Technical Leadership: Lead data integration across the enterprise through the design, build, and implementation of large-scale, high-volume, high-performance data pipelines for both on-prem and cloud data lakes and data warehouses. Lead the development and documentation of technical best practices for ELT/ETL activities. Oversee program inception to build a new product if needed.
Solution Design: Lead the design of technical solutions, including code, scripts, data pipelines, and processes/procedures for the integration of data lake and data warehouse solutions in an operative IT environment.
Code Development: Ensure data engineering activities are aligned with scope, schedule, priority, and business objectives. Oversee code development and unit and performance testing activities. Responsible for coding and leading the team to implement the solution.
Testing: Lead validation efforts by verifying the data at the various intermediate stages used between source and destination, and assist others in validating that the solution performs as expected. Meet or exceed all operational readiness requirements (e.g., operations engineering, performance, and risk management). Ensure compliance with applicable federal, state, and local laws and regulations. Complete all required compliance training. Maintain knowledge of and adhere to Flagstar's internal compliance policies and procedures. Take responsibility for keeping up to date with changing regulations and policies.

Job Requirements:
High School Diploma, GED, or foreign equivalent required.
Bachelor's in Computer Science, Mathematics or related field + 7 years of development experience preferred, or 10 years comparable work experience required.
10 years of experience designing, developing, testing, and implementing Extract, Transform and Load (ELT/ETL) solutions using enterprise ELT/ETL tools, or 15 years of comparable work experience.
10 years of experience developing and implementing data integration, data lake and data warehouse solutions in an on-premise and cloud environment.
5 years of experience working with Business Intelligence tools such as IBM Cognos (preferred), Power BI, and Alteryx.
7 years of experience working with APIs, data as a service, data marketplace and data mesh.
10 years of experience with various Software Development Life Cycle methods such as Agile, SCRUM, Waterfall, etc.
3 years of experience in a 100+ TB data environment.
Proven experience developing and maintaining data pipelines and ETL jobs using IBM DataStage, Informatica, Matillion, Fivetran, Talend, or dbt.
Knowledge of AWS cloud services such as S3, EMR, Lambda, Glue, SageMaker, Redshift, and Athena, and/or Snowflake.
Experienced in data modelling for self-service business intelligence, advanced analytics, and user applications.
Experience with Data Science, including AI/ML engineering, ML framework/pipeline builds, and predictive/prescriptive analytics on AWS SageMaker.
Experience with migrating, architecting, designing, building and implementing cloud data lake, data warehouses (cloud/on-prem), data mesh, data as a service, and cloud data marketplace.
Ability to communicate complex technical concepts by adjusting messaging to the audience: business partners, IT peers, external stakeholders, etc.
Proven ability to design and build technical solutions using applicable technologies; ability to demonstrate exceptional data engineering skills. Ability to prioritize work by dividing time, attention and effort between current project workload and on-going day to day activities.
Demonstrates strength in adapting to change in processes, procedures and priorities.
Proven ability to establish a high level of trust and confidence in both the business and IT communities.
Strong teamwork and interpersonal skills at all management levels.
Proven ability to manage to a project budget.
Experience applying agile practices to solution delivery.
Must be team-oriented and have excellent oral and written communication skills.
Strong analytic and problem-solving skills.
Good organizational and time-management skills.
Experience in Strategic Thinking and Solutioning.
Must be a self-starter who can understand existing bottlenecks and come up with innovative solutions.
Demonstrated ability to work with key stakeholders outside the project to understand requirements/resolve issues.
Experience with data model design, writing complex SQL queries, etc., and should have a good understanding of BI/DWH principles.
Expertise in Relational Database Management System, Data Mart and Data Warehouse design.
Expert-level SQL development skills in a multi-tier environment.
Expertise in flat file formats, XML within PL/SQL, and file format conversion.
Strong understanding of SDLC and Agile Methodologies.
Strong understanding of model driven development. Strong understanding of ETL best practices. Proven strength in interpreting customer business needs and translating them into application and operational requirements.
Strong problem-solving and analytic skills with proven strength in applying root cause analysis.

Thanks & Regards,

Sam || Technical Recruiter
Email:  [email protected]

Address: 17210 Camelot Court, Suite 101, Land O' Lakes, FL-34638

This email is generated using CONREP software.


Tue Oct 29 22:58:00 UTC 2024

