Looking for EDW Data Engineer (no H1B or CPT), Remote, USA
Email: [email protected]
From: Nitu, RCI ([email protected])

Reply to: [email protected]

Role: EDW Data Engineer

Location: Hybrid, San Jose, CA

Duration: 6+ months

Must-Haves: LinkedIn profile and strong experience with SQL

Job Description:

The Senior EDW Data Engineer builds and maintains data pipeline infrastructure for optimal extraction, transformation, and loading of data from a wide variety of sources, using SQL, cloud-based relational or non-relational databases, Talend, and/or scripting languages such as Perl and Python.
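
For context, the work described above follows a classic extract-transform-load (ETL) pattern. The sketch below is purely illustrative and not part of the posting: SQLite stands in for a cloud warehouse, and the file, table, and column names are hypothetical.

    # Minimal illustrative ETL sketch; file/table names are hypothetical.
    import csv
    import sqlite3

    def extract(path):
        # Extract: read raw records from a CSV source.
        with open(path, newline="") as f:
            return list(csv.DictReader(f))

    def transform(rows):
        # Transform: normalize types and drop incomplete records.
        return [
            {"id": int(r["id"]), "amount": round(float(r["amount"]), 2)}
            for r in rows
            if r.get("id") and r.get("amount")
        ]

    def load(rows, conn):
        # Load: upsert into the target table and commit.
        conn.executemany(
            "INSERT OR REPLACE INTO transactions (id, amount) VALUES (:id, :amount)",
            rows,
        )
        conn.commit()

    if __name__ == "__main__":
        conn = sqlite3.connect("warehouse.db")
        conn.execute(
            "CREATE TABLE IF NOT EXISTS transactions (id INTEGER PRIMARY KEY, amount REAL)"
        )
        load(transform(extract("source.csv")), conn)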

Primary Responsibilities:

Lead analysis, design, development, and deployment of complex data platforms and pipelines across functional teams, involving multiple integration and other technologies, to meet committed deliverables

Partner and collaborate with business and IT stakeholders to deliver solutions that meet business needs with quality and within the defined timeline

With minimal supervision, build and maintain complex data platforms and large-scale CI/CD data pipelines utilizing a variety of technologies (REST APIs, TeamCity, Jenkins) and cloud databases (Snowflake); a sketch of a simple Snowflake load follows this list

Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, redesigning infrastructure for greater scalability, etc.

Partner with data and analytics experts to strive for optimal functionality in our data integration platform

Collaborate on a day-to-day basis with business partners, architects, and technical teams to identify, troubleshoot, and resolve enterprise data warehouse issues

Resolve data issues by identifying multiple potential solutions, assessing them for both technical and business suitability, and working closely with all stakeholders (including technical peers and end users) to ensure technical compatibility and user satisfaction

Follow established best-practice guidelines related to enterprise data warehouse configuration and change control processes

As a subject matter expert, mentor less experienced Data Engineers on a day-to-day basis

Conduct performance testing, and tune and optimize data pipelines to meet SLAs
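
As referenced in the CI/CD item above, here is a minimal sketch of loading a file into Snowflake using the snowflake-connector-python package; all connection values and object names are placeholders, not details from this posting.

    # Minimal Snowflake load sketch; all identifiers are placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        user="PIPELINE_USER",
        password="...",             # use a secrets manager in practice
        account="myorg-myaccount",
        warehouse="ETL_WH",
        database="EDW",
        schema="STAGING",
    )
    try:
        cur = conn.cursor()
        # Stage the local file into the table's internal stage.
        cur.execute("PUT file://daily_extract.csv @%TRANSACTIONS")
        # Copy the staged file into the target table.
        cur.execute(
            "COPY INTO TRANSACTIONS FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
    finally:
        conn.close()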

Experience and Education Requirements:

Minimum Education: Bachelor's degree in Information Technology, Computer Science, or other related fields of study

Minimum 6 years of experience developing data pipelines using Talend/Informatica and Python, including processing of NoSQL data and JSON/XML formats, preferably in the financial services industry using an Agile development process

Advanced working experience with SQL/PL-SQL, Snowflake, and APIs

Excellent skills in scripting languages, including Python, Perl, and shell

Excellent skills with Ant, Maven, Git, Jenkins, and TeamCity

Excellent skills in source code repository tools such as ClearCase, SVN, CVS, and Git

Deep understanding of ETL, ELT, Talend, Informatica, Hadoop, data warehousing schemas, and cloud platforms

Ability to work autonomously, manage time effectively, and prioritize work appropriately to meet deadlines

Excellent problem-solving and critical thinking skills

Strong business communication skills; able to write and speak clearly and professionally for a variety of audiences

Demonstrated independence, creativity, and initiative

Working knowledge of Microsoft Office Suite

Posted: Sat Jan 21 02:41:00 UTC 2023
