Senior Azure Cloud Data Engineer at Remote, USA
From: Zara, TechRakers ([email protected])
Reply to: [email protected]

Position Title: Sr. Azure Cloud Data Engineer

Location: Vienna, VA (100% remote)

Duration: 12 months+

Open to H-1B candidates

10+ years of experience required

To support the Information System Division and the enterprise by providing comprehensive data engineering solutions that translate Navy Federal's business vision and strategies into effective IT and business capabilities through the design, implementation, and integration of IT systems using legacy systems and Microsoft Azure. The Data Engineer will be responsible for guiding the design and development of Navy Federal's data platform, with a specific focus on Azure Data Factory and ADLS pipelines that support Pega Marketing system capabilities, and on integrating data from Teradata, other external sources, and applications into high-performing operational hubs, data lakes, and Microsoft SQL Server.

Responsibilities:

Design and implement data frameworks and pipelines that process data from on-premises and cloud sources into Azure data storage; monitor data quality and implement related controls

Migrate on-premises data stores to a managed service in the cloud

Evaluate existing designs, improve methods, and implement optimizations

Ensure the security and integrity of solutions including compliance with Navy Federal, industry engineering and Information Security principles and practices

Analyze and validate data sharing requirements with internal and external data partners

Prepare project deliverables that are valued by the business and present them so that they are readily understood by project stakeholders

Perform other duties as assigned

Qualifications and Education Requirements:

Bachelor's degree in Information Systems, Computer Science, Engineering, or a related field, or an equivalent combination of education, training, and experience

Hands-on experience creating automated data pipelines using modern technology stacks for batch ETL and data processing to load advanced analytics data repositories

Proficient in designing and implementing data integration processes in a large, distributed environment using cloud services (e.g., Azure Event Hubs, Data Factory, Data Catalog, and Databricks)

Experience in designing data lake storage structures, data acquisition, transformation, and distribution processing

General knowledge and experience with configuration, load balancing, auto-scaling, monitoring, networking, and problem-solving in a cloud environment

Expert-level skills in Python, Databricks, and Azure Data Factory

Experienced in the use of ETL tools and techniques, such as Informatica

Hands-on experience with Teradata platform and data warehousing

Advanced-level proficiency in SQL

Ability to understand other projects or functional areas in order to consolidate analytical needs and processes

Demonstrated change management and excellent communication skills

Desired Qualifications and Education Requirements:

Prior experience in the financial industry, particularly at large banks

Posted: Wed Nov 23 00:40:00 UTC 2022



