
Urgent Requirement || Data Engineer || Hybrid: St. Louis, MO (Local candidates)
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=2488400&uid=eaeae7d576e6444785f2f9d577d6b721

Hi,

I hope you are doing well.

Please let me know if you are looking for a job change and are interested in the position below.

Data Engineer

Hybrid: St. Louis, MO (Local candidates)

Duration: 6 months with possible extension/conversion

Interview: Video

Visa: USC, GC, GC EAD, H4, L2

Initial Project: Migrate the existing Python data extraction tool and codebase to the Alpine Data Lake, set up API extraction in Synapse, establish a daily ETL for updates, and update Power BI reports to use the data lake as their source. [Alpine Strategic Credit (ASC)]

Subsequent Project: Extract and migrate 20+ years of historical data from the Portfolio system to a data warehouse, set up a real-time API for a continuous data feed from Tamarac, establish a daily ETL for updates, and enable improved Power BI reporting. [Alpine Private Wealth (APW)]

Both projects involve ensuring data accuracy and integrity and structuring data for seamless integration into the data warehouse and Power BI.

Data modeling will use a Star Schema or Snowflake Schema approach.
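For the Star Schema approach mentioned above, the core move is splitting denormalized records into one fact table plus dimension tables keyed by surrogate ids. A minimal sketch follows; the account/security dimensions and the field names are hypothetical examples invented for illustration, not the client's actual model.

```python
def to_star_schema(flat_rows):
    """Split denormalized rows into a fact table and per-attribute
    dimension tables keyed by surrogate integer ids."""
    dims = {"account": {}, "security": {}}

    def dim_key(dim, value):
        # Assign surrogate keys in first-seen order.
        table = dims[dim]
        if value not in table:
            table[value] = len(table) + 1
        return table[value]

    facts = [
        {
            "account_key": dim_key("account", r["account"]),
            "security_key": dim_key("security", r["security"]),
            "amount": r["amount"],
            "trade_date": r["trade_date"],
        }
        for r in flat_rows
    ]
    return facts, dims
```

A Snowflake Schema differs only in that the dimension tables themselves are further normalized into sub-dimensions.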

Responsibilities:

Design, develop, and maintain data pipelines and warehousing solutions.

Key tasks include API integration, ETL development, data modeling (Star Schema or Snowflake Schema), and supporting Power BI reporting.

Collaborate with internal project teams to ensure data accuracy, integrity, and structured organization for business intelligence.

Tech stack:

Azure Synapse Analytics
Two separate environments (e.g., Development and Production).
Handles data warehousing and large-scale analytics workloads.

Azure Data Lake
Centralized storage layer.
Supports both structured and unstructured data.
Scalable foundation for analytics and data integration.

Azure Key Vault
Manages secrets, encryption keys, and certificates.
Ensures secure access across both environments.

Azure DevOps
CI/CD pipelines for automated builds and deployments.
Manages data pipeline lifecycle and component delivery.

Apache Spark Notebooks
Deployed in both environments.
Used for interactive data exploration, transformation, and analytics.

Azure Integration Runtime
Facilitates secure and scalable data movement.
Enables transformations across network boundaries within Synapse or Data Factory.

Metastore Data Warehouse
Centralized metadata repository.
Maintains schema definitions and table metadata.

ARM Template (Azure Resource Manager)
Defines and automates infrastructure deployment.
Enables consistent provisioning of Synapse, Data Lake, Key Vault, and other resources across environments.

--

Posted: 07:23 PM 06-Jun-25







Location: St. Louis, Missouri