
Data Warehouse Architect / Cloud Architect (Only Local to PA) - Harrisburg, Pennsylvania, USA

From: Mike Henry, Dewsoftech ([email protected])

Reply to: [email protected]

Data Warehouse Architect /Cloud Architect (Only Local to PA)

Location: Harrisburg, PA (Remote)

Position Type: Contract

Job Description:

In this position you will apply your skills to manage the existing cloud data platform, making it more scalable, reliable, and cost-efficient. You will also work on additional projects that leverage the existing architecture where possible and adopt newer technologies where needed.

Primary Responsibilities:

Define and align on strategic initiatives pertaining to Data and Analytics Architecture

Design and develop data lakes; manage data flows that integrate information from various sources into a common data lake platform through an ETL tool, supporting near-real-time use cases as well.

Design repeatable and reusable solution architectures and data ingestion pipelines for bringing in data from ERP source systems like SAP.

Manage data integration with tools like Databricks and Snowflake, or equivalent data lake and data warehouse platforms.

Design and develop data warehouses for scale.

Design and evaluate data models (star, snowflake, and flattened).

Design data access patterns for OLTP and OLAP workloads.

Triage, debug, and fix technical issues related to Data Lakes and Data Warehouses

Serve and share data through modern data warehousing tools and practices.

Coordinate with business and technical teams through all phases of the software development life cycle.

Participate in making major technical and architectural decisions.

Prototype new technology solutions hands-on, working with cross-functional teams.

You Must Have:

5+ years of experience on AWS building data lake and data warehousing architectures.

5+ years of experience building data warehouses on Snowflake, Redshift, HANA, Teradata, Exasol, etc.

3+ years of experience with AWS data services such as S3, Glue, Lake Formation, EMR, Kinesis, RDS, and DMS.

3+ years of data modeling experience.

3+ years of working knowledge of Spark or equivalent big data technologies.

3+ years of experience building Delta Lakes using technologies like Databricks.

3+ years of experience with ETL tools such as Talend, Informatica, and SAP Data Services.

3+ years of experience in a programming language (Python, R, Scala, Java).

3+ years of experience working with ERP systems like SAP, focusing on data integration.

Good understanding of, and implementation experience with, GenAI models available from the cloud hyperscalers.

AWS Bedrock experience is a plus but not required.

Bachelor's degree in computer science, information technology, data science, data analytics, or a related field.

Experience working on Agile projects and with Agile methodology in general.

Excellent problem-solving, communication, and teamwork skills.

Exceptional presentation, visualization, and analysis skills.

Thanks & Regards,

Mike Henry

Senior IT Recruiter

Email: [email protected]

Fri Jun 21 21:30:00 UTC 2024



