
New Requirement Data Architect with Snowflake and Spark Experience at Snowflake, Arizona, USA
Email: [email protected]
Role: Data Architect

Location: Jersey City / Boston (Hybrid)

Contract

Job Description

We are seeking a Data Architect to help drive our data strategy for the Private Banking and Investment Management business lines. This individual must have experience working on modern data platforms capable of supporting big data, relational and non-relational databases, data warehousing, analytics, machine learning, and data lakes. Key responsibilities will include developing a new data platform and migrating our legacy Oracle Data Warehouses to it, as the foundation for a key set of offerings running on Oracle Exadata and Cloudera's distribution technology. The Data Architect will continue to support, develop, and drive the data roadmap supporting our systems and business lines.

Primary Skills: Snowflake, Spark, Kafka, and SQL queries

Key Responsibilities Include:

Participate in strategic planning and contribute to the organization's data strategy and roadmap.

Fully understand the current DW systems and the user community's data needs and requirements.

Define the legacy Data Warehouse migration strategy; understand the existing target platform and data management environment.

Facilitate the establishment of a secure data platform on on-prem Cloudera infrastructure.

Document and develop ETL logic and data flows to facilitate easy usage of data assets, both batch and real-time streaming.

Migrate, operationalize, and support the platform.

Manage and provide technical guidance and support to the development team, ensuring best practices and standards are followed.

Qualifications for your role would include:

Bachelor's degree in computer science or a related technical field, or equivalent experience

10+ years of experience in IT, primarily in hands-on development

Strong knowledge of architectural principles, frameworks, design patterns, and industry best practices for design and development

6+ years of real-world DW project experience

Strong hands-on experience with Snowflake

Strong hands-on experience with Spark

Strong hands-on experience with Kafka

Experience with performance tuning of SQL queries and Spark

Experience designing efficient and robust ETL/ELT workflows and schedulers

Experience working with Git, Jira, and Agile methodologies

End-to-end development life-cycle support and SDLC processes

Strong written and verbal communication skills

Strong analytical and problem-solving skills

Self-driven; able to work in teams and independently as required

Nice To Have:

Working experience with Snowflake and AWS/Azure/GCP

Working experience in the financial industry is a plus

Posted: Wed May 08 00:08:00 UTC 2024