
Data Architect (Snowflake + Azure Databricks), Remote, USA
Email: [email protected]
From:

Sandeep Bisht,

Key Infotek

[email protected]

Reply to:   [email protected]

Data Architect (Snowflake + Databricks)

Boston, MA

Local to Boston or willing to relocate to Boston

Onsite, hybrid: 3 days per week (Tuesday through Thursday)

No H1B Transfers at this point.

Primary Skill: Snowflake

Secondary Skill: Databricks

Candidates need to understand both Snowflake and Databricks; the majority of the work is in Snowflake.

Job Description:

This is a hands-on technology position with our esteemed client, seeking a data technology leader with specialized business knowledge in the middle/front office areas. The candidate has a proven record of executing data-on-cloud technology projects, the ability to get hands-on with analysis, design, and development, and the creativity and self-motivation to deliver on mission-critical projects.

As a technical architect on the data architecture team, you will build the next generation of distributed data storage and processing systems, based on industry-leading distributed data design patterns such as data mesh and data products, supporting diverse workloads ranging from ETL and reporting to data science.

What you will be responsible for

Hands-on experience designing and developing data processing patterns that simplify the complexity of real-world data engineering architectures: patterns that are cost-efficient and scalable, and that deliver the performance and reliability of a modern lakehouse with the low latency of streaming.

Work with the client's architecture team to design and develop standard framework modules, high-performance services, and client libraries for big data, using one or more of AWS, Azure, Kubernetes, Spark, SQL/dbt, Databricks, Snowflake, AWS S3, and Azure Blob Storage.

Hands-on experience operationalizing hundreds of data pipelines: deploying, testing, and upgrading pipelines, and applying best practices that eliminate the operational burden of building and managing high-quality data pipelines.

Expertise in query optimizers and execution engines that are fast, tuning-free, and scalable, with ACID transactions and time-travel patterns, using Spark jobs and dbt on Databricks and Snowflake.

Quickly evaluate various technologies and complete POCs to drive architecture design for applications.

Work in a complex environment with multi-location teams.

Work in a team of agile developers.

What we value

These skills will help you succeed in this role:

10+ years of experience on application development teams, with hands-on architecting, design, development, and deployment skills. Demonstrated ability to translate business requirements into a technical design and carry it through to implementation.

Experienced subject matter expert in designing and architecting big data platforms, services, and systems using Java/Python, SQL, Databricks, Snowflake, and cloud-native tools on Azure and AWS.

Experience with event-driven architectures, message hubs, MQ, and Kafka.

Experience with Kubernetes, ETL tools, Data as a Service, star schema, dimensional modeling, OLTP, ACID, and data structures is desired.

Proven experience with cloud and big data platforms, building data processing applications using Spark, Airflow, object storage, etc.

Ability to work in an onshore/offshore model with development teams across continents.

Apply coding standards, secure application development, documentation, and release and configuration management, with expertise in CI/CD.

Well versed in SDLC using Agile Scrum.

Plan and execute the deployment of releases.

Ability to work with Application Development, SQA, and Infrastructure teams.

Strong leadership and analytical problem-solving skills, along with the ability to learn and adapt quickly.

Self-motivated, quick-learning, creative problem solver; organized and able to manage a team of development engineers.

Solid experience delivering large-scale projects in the financial industry.

Education & Preferred Qualifications

Bachelor's degree and 6 or more years of experience in information technology.

Strong team ethic; a team player.

Experience in the financial industry working on financial reporting/regulatory projects is a plus. Domain knowledge of financial industry concepts (Securities 101, Capital Markets 101, etc.) is desirable.

Cloud, Databricks, or Snowflake certification is a plus.

Experience evaluating software, estimating cost and delivery timelines, and managing financials.

Experience leading agile delivery and adhering to SDLC processes is required.

Work closely with business and IT stakeholders to manage delivery.

Additional requirements

Ability to lead delivery, manage team members as required, and provide feedback.

Ability to make effective decisions and manage change.

Communicates effectively and professionally, both in writing and orally.

Team player with a positive attitude, enthusiasm, initiative, and self-motivation.

Sat Jul 22 00:30:00 UTC 2023
