
REMOTE - Azure Data Architect - Databricks & Snowflake at Snowflake, Arizona, USA
Email: [email protected]
From: Vick Singh, PEOPLE FORCE CONSULTING, INC. ([email protected])
Reply to: [email protected]

As an Azure Data Architect - Databricks & Snowflake, you will be part of an Agile team building healthcare applications and implementing new features while adhering to best coding and development standards.

Experience: 10+ years

Location: Remote (EST time zone)

Tentative Duration: 8-10 weeks+

VISA COPY, PHOTO ID, AND BACHELOR'S DEGREE COPY REQUIRED BY THE CLIENT ALONG WITH EVERY RESUME SUBMISSION.

Educational Qualifications:
Engineering Degree BE/ME/BTech/MTech/BSc/MSc.
Technical certification in multiple technologies is desirable.

Responsibilities:
Design, deploy, and manage scalable, secure, and resilient infrastructure on AWS.
Develop, maintain, and optimize CI/CD pipelines using Jenkins and other CI/CD tools.
Utilize Terraform to define, provision, and manage infrastructure as code.
Write and maintain scripts using Bash and Python to automate tasks and improve efficiency (a minimal scripting sketch follows this list).
Collaborate with development and operations teams to streamline software development and deployment processes.
Implement monitoring solutions to ensure the health and performance of applications and infrastructure.
Ensure security best practices are integrated into infrastructure and application development.
Troubleshoot and resolve infrastructure and application issues promptly.
Continuously evaluate and integrate new tools and technologies to improve DevOps processes.
Document processes, configurations, and best practices for knowledge sharing and compliance.
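
As a rough, non-authoritative illustration of the scripting and CI/CD items above, the following is a minimal sketch of a Python automation step (a post-deployment health check a pipeline stage could gate on). The endpoint URL, retry count, and delay are hypothetical placeholders, not details from this posting.

#!/usr/bin/env python3
"""Illustrative post-deployment health check for a CI/CD pipeline stage."""
import sys
import time
import urllib.request

HEALTH_URL = "https://example-app.internal/health"  # hypothetical endpoint
RETRIES = 5
DELAY_SECONDS = 10


def is_healthy(url: str) -> bool:
    """Return True if the endpoint answers with HTTP 200."""
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            return resp.status == 200
    except OSError:  # covers URLError and timeouts
        return False


def main() -> int:
    # Poll the endpoint a few times before declaring the deployment failed,
    # so the pipeline can block promotion to the next environment.
    for attempt in range(1, RETRIES + 1):
        if is_healthy(HEALTH_URL):
            print(f"attempt {attempt}: healthy")
            return 0
        print(f"attempt {attempt}: not healthy, retrying in {DELAY_SECONDS}s")
        time.sleep(DELAY_SECONDS)
    return 1


if __name__ == "__main__":
    sys.exit(main())

In practice, a script like this would be invoked as one step of a Jenkins (or equivalent) pipeline, with its exit code deciding whether the deployment proceeds.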

Mandatory skills / Required Experience
12+ years of experience in implementing large data and analytics platforms.
Deep experience architecting and building data platforms on Azure with strong experience on Medallion Architecture.
Experience architecting and building data platforms on Databricks from scratch.
Proficient in all Databricks engineering services: Unity Catalog, Spark Jobs, Delta Live Tables (DLT), DLT Meta, Databricks Workflows, and Auto Loader (a minimal pipeline sketch follows this list).
In-depth experience with data governance, data integration, and related technologies.
Hands-on with Databricks SQL and PySpark.
Expertise in data modelling and data mappings; able to understand data model changes and propose best practices and guidelines.
Creating technical designs based on end-to-end data architecture and frameworks for metadata-driven data ingestion, transformation, logging, and monitoring.
Deploying data pipelines/applications to higher environments using CI/CD pipelines.
Experience writing technical user stories and working in Agile methodology and scrums.
Excellent written and verbal communication, presentation, and client-handling skills.
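
As a rough illustration of the Databricks items above (Medallion layering, Delta Live Tables, Auto Loader, PySpark), the following is a minimal sketch of a bronze/silver DLT pipeline. The ADLS landing path, table names, and the event_id column are hypothetical placeholders, not details from the client's requirements.

# Runs inside a Databricks Delta Live Tables pipeline; `spark` is provided by the runtime.
import dlt
from pyspark.sql import functions as F


@dlt.table(comment="Bronze: raw JSON events loaded incrementally with Auto Loader")
def bronze_events():
    # Auto Loader ("cloudFiles") picks up new files from the landing zone incrementally.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("abfss://landing@<storage-account>.dfs.core.windows.net/events/")  # placeholder path
    )


@dlt.table(comment="Silver: cleaned and de-duplicated events")
def silver_events():
    # Basic cleansing step: drop records without a key and remove duplicates.
    return (
        dlt.read_stream("bronze_events")
        .where(F.col("event_id").isNotNull())
        .dropDuplicates(["event_id"])
    )

In a metadata-driven framework of the kind described above, the source path, file format, and data-quality rules would typically be read from configuration tables rather than hard-coded as they are in this sketch.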
