
Databricks Architect at Raleigh, North Carolina, USA
Email: [email protected]
From: Shivani Sharma, TEK Inspirations ([email protected])

Reply to: [email protected]

Title: Databricks Architect

Location: Raleigh, NC (Hybrid); local candidates only

Databricks Certifications are a MUST

Job Description

**The candidate must come onsite on the first day to collect equipment.**

**All candidates must be local to the Triangle region of North Carolina, and the position may require up to 1-2 days per month in a Triangle-area office for meetings.**

The NCDIT-Transportation Database Team seeks a Databricks Administrator/Architect with proven skills for a 12-month engagement covering the creation, tuning, and support of the Databricks environment. This position is responsible for designing and developing the Databricks environment at NCDIT-T. This individual will work with internal staff to plan, design, and maintain the Databricks environment and will recommend the changes needed to accommodate growth as our business needs dictate. This individual will facilitate changes through DIT-T's change process and will work very closely with the DBA and development staff on all aspects of the design and planning of the Databricks environment.

Responsibilities:

Provide mentorship, guidance, overall knowledge share, and support to team members, promoting continuous learning and development.

Oversee the design, implementation, and maintenance of Databricks clusters.

Ensure the platform's scalability, performance, and security.

Provide escalated support and troubleshooting to users.

Oversee maintenance of role-based access to data and features in the Databricks Platform using Unity Catalog.

Review cluster health checks and the implementation of best practices.

Review and maintain documentation for users and administrators.

Design and implement tailored data solutions to meet customer needs and use cases, spanning data ingestion from APIs, data pipeline development, analytics, and beyond, within a dynamically evolving technical stack.

Work on projects involving on-prem data ingestion into Azure using ADF.

Build data pipelines based on the medallion architecture that clean, transform, and aggregate data from disparate sources (see the sketch below).
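To make the medallion reference above concrete, here is a minimal PySpark sketch of a bronze/silver/gold pipeline of the kind that ADF-landed, on-prem data typically feeds. Every path, table name, and column in it (for example /mnt/raw/events/, bronze.raw_events, event_id, event_ts) is an illustrative assumption rather than a detail from this posting.

```python
# Minimal medallion-architecture sketch (illustrative only).
# All paths, table names, and columns below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: land the raw feed as-is (e.g., files copied in by an ADF pipeline),
# adding only ingestion metadata.
bronze = (
    spark.read.json("/mnt/raw/events/")  # hypothetical landing path
    .withColumn("_ingested_at", F.current_timestamp())
)
bronze.write.format("delta").mode("append").saveAsTable("bronze.raw_events")

# Silver: clean and conform (de-duplicate, enforce types, drop bad rows).
silver = (
    spark.table("bronze.raw_events")
    .dropDuplicates(["event_id"])
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .filter(F.col("event_ts").isNotNull())
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.events_clean")

# Gold: aggregate for analytics and reporting consumers.
gold = (
    spark.table("silver.events_clean")
    .groupBy(F.to_date("event_ts").alias("event_date"), "event_type")
    .agg(F.count("*").alias("event_count"))
)
gold.write.format("delta").mode("overwrite").saveAsTable("gold.events_daily")
```

Each layer is persisted as a Delta table: bronze preserves the raw feed, silver applies de-duplication and type enforcement, and gold exposes aggregates ready for analytics.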

Skill | Required / Desired | Required Experience (years)
Extensive hands-on experience implementing Lakehouse architecture using the Databricks Data Engineering platform, SQL Analytics, Delta Lake, and Unity Catalog | Required | 5
Strong understanding of relational and dimensional modeling | Required | 5
Proficiency in Python, SQL, and PySpark, with an emphasis on performance, security, scalability, and robust data integrations | Required | 6
Experience implementing serverless real-time/near-real-time architectures on a cloud stack (Azure, AWS, or GCP) using Spark technologies (Streaming and ML) | Required | 2
Experience with Azure infrastructure configuration (networking), and with architecting and building large data ingestion pipelines and data migrations using ADF or similar technology | Required | 4
Experience with SQL Server features such as SSIS and CDC | Required | 7
Experience with the Databricks platform, its security features, Unity Catalog, and data access control mechanisms (see the sketch after this table) | Required | 2
Experience with Git version-control software | Required | 4
Databricks Certifications | Desired |
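For the Unity Catalog and data-access-control requirement above, here is a minimal sketch of role-based access granted through Unity Catalog SQL statements run from a notebook. The catalog, schema, table, and group names (main, silver, events_clean, data-analysts) are hypothetical.

```python
# Minimal Unity Catalog access-control sketch (illustrative only).
# Catalog, schema, table, and group names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

grants = [
    # The group needs USE privileges at each level of catalog.schema.table ...
    "GRANT USE CATALOG ON CATALOG main TO `data-analysts`",
    "GRANT USE SCHEMA ON SCHEMA main.silver TO `data-analysts`",
    # ... plus a table-level privilege for read-only access.
    "GRANT SELECT ON TABLE main.silver.events_clean TO `data-analysts`",
]

for stmt in grants:
    spark.sql(stmt)
```

The three grants mirror Unity Catalog's three-level securable model: a principal needs USE CATALOG and USE SCHEMA on the parents plus a table-level privilege such as SELECT before it can read the data.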
