Databricks Administrator Architect - Raleigh, North Carolina, USA
Email: [email protected]
From: Dheeraj, webgeosolutions [email protected]
Reply to: [email protected]

Greetings from Web Geo Solutions, LLC. We have an exciting new job opportunity with our direct client that matches your skills. Please review the job description and respond with all the requested details if you are interested.

Job Details
JOB ID: NCDOT - Technical Specialist - Junior (749745)
Client: State of NC
Last day to apply: 11/6
Initial Term: 12 Months+
Tentative start date: ASAP
Interview mode: Webcam/In-Person
Work location: Raleigh, NC

Job Description:
**The candidate must come onsite on the first day to collect equipment.**
**All candidates must be local to the Triangle region of North Carolina, and the position may require up to 1-2 days per month in a Triangle-area office for meetings.**

The NCDIT-Transportation Database Team seeks a Databricks Administrator/Architect with proven skills for a 12-month engagement covering the creation, tuning, and support of the Databricks environment. This position will be responsible for developing and designing the Databricks environment at NCDIT-T. This individual will work with internal staff to plan, design, and maintain the Databricks environment and recommend the changes needed to accommodate growth as our business needs dictate. This individual will facilitate changes through DIT-T's change process and work very closely with the DBA and development staff on all aspects of the design and planning of the Databricks environment.

Responsibilities:
*Provide mentorship, guidance, overall knowledge sharing, and support to team members, promoting continuous learning and development.
*Oversee the design, implementation, and maintenance of Databricks clusters.
*Ensure the platform's scalability, performance, and security.
*Provide escalated support and troubleshooting to users.
*Oversee maintenance of role-based access to data and features in the Databricks platform using Unity Catalog.
*Review cluster health checks and best-practices implementation.
*Review and maintain documentation for users and administrators.
*Design and implement tailored data solutions to meet customer needs and use cases, spanning ingesting data from APIs, building data pipelines, analytics, and beyond within a dynamically evolving technical stack.
*Work on projects involving on-prem data ingestion into Azure using ADF.
*Build data pipelines based on the medallion architecture that clean, transform, and aggregate data from disparate sources.

If you are interested, please reply with the details below for further evaluation.

Please send all of the following requested information for consideration:
Contact details (and employer details if C2C)
Your full legal name (must match your state-issued ID or passport)
Phone number
Personal email ID (the same ID should be used to send the E-RTR after submission)
Desired hourly rate (rates above the max advertised rate will not be considered): W2: $ / C2C: $
Work status (H1B, EAD, GC, US Citizen); please attach a copy of your work authorization
If H1B, who holds the visa (contact details; please include name, phone number, and email)
If selected, how soon would you be available to start?
Have you ever worked for this client? If so, please provide the team name, project name, reason for leaving, and when the engagement ended.
Upon selection, employment is subject to a 7-year national criminal background check.
Will you be able to attend an in-person interview? (Yes or No)
Your current address
Are you available to be on-site at the client's location for the duration of the project?
Note: Payment for all approved hours will be paid at the straight hourly rate regardless of the total hours worked (regular and overtime hours) by the engaged resource. Do you accept this requirement?
Upon selection, the candidate must be able to provide three (3) references from prior engagements before a final decision is made. Do you accept this requirement?
LinkedIn profile URL

Skill Matrix: NCDOT - Technical Specialist - Junior (749745)
Skill | Required/Desired | Amount in years | Your actual experience in years (please fill in all)
Extensive hands-on experience implementing Lakehouse architecture using the Databricks Data Engineering platform, SQL Analytics, Delta Lake, and Unity Catalog | Required | 5 |
Strong understanding of relational & dimensional modeling | Required | 5 |
Demonstrated proficiency in coding skills (Python, SQL, and PySpark) to efficiently prioritize performance, security, scalability, and robust data integrations | Required | 6 |
Experience implementing serverless real-time/near-real-time architectures using a cloud tech stack (i.e., Azure, AWS, or GCP) and Spark technologies (Streaming & ML) | Required | 2 |
Experience with Azure infrastructure configuration (networking); architecting and building large data ingestion pipelines and conducting data migrations using ADF or similar technologies | Required | 4 |
Experience working with SQL Server features such as SSIS and CDC | Required | 7 |
Experience with the Databricks platform, security features, Unity Catalog, and data access control mechanisms | Required | 2 |
Experience with Git code versioning software | Required | 4 |
Databricks certifications | Desired | |

Please also send your resume in MS Word format, a copy of any relevant certifications, and a copy of your work authorization (H1B/GC/EAD).

Please check the other positions that are currently open with several NC State agencies: http://www.indeed.com/cmp/Web-Geo-Solutions,-LLC/jobs