Immediate Hiring: Infrastructure Azure Databricks Operations Engineer || Contract at Remote, USA
Email: [email protected]
From: Dharani, VDart [email protected] Reply to: [email protected]

Hello,

Hope you are doing well. Please find the job description below, and kindly let me know if you are interested.

Title: Infrastructure Azure Databricks Operations Engineer
Duration: Contract
Location: Remote

Job Description:

Qualifications
- Extensive hands-on experience implementing lakehouse architecture using the Databricks Data Engineering platform, SQL Analytics, Delta Lake, and Unity Catalog
- Extensive hands-on experience implementing enterprise-grade cloud Databricks infrastructure (i.e., Azure stack), RBAC, SCIM, etc., using IaC tools like Terraform
- Extensive hands-on experience implementing serverless real-time/near-real-time architecture using cloud-native services (i.e., Azure stack) and Spark technologies (e.g., Spark Streaming, Spark ML)
- Experience in data governance fields such as lineage, quality, and catalog; multi-cloud experience, etc.
- 5+ years of experience in Databricks engineering solutions on one of the cloud platforms (Azure)
- 5+ years of data architecture and delivery experience on one of the cloud platforms (Azure)
- 5+ years of demonstrated experience in Microsoft Azure cloud solutions, architecture, and related technologies
- Demonstrable experience working on transformational projects implementing data platforms and/or ETL/ELT/CDC pipelines using Databricks
- Experience conceptualizing and architecting data lakes on cloud-centric platforms; able to articulate the value and key components of data lakes
- Experience with Databricks Overwatch
- Excellent interpersonal, verbal, and written communication skills
- Passionate about learning new technologies

Responsibilities
- Implement cloud (Azure) Delta lakehouse platform infrastructure and RBAC using IaC (Terraform) per industry best practices
- Work with architecture/security teams to remediate infrastructure-related concerns and recommendations
- Work with clients to define their cloud data platform and implement cutting-edge data solutions that provide valuable business insights, including AI/ML
- Drive analysis, architecture, design, governance, and development of data warehouse, data lake, and business intelligence solutions
- Research, synthesize, recommend, and select technical approaches for solving difficult migration, development, and integration problems
- Understand Python, Scala, and SQL code; troubleshoot Databricks, database, and Azure connectivity and access-related issues
- Conduct and support workshops, design sessions, and project meetings as needed, playing a key role in client relations
- Utilize Lakehouse Fundamentals, Developer Foundations Capstone, Developer Essentials Capstone, etc.
- Collaborate with application teams/business users to develop new pipelines using cloud data migration methodologies and processes, including tools like Azure Data Factory, Event Hub, etc.

Shift: On-call 24x7 support. Work hours will not exceed 80 hours in 2 weeks, with comp time off given for after-hours and weekend work. Comp time must be taken in the second week to ensure the total across the 2 weeks does not exceed 80 hours (there is no extra $ to bill).

Waiting for your response. Please share a time when I can reach you directly by phone.

Best Regards,
Dharani K
Technical Recruiter
VDart Inc
Email: [email protected]
LinkedIn: http://www.linkedin.com/in/dharani-krishnasamy-396116178/

Keywords: artificial intelligence machine learning
Tue Jan 23 01:47:00 UTC 2024