Databricks Platform Engineer - Remote, USA
Email: [email protected]
From: Shayista, Resource Logistics ([email protected])
Reply to: [email protected]

Title: Databricks Platform Engineer
Duration: 9+ months
Location: Remote, with travel on an as-needed basis

Must-have skills: Terraform, Python, Databricks platform administration, Unity Catalog, Azure (platform experience), integration/configuration of Databricks with GitHub, Jenkins, and reporting tools such as Tableau, and security and governance of the Databricks platform.

As a Senior Databricks Platform Engineer, you will have the opportunity to build the Databricks platform from the ground up. You'll put your Databricks knowledge and experience to work as you collaborate with teams of Data Engineers, Data Scientists, and Analysts to build a world-class data platform. You will lead the Databricks platform to deliver services for projects and users, including tracking the Databricks roadmap for new features and evaluating public/private preview offerings for future use.

What you'll do as a Senior Databricks Platform Engineer:
- Automate and manage provisioning needs, such as Databricks cluster policies, Unity Catalog, secrets management, and the role-based access control model and permissions (a minimal automation sketch follows this posting).
- Tune Databricks for performance and utilization optimization.
- Provide infrastructure guidance on Databricks capabilities to accommodate business and technical use cases.
- Leverage your strong communication skills to keep users informed and provide an excellent quality of service.
- Monitor and manage data and processes to control cost and optimize compute spend.
- Configure and manage monitoring/alerting around latency and performance on the Databricks platform, including Delta Live Tables and Databricks Workflows.
- Coordinate and collaborate with dependent infrastructure and Azure services to implement Databricks integration with services such as ADLS, access management, and SSO.
- Support CI/CD processes, including Jenkins and GitHub.
- Plan, implement, and test disaster recovery protocols.
- Define standards and practices for platform use, with a focus on Data Engineering use cases.
- Provide technical expertise and troubleshooting, as well as support for change management, governance, compliance, internal audits, and remediations.

What you'll bring:
- A proven track record of administering, engineering, and operationalizing the Databricks platform in production at scale for Data Engineering, Data Science, or MLOps use cases.
- Experience working in Azure.
- Strong experience with infrastructure-as-code technologies (e.g., Terraform).
- Strong experience with Python.
- Experience integrating other platforms, such as Snowflake, Tableau, and MongoDB, with Databricks.
- Experience with streaming use cases and Databricks streaming technologies, including Delta Live Tables.
- The ability to leverage new technologies to test, build, and optimize data platform processes, pipelines, transformations, architectures, and data sets.
- Expert-level SQL knowledge in a variety of data engines (for example, Snowflake, MySQL, Cosmos DB, SQL Server) is a big plus.
- Excellent communication and interpersonal skills to effectively communicate and collaborate with both business and technical teams.
- Experience with data logging/monitoring tools (e.g., Datadog, Splunk) is preferred.
- Experience with MLflow is a plus.
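The provisioning and governance work described above is typically automated in code rather than done by hand in the workspace UI. Below is a minimal sketch, assuming the databricks-sdk Python package and a workspace configured through the DATABRICKS_HOST and DATABRICKS_TOKEN environment variables; the policy name "team-jobs-small", the catalog "analytics", and the group "data-engineers" are illustrative placeholders, not part of this listing.

    import json

    from databricks.sdk import WorkspaceClient
    from databricks.sdk.service.catalog import PermissionsChange, Privilege, SecurableType

    # Authenticates from DATABRICKS_HOST / DATABRICKS_TOKEN (or ~/.databrickscfg).
    w = WorkspaceClient()

    # 1) Cluster policy: force autotermination and cap cluster size for team job clusters.
    #    The policy definition keys and types follow the Databricks cluster-policy schema.
    definition = json.dumps({
        "autotermination_minutes": {"type": "range", "maxValue": 60, "defaultValue": 30},
        "num_workers": {"type": "range", "maxValue": 8},
    })
    policy = w.cluster_policies.create(name="team-jobs-small", definition=definition)
    print(f"Created cluster policy {policy.policy_id}")

    # 2) Unity Catalog: grant a workspace group read access on a catalog.
    #    Catalog and group names here are hypothetical examples.
    w.grants.update(
        securable_type=SecurableType.CATALOG,
        full_name="analytics",
        changes=[
            PermissionsChange(
                principal="data-engineers",
                add=[Privilege.USE_CATALOG, Privilege.USE_SCHEMA, Privilege.SELECT],
            )
        ],
    )
    print("Granted catalog access on 'analytics' to group 'data-engineers'")

In practice, and in line with the Terraform requirement above, this kind of provisioning is more often expressed declaratively with the Databricks Terraform provider so that cluster policies, grants, and secrets scopes live in version-controlled infrastructure-as-code; the Python sketch simply illustrates the same operations imperatively.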