Databricks Developer at Remote, USA
Email: [email protected]
From: Pallavi, nss [email protected]
Reply to: [email protected]

C2C - Databricks Developer (100% Remote) - 10+ years' experience. Candidates must have a LinkedIn profile created before 2014 and must be willing to share their passport number.

Please find the job description below and let me know.

Must haves:
- Apache Spark: deep understanding of Spark's core concepts and performance optimization, and the ability to develop efficient data-processing jobs.
- Databricks data catalog: create tables from S3 and make them available in Unity Catalog.
- Python and SQL; knowledge of R is a plus.
- Hands-on experience with the Databricks platform, including Databricks SQL, Delta Lake, and Databricks Workspaces.
- Experience with AWS services and an understanding of how to integrate them with Databricks.
- Version control: proficiency with version control systems such as Git for code management and collaboration.

Great to haves:
- DevOps and CI/CD: understanding of DevOps principles and experience with CI/CD pipelines to automate testing and deployment of Databricks jobs.
- Data engineering: experience designing and implementing scalable, reliable data pipelines; understanding of ETL processes; familiarity with data modeling techniques.
- Machine learning and data science: knowledge of machine learning algorithms and data science principles; experience with MLflow for managing the machine learning lifecycle.

Job Type: Contract
Pay: $50.00 - $55.00 per hour
Expected hours: 40 per week
Experience level: 10 years, 11+ years
Schedule: Day shift

Application Question(s): LinkedIn ID

Experience:
- Spark: 10 years (preferred)
- Python: 10 years (preferred)
- Microsoft SQL Server: 8 years (preferred)
- AWS: 6 years (preferred)
- Databricks: 6 years (preferred)

Work Location: Remote
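The "create tables from S3 and make them available in Unity Catalog" requirement usually comes down to registering existing cloud data as an external table with a short DDL statement. A minimal Databricks SQL sketch follows; the catalog, schema, table, and bucket names are hypothetical placeholders, and the statement assumes the workspace already has a storage credential and external location granting access to the bucket:

```sql
-- Register existing Delta data in S3 as an external table in Unity Catalog.
-- 'main.analytics.events' and the S3 path are placeholder names.
CREATE TABLE IF NOT EXISTS main.analytics.events
USING DELTA
LOCATION 's3://example-bucket/events/';

-- Confirm the table is now visible in the catalog.
SHOW TABLES IN main.analytics;
```

Because the table is external, dropping it from Unity Catalog would not delete the underlying S3 data; only the catalog entry is removed.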
Sun May 26 04:10:00 UTC 2024