Need Profiles - Sr. Databricks Developer role || Princeton, NJ :: Local Preferred (Princeton, New Jersey, USA)
Email: [email protected]
From: Ashish Kumar, Themesoft Inc ([email protected])
Reply to: [email protected]

Dear Partner,

Hope you are doing well! My name is Ashish Kumar and I'm with Themesoft Inc. I am reaching out because I came across your profile and believe you may have excellent candidates who would be a great fit for a job opportunity we currently have available with one of our clients. We would greatly appreciate your immediate support for this open onsite demand; please find the details below and share profiles accordingly.

Please share local resources only; profiles are needed on priority.

Work Location/Client Location (City, State & Zip Code): Princeton, NJ 08540
Job Title/Role: Databricks Developer
Experience Level Required: 7-10 years
Mandatory Required Skills: Azure Databricks, Apache Spark, Data Modeling, Azure Data Lake creation, Python programming
Preferred/Desired Skills: Predictive analytics, experience with ML libraries

Detailed Job Description

Responsibilities:
- Develop and maintain ETL (Extract, Transform, Load) pipelines using Databricks to process and transform large datasets.
- Collaborate with data engineers and data scientists to design and implement scalable, efficient data processing workflows.
- Build and optimize Apache Spark jobs and clusters on the Databricks platform.
- Develop and maintain data ingestion processes to acquire data from various sources and systems.
- Implement data quality checks and validation procedures to ensure the accuracy and integrity of data.
- Perform data analysis and exploratory data mining to derive insights from complex datasets.
- Design and implement machine learning workflows using Databricks for predictive analytics and model training.
- Troubleshoot and debug issues related to data processing, performance, and job failures.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Stay current with the latest advancements in big data technologies and contribute to the improvement of existing systems and processes.

Requirements:
- Solid experience developing data processing workflows using Apache Spark and Databricks.
- Proficiency in programming languages such as Python, Scala, or SQL for data manipulation and analytics.
- Strong understanding of distributed computing principles and experience with large-scale data processing frameworks.
- Familiarity with cloud platforms such as Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP).
- Experience with data modeling, database systems, and SQL.
- Knowledge of machine learning concepts and experience with ML libraries and frameworks.
- Excellent problem-solving skills and the ability to work independently and in a team.
- Strong communication skills to collaborate with stakeholders from different technical backgrounds.
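For partners screening candidates, the sketch below shows, in rough terms, the kind of PySpark ETL work with a basic data quality check that the responsibilities above describe. It is illustrative only and not part of the client's requirement: the paths, column names, and validation rule are hypothetical placeholders.

# Illustrative sketch only, not client code. Paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl_example").getOrCreate()

# Extract: read raw CSV files landed in a data lake path (placeholder path).
raw = spark.read.option("header", True).csv("/mnt/datalake/raw/orders/")

# Transform: cast the amount column and derive a daily partition key.
orders = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_ts"))
)

# Data quality check: fail fast if any record is missing an order id.
missing_ids = orders.filter(F.col("order_id").isNull()).count()
if missing_ids > 0:
    raise ValueError(f"{missing_ids} rows have a null order_id")

# Load: write partitioned Parquet to the curated zone (placeholder path).
orders.write.mode("overwrite").partitionBy("order_date").parquet(
    "/mnt/datalake/curated/orders/"
)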
Posted: Wed Jul 26 00:10:00 UTC 2023