Urgent requirement for Spark Scala Developer (Remote, USA)
Email: [email protected] |
From: Praveen Kumar, Magicforce [email protected]
Reply to: [email protected]
Job Title: Spark Scala Developer
Location: Remote
Duration: 1+ Year

Job Description:
Mandatory Skills: AWS, Spark, Scala, Big Data concepts

- Transform two different tables into a single one using Scala pseudocode.
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Databricks Engineer with a focus on AWS.
- Strong programming skills in languages such as Python or Scala.
- Experience with data engineering tools and technologies, including ETL frameworks.
- Knowledge of cloud computing platforms, especially AWS.
- Familiarity with big data processing frameworks such as Apache Spark.
- Ability to optimize and fine-tune Databricks clusters for performance.
- Excellent problem-solving and troubleshooting skills.
- Strong communication and collaboration skills.

Preferred Qualifications:
- AWS certification in Big Data or a related field.
- Previous experience with data lakes and data warehousing solutions.
- Familiarity with data governance and compliance standards.
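The screening task above, combining two different tables into a single one in Scala, could be sketched as a Spark DataFrame join. This is a minimal illustration only: the table names, schemas, and the `customer_id` join key are hypothetical, not part of the posting.

```scala
import org.apache.spark.sql.SparkSession

object MergeTables {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("MergeTables")
      .master("local[*]") // local mode for illustration only
      .getOrCreate()
    import spark.implicits._

    // Hypothetical input tables sharing a customer_id key.
    val customers = Seq((1, "Alice"), (2, "Bob")).toDF("customer_id", "name")
    val orders    = Seq((1, 250.0), (2, 99.5)).toDF("customer_id", "amount")

    // Inner join on the shared key yields one combined table.
    val combined = customers.join(orders, Seq("customer_id"), "inner")
    combined.show()

    spark.stop()
  }
}
```

In an interview setting the same idea can be expressed as pseudocode: join table A and table B on their common key, then select the columns the combined table needs.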
[email protected] View all |
Wed May 29 01:20:00 UTC 2024