Urgent requirement for Big Data Developer with Spark Scala at Remote, USA |
Email: [email protected] |
From: Praveen Kumar, Magicforce ([email protected])
Reply to: [email protected]
Job Title: Spark Scala Developer
Location: Remote
Duration: 1+ Year
Mandatory Skills: AWS, Spark, Scala, Big Data concepts
Screening Task: Transform two different tables into a single one using Scala pseudo code.

Job Description:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Databricks Engineer with a focus on AWS.
- Strong programming skills in languages such as Python or Scala.
- Experience with data engineering tools and technologies, including ETL frameworks.
- Knowledge of cloud computing platforms, especially AWS.
- Familiarity with big data processing frameworks such as Apache Spark.
- Ability to optimize and fine-tune Databricks clusters for performance.
- Excellent problem-solving and troubleshooting skills.
- Strong communication and collaboration skills.

Preferred Qualifications:
- AWS certification in Big Data or a related field.
- Previous experience with data lakes and data warehousing solutions.
- Familiarity with data governance and compliance standards.
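The screening task asks for Scala pseudo code that transforms two different tables into a single one. A minimal Spark sketch of one way to do this, assuming the two tables share a common key column (the table names `customers` and `orders` and the key `customer_id` are hypothetical, chosen only for illustration):

```scala
import org.apache.spark.sql.SparkSession

object MergeTables {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("MergeTwoTables")
      .master("local[*]") // local mode for the sketch; a real job would run on a cluster
      .getOrCreate()
    import spark.implicits._

    // Hypothetical sample tables; in practice these would be read from
    // S3 or Databricks tables (e.g. spark.read.parquet(...)).
    val customers = Seq((1, "Alice"), (2, "Bob")).toDF("customer_id", "name")
    val orders    = Seq((1, 250.0), (2, 99.5), (1, 40.0)).toDF("customer_id", "amount")

    // Join the two tables on the shared key to produce a single table.
    val merged = customers.join(orders, Seq("customer_id"), "inner")

    merged.show()
    spark.stop()
  }
}
```

If the two tables instead share the same schema (same columns rather than a common key), `customers.unionByName(orders)` would stack their rows into one table; which operation fits depends on how the interviewer frames the task.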
Tue Jun 04 20:05:00 UTC 2024 |