Urgent requirement for BIG DATA Developer with Spark, Scala (Remote, USA)
Email: [email protected]
From: Praveen Kumar, Magicforce [email protected]
Reply to: [email protected]
Job Title: BIG DATA Developer with Spark, Scala
Location: Remote
Duration: 1+ Year
Mandatory Skills: Spark, Scala, Big Data concepts
Task: Transform two different tables into a single one using Scala pseudocode (a sketch follows after this posting).

Job Description:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Databricks Engineer with a focus on AWS.
- Strong programming skills in languages such as Python or Scala.
- Experience with data engineering tools and technologies, including ETL frameworks.
- Knowledge of cloud computing platforms, especially AWS.
- Familiarity with big data processing frameworks such as Apache Spark.
- Ability to optimize and fine-tune Databricks clusters for performance.
- Excellent problem-solving and troubleshooting skills.
- Strong communication and collaboration skills.

Preferred Qualifications:
- AWS certification in Big Data or a related field.
- Previous experience with data lakes and data warehousing solutions.
- Familiarity with data governance and compliance standards.

Keywords: Urgent requirement for BIG DATA Developer with Spark, Scala at Remote [email protected]
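The posting asks for Scala pseudocode that combines two different tables into a single one. Below is a minimal Spark-in-Scala sketch of one common way to do this (an inner join on a shared key). The table names (customers, orders), column names, join key (customer_id), and output table name are illustrative assumptions, not details from the posting.

// Minimal sketch: combine two tables into one with Spark in Scala.
// Table names, columns, and the join key are assumptions for illustration.
import org.apache.spark.sql.SparkSession

object CombineTables {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("CombineTables")
      .getOrCreate()

    // Read the two source tables (assumed to be registered in the catalog).
    val customers = spark.table("customers")   // e.g. customer_id, name
    val orders    = spark.table("orders")      // e.g. order_id, customer_id, amount

    // Join on the shared key to produce a single combined table.
    val combined = customers.join(orders, Seq("customer_id"), "inner")

    // Persist the result as one table.
    combined.write.mode("overwrite").saveAsTable("customer_orders")

    spark.stop()
  }
}

If the two tables instead share the same schema, a union (customers.unionByName(orders)) would be the simpler way to stack them into one table.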
Wed Jun 05 18:47:00 UTC 2024