Scala Spark Developer at Remote, USA
Email: [email protected]
From: Malini, Smacforce Consulting [email protected]
Reply to: [email protected]
Title: Scala / Spark Developer
Location: Onsite, New York City
Duration: 3-5 Years

Job Requirement Details:
We are looking for experts able to build large modules in a compute platform using Hadoop and Hive, with heavy use of in-memory data structures. We are looking for 2 strong hands-on developers for my team in NYC, working 2-3 days onsite and the other days remote. Working hours are strictly 9am-5pm while onsite.

Scala, Spark, or PySpark programming is mandatory: at least 2-4 years of proven hands-on expertise in Scala and/or Spark or PySpark, along with some SQL knowledge and experience working in a compute platform based on Hadoop and Hive. DevOps exposure sufficient to independently integrate and tag/package code is critical as well.

Tools: Jenkins, Zookeeper, YAML, Autosys/Airflow/Control-M or any other scheduler

NOTE: Do NOT pursue profiles like the following, since the client is not interested in them. They are particular about Scala or Spark programming.
1. Cloud Engineer (AWS, Azure, or any other)
2. ETL pipeline automation engineer (using Jupyter Notebook/Airflow/Databricks compute jars, etc.)
3. Spark SQL, Snowflake, or Databricks programmer
4. Python programmer
5. Java programmer (or advanced Java with in-memory API usage)

Malini
[email protected]
267-649-4530

Keywords: Scala Spark Developer [email protected]
Posted: Tue Oct 29 18:35:00 UTC 2024