Requirement of Software Engineer (Spark) at Remote, USA
Email: [email protected]
Job title: Software Engineer
Work Location: LOWE'S Tech Hub

Requirements:
4+ years of experience in software development using Java and PySpark
2+ years of experience in designing, developing, and optimizing data pipelines and jobs using PySpark and the Apache Spark framework
2+ years of experience in application support and maintenance of PySpark applications
2+ years of experience in designing and implementing data workflows with Apache Airflow
2+ years of experience in performance tuning techniques for large-scale data processing
2+ years of experience in handling implementations involving data storage and database querying using Spark SQL and PostgreSQL

Additional Requirements:
Adherence to clean coding principles: candidates should be able to produce code that is free of bugs and can be easily understood and maintained by other developers.
Strong teamwork abilities: Apache Spark developers typically collaborate closely with data scientists and other backend developers, so candidates should exhibit excellent communication and collaboration skills.

Please share your resume with [email protected]

Thanks & Regards
Anusha
Smart Info Solutions
Posted: Tue Jun 11 20:42:00 UTC 2024