
SCALA PYTHON DEVELOPER at Remote, USA
Email: [email protected]

From: Zara, TechRakers ([email protected])
Reply to: [email protected]
Subject: Scala/Python roles for Availity - 100% remote

Top Skills' Details

1. 8-10+ years of hands-on development experience with Scala and Spark. This role is 100% hands-on development, and Scala is the top programming requirement; the bulk of the work is writing Scala code that runs on Apache Spark (a minimal sketch follows this list).

2. This person must have strong knowledge of database/SQL fundamentals combined with cloud concepts and fundamentals.

3. Database experience required: Oracle, PostgreSQL. Nice to have: Aurora and Redshift.

4. Experience with AWS is required - this team is working on several client implementations in AWS. Must have strong experience with EC2 and Lambda.

5. Airflow experience is required.

6. Must be strong in SQL.

7. Must have a strong background in OOP and programming concepts.
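As a loose illustration of the kind of hands-on Spark/Scala work described in item 1, here is a minimal sketch; the table, column names, connection details, and S3 paths are hypothetical and do not reflect Availity's actual schema or infrastructure.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Minimal sketch of a Spark/Scala batch job; all names and paths are placeholders.
object ProviderClaimSummary {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("provider-claim-summary")
      .getOrCreate()

    // Read a hypothetical claims table from PostgreSQL over JDBC.
    val claims = spark.read
      .format("jdbc")
      .option("url", "jdbc:postgresql://example-host:5432/claims") // placeholder connection
      .option("dbtable", "public.claims")
      .option("user", sys.env.getOrElse("DB_USER", "reader"))
      .option("password", sys.env.getOrElse("DB_PASSWORD", ""))
      .load()

    // Simple SQL-style aggregation: transaction counts and billed totals per provider.
    val summary = claims
      .groupBy(col("provider_id"))
      .agg(count(lit(1)).as("transaction_count"), sum("billed_amount").as("total_billed"))

    // Write the result to S3 as Parquet for downstream consumers.
    summary.write.mode("overwrite").parquet("s3://example-bucket/summaries/provider_claims/")

    spark.stop()
  }
}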

Secondary Skills - Nice to Haves

Amazon Web Services (AWS)

Job Description

At Availity, we're not just another Healthcare Technology company; we're pioneers reshaping the future of healthcare! With our headquarters in vibrant Jacksonville, FL, and an exciting office in Bangalore, India, along with an exceptional remote workforce across the United States, we're a global team united by a powerful mission.

We're on a mission to bring the focus back to what truly matters: patient care. As the leading healthcare engagement platform, we're the heartbeat of an industry that impacts millions. With over 2 million providers connected to health plans and over 13 billion transactions processed annually, our influence is continually expanding.

Join our energetic, dynamic, and forward-thinking team where your ideas are celebrated, innovation is encouraged, and every contribution counts. We're transforming the healthcare landscape, solving communication challenges, and creating connections that empower the nation's premier healthcare ecosystem.

Reporting to the Application Development Manager, the Big Data Software Engineer IV will work on a dedicated team of engineers developing, enhancing, and maintaining Availity's high-transaction Provider Data Management platform.

Sponsorship, in any form, is not available for this position.

Location: Remote US

Why work on this team:

This team supports a high-transaction platform that directly impacts the patient experience

This team is working to continually improve processes and enhance platform capabilities

What you will be doing:

Develop a scalable and resilient cloud data platform and scalable data pipelines.

Ensure industry best practices around data pipelines, metadata management, data quality, data governance, and data privacy.

Build highly scalable AWS infrastructure (from scratch or through third-party products) to enable Big Data processing in the platform.

Optimize cloud resource usage to minimize costs while maintaining system reliability, including leveraging reserved instances and spot instances effectively.

Apply performance-sensitive considerations within development best practices, and troubleshoot across the data platform using tools (e.g., Splunk, New Relic, CloudWatch) to ensure performance measurement and monitoring.

Participate in coding best practices, guidelines and principles that help engineers write clean, efficient, and maintainable code.

Participate in code reviews to catch issues, improve code quality, and provide constructive feedback to team members.

Working on ETL transformation, which includes gathering raw data and files from the client, transforming them into Availity's format, and sending them down the ETL pipeline for further processing (a rough example follows this list)

Working on a team following Agile Scrum principles

Incorporating development best practices

Ensuring your code is efficient, optimized, and performant

Collaborating on programming or development standards

Managing technical debt and applying security principles

Bringing innovative ideas and products to the organization

Performing unit testing and complex debugging to ensure quality

Learning new things & sharing your knowledge with others
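As a rough, hedged example of the ETL transformation step mentioned above: the sketch below reads a raw client file, normalizes it into a canonical layout, and writes it onward for further processing. The client file shape, column names, and S3 paths are assumptions for illustration, not Availity's actual format.

import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._

// Hypothetical sketch: normalize a raw client file into a canonical format
// before handing it to the next pipeline stage. Column names are illustrative only.
object ClientFileNormalizer {
  def normalize(raw: DataFrame): DataFrame =
    raw.select(
      col("PROV_ID").cast("string").as("provider_id"),
      col("NPI").cast("string").as("npi"),
      trim(col("PROV_NAME")).as("provider_name"),
      to_date(col("EFF_DT"), "yyyyMMdd").as("effective_date")
    ).dropDuplicates("provider_id", "effective_date")

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("client-file-normalizer").getOrCreate()

    // Raw client drop (CSV with header) and the standardized output location are placeholders.
    val raw = spark.read.option("header", "true").csv("s3://example-bucket/raw/client_a/providers/")
    normalize(raw).write.mode("overwrite").parquet("s3://example-bucket/standardized/providers/")

    spark.stop()
  }
}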

Requirements:

Bachelor's degree, preferably in Computer Science, Engineering, or another quantitative field

6+ years of related experience in designing and implementing enterprise applications using big data

5+ years of experience in a senior level engineering role mentoring other engineers, which includes engineering best practices, unblocking, code reviews, unit testing, managing deployments, technical guidance, system design, etc.

5+ years of experience working with large-scale data and developing SQL queries

Advanced experience with scripting languages (e.g., Python, Bash, Node.js) and programming languages (e.g., SQL, Java, Scala) to design, build, and maintain complex data processing, ETL (Extract, Transform, Load) tasks, and AWS automation.

5+ years of hands-on experience with big data and AWS cloud services, such as Apache Spark with Scala, AWS EMR, Airflow, and Redshift

4+ years of experience with RESTful APIs and web services

Excellent communication skills, including discussing technical concepts, demonstrating soft skills, conducting peer-programming sessions, and explaining development concepts

In-depth understanding of the Spark framework, scripting languages (e.g., Python, Bash, Node.js), and programming languages (e.g., SQL, Java, Scala) to design, build, and maintain complex data processing, ETL (Extract, Transform, Load) tasks, and AWS automation.

A firm understanding of unit testing.

Possess in-depth knowledge of AWS services and data engineering tools to diagnose and solve complex issues efficiently, specifically AWS EMR for big data processing.

In-depth understanding of Git or other distributed version control systems.

Excellent communication, which is essential to performing at maximum efficiency within the team.

Collaborative attitude. This role is part of a larger, more dynamic team that nurtures collaboration.

Strong technical, process, and problem-solving proficiency.

Thorough understanding of transforming complex data structures, such as nested JSON, XML, Avro, or Parquet, into structured formats suitable for analysis, and of working with large datasets (100 GB or more); see the sketch after this list.

Advanced skills in data cleansing, deduplication, and quality validation to maintain high-quality data.
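As a small hedged sketch of the nested-data transformation called out above: the example below flattens a nested JSON provider record into a tabular layout and stores it as Parquet. The JSON shape, field names, and paths are assumed for illustration and are not an actual Availity format.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Hypothetical sketch: flatten nested JSON (a provider record with an array of
// locations) into a structured layout suitable for analysis, then store it as Parquet.
object FlattenNestedJson {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("flatten-nested-json").getOrCreate()

    // Assumed shape: {"provider_id": "...", "locations": [{"city": "...", "state": "..."}, ...]}
    val nested = spark.read.json("s3://example-bucket/raw/provider_records/")

    val flat = nested
      .withColumn("location", explode(col("locations"))) // one row per nested location
      .select(
        col("provider_id"),
        col("location.city").as("city"),
        col("location.state").as("state")
      )

    flat.write.mode("overwrite").parquet("s3://example-bucket/analytics/provider_locations/")

    spark.stop()
  }
}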
