
Data Engineer AWS Pyspark with Python at Remote, Remote, USA
Email: [email protected]
From:

Rishav Verma,

Tanisha Systems Inc

[email protected]

Reply to: [email protected]

Job Type: Contract / Full-time

Job Title: AWS PySpark with Python

Job Location: Plano, TX / Wilmington, DE (5 days onsite; no remote)

Experience: 12+ years; local candidates only

Note: An in-person interview is mandatory.

Job Description:

Duties and responsibilities

Mandatory Skills:

5+ years of experience in a data engineering position

Proficiency in Python (or a similar language) and SQL

Strong experience building data pipelines with Spark

Strong verbal and written communication skills

Strong analytical and problem-solving skills

Experience with relational datastores, NoSQL datastores, and cloud object stores

Experience building data processing infrastructure in AWS

Bonus: Experience with infrastructure as code solutions, preferably Terraform

Bonus: Cloud certification

Bonus: Production experience with ACID-compliant table formats such as Hudi, Iceberg, or Delta Lake

Bonus: Familiarity with data observability solutions and data governance frameworks

Requirements

Bachelor's degree in Computer Science, Programming, or a similar field is preferred

Right to work

Must have the legal right to work in the USA

Job responsibilities

Required qualifications, capabilities, and skills

Formal training or certification in software engineering concepts and 10+ years of applied experience

Hands-on practical experience delivering system design, application development, testing, and operational stability

Advanced proficiency in one or more programming languages: Java, Python, or Go

A strong understanding of business technology drivers and their impact on architecture design, performance, monitoring, and best practices

Experience designing and building web environments on AWS, including services such as EC2, ALB, NLB, Aurora PostgreSQL, DynamoDB, EKS, ECS Fargate, MFTS, SQS/SNS, S3, and Route 53

Advanced knowledge of modern technologies such as Java 8+, Spring Boot, RESTful microservices, AWS or Cloud Foundry, and Kubernetes

Experience using DevOps tools in a cloud environment, such as Ansible, Artifactory, Docker, GitHub, Jenkins, Kubernetes, Maven, and SonarQube

Experience writing Infrastructure-as-Code (IaC) and Environment-as-Code (EaC) using tools such as CloudFormation or Terraform

Experience with high-volume, SLA-critical applications and with building on messaging and/or event-driven architectures

Deep understanding of the financial industry and its IT systems

Preferred qualifications, capabilities, and skills

Expertise in one or more programming languages, preferably Java

AWS Associate-level certification (Developer, Solutions Architect, or DevOps)

Experience building AWS infrastructure such as EKS, EC2, ECS, S3, DynamoDB, RDS, MFTS, Route 53, ALB, and NLB

Experience with high-volume, mission-critical applications and with building on messaging and/or event-driven architectures using Apache Kafka

Experience with logging, observability, and monitoring tools such as Splunk, Datadog, Dynatrace, CloudWatch, or Grafana

Experience with automation and continuous-delivery methods using shell scripts, Gradle, Maven, Jenkins, and Spinnaker

Experience with microservices architectures and high-volume, SLA-critical applications, including their interdependencies with other applications, microservices, and databases

Experience developing processes, tooling, and methods to improve operational maturity

Tue Oct 15 01:30:00 UTC 2024


