
Urgent Requirement for Kafka Engineers/AWS/Terraform || Must Have AWS/MSK, Terraform || No H1b || 100% Remote (USA)
Email: [email protected]
Hi,
I hope you are doing well.
Please find the JD below.

Title: Kafka Engineers/AWS/Terraform
Contract: 1 Year+
Location: Remote 100%

Candidates can work 100% remote; EST or CST time zones preferred.

PLEASE ONLY SEND ME SENIOR KAFKA ENGINEERS, NOT DEVOPS. THIS IS A DATA-LAYER PROJECT, NOT CI/CD.

We need two senior (10+ years) Apache Kafka developers with experience in AWS MSK (Amazon Managed Streaming for Apache Kafka), well-versed in MSK Connect and Kafka Connect for RDBMS integration using JDBC connectors, and with hands-on Terraform experience.
They must also have experience with AWS services (EC2, ELB, RDS, Route53, S3, VPC, VPN, TGW).

Candidates Must Have:
Kafka
AWS/MSK
Terraform
AWS services (EC2, ELB, RDS, Route53, S3, VPC, VPN, TGW)
Heavy Data experience

Interview Process: Video

Job Description:
Minimum of 10 years of IT experience;
Minimum of 7 years of hands-on experience within a data engineering organization;
Experience designing and operating solutions with relational Databases (Oracle, Postgres, SQL Server);
Self-motivated and able to work with minimal supervision;
Familiar with the creation of different types of documentation (functional, process, test plans, etc.);
Experience in AWS MSK (Amazon Managed Streaming for Apache Kafka);
Well-versed in MSK Connect and Kafka Connect for RDBMS integration using JDBC connectors;
Hands-on experience in Terraform;
Experience in primary AWS services (EC2, ELB, RDS, Route53, S3, VPC, VPN, TGW);
Strong AWS skills;
Strong track record of implementing AWS services in a variety of distributed computing environments;
Ability to install, maintain and troubleshoot Kafka;
Experience working with Apache Kafka, including designing, developing, and managing data streams; troubleshooting; Kafka internals; cluster design; optimization; monitoring; schema registry and schema evolution; the REST API; and related components;
Extensive experience with messaging and stream processing on Kafka;
In-depth knowledge of the Kafka ecosystem and its surrounding functionality;
Excellent understanding of SQL and Python for data processing;
Understanding of Data Modeling concepts and RDBMS Database design;
Ability to work with a variety of platforms and application stacks;
Experience working in an Agile/Scrum development process;
Bachelor's Degree in Computer Science or Software Engineering.
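For illustration only, the MSK-plus-Terraform requirement above typically means provisioning clusters with the Terraform AWS provider; a minimal sketch follows (the cluster name, Kafka version, subnet IDs, and security group ID are hypothetical placeholders, not details from this posting):

```hcl
# Minimal AWS MSK cluster sketch (Terraform AWS provider).
# All identifiers below are placeholders for illustration.
resource "aws_msk_cluster" "example" {
  cluster_name           = "data-streams"   # placeholder name
  kafka_version          = "3.5.1"          # assumed supported MSK version
  number_of_broker_nodes = 3

  broker_node_group_info {
    instance_type   = "kafka.m5.large"
    client_subnets  = ["subnet-aaa", "subnet-bbb", "subnet-ccc"]  # placeholders
    security_groups = ["sg-example"]                              # placeholder

    storage_info {
      ebs_storage_info {
        volume_size = 100  # GiB per broker
      }
    }
  }
}
```

In practice the same Terraform configuration would also declare MSK Connect connectors (e.g., JDBC source/sink connectors against Oracle, Postgres, or SQL Server) alongside the cluster.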

Key Must Haves:
Experience in primary AWS services (EC2, ELB, RDS, Route53, S3, VPC, VPN, TGW)
Experience designing and operating solutions with relational Databases (Oracle, Postgres, SQL Server)
Strong track record of implementing AWS services in a variety of distributed computing environments.

Kind Regards,
Sugam Saurav
Talent Acquisition
[email protected]
https://www.linkedin.com/in/sugam-saurav-b173081b3/
Intellicept Inc - A Division of McKinsol Consulting

Posted: Wed Sep 06 01:00:00 UTC 2023


