
Role: Big Data Hadoop (Must Have Kafka) || $52 || 10+ exp || Tempe, AZ (Onsite) - C2C at Tempe, Arizona, USA
Email: [email protected]
Position description:
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=2254333&uid=


Hi,

My name is Deepak Rajput and I represent Teamware Solutions Inc. I came across the below job opportunity. Kindly go through the job details and let me know if you would be interested.

Role: Big Data Hadoop (Must Have Kafka)

Location: Tempe, AZ (Onsite)

Duration: 12+ Months

Profile: 11+ years (please do not share profiles below 10 years)

Role name: Developer

Role Description:

Independently identify and rectify Kafka messaging issues within a justified time frame. Work with the business and IT teams to understand business problems, and to design, implement, and deliver appropriate solutions using Agile methodology across the larger program. Work independently to implement solutions on multiple platforms (DEV, QA, UAT, PROD). Provide technical direction, guidance, and reviews to other engineers working on the same project. Administer distributed Kafka clusters in DEV, QA, UAT, and PROD environments and troubleshoot performance issues (a cluster-health sketch follows below). Implement and debug subsystems/microservices and components. Follow an automate-first/automate-everything philosophy. Hands-on in programming languages.
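
As a rough illustration of the administration duties above (not part of the client's requirements), the sketch below uses Kafka's Java AdminClient to check basic cluster health, typically the first step when troubleshooting a performance issue. The broker address "localhost:9092" and the class name ClusterCheck are placeholders.

    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.DescribeClusterResult;
    import org.apache.kafka.common.Node;

    import java.util.Properties;
    import java.util.concurrent.ExecutionException;

    public class ClusterCheck {
        public static void main(String[] args) throws ExecutionException, InterruptedException {
            Properties props = new Properties();
            // Placeholder broker address; point at the DEV/QA/UAT/PROD bootstrap servers
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

            try (AdminClient admin = AdminClient.create(props)) {
                DescribeClusterResult cluster = admin.describeCluster();
                // Cluster id and current controller are quick sanity checks
                System.out.println("Cluster id: " + cluster.clusterId().get());
                System.out.println("Controller: " + cluster.controller().get());
                // Every expected broker should appear in this list
                for (Node node : cluster.nodes().get()) {
                    System.out.println("Broker " + node.idString() + " @ " + node.host() + ":" + node.port());
                }
            }
        }
    }

The same AdminClient also exposes describeTopics and listConsumerGroups, which help spot under-replicated partitions and lagging consumers.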

Competencies: Digital: Kafka

Experience (Years): 4-6

Essential Skills:

Key skills required for a Kafka Developer:
- Deep understanding of Confluent Kafka: thorough knowledge of Kafka concepts such as producers, consumers, topics, partitions, brokers, and replication mechanisms (see the producer sketch after this list).
- Programming language proficiency: primarily Java or Scala, with potential for Python depending on the project.
- System design and architecture: ability to design robust and scalable Kafka-based data pipelines, considering factors such as data throughput, fault tolerance, and latency.
- Data management skills: understanding of data serialization formats such as JSON, Avro, and Protobuf, and how to manage data schema evolution.
- Kafka Streams API (optional): knowledge of Kafka Streams for real-time data processing within the Kafka ecosystem.
- Monitoring and troubleshooting: familiarity with tools to monitor Kafka cluster health, identify performance bottlenecks, and troubleshoot issues.
- Cloud integration: experience deploying and managing Kafka on cloud platforms such as AWS, Azure, or GCP.
- Distributed systems concepts: understanding of concepts such as distributed consensus, leader election, and fault tolerance.
- Security best practices: knowledge of Kafka security features to implement authentication and authorization mechanisms.
- Communication and collaboration: ability to work effectively with other developers, data engineers, and stakeholders to design and implement Kafka solutions.
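
To make the first item above concrete, here is a minimal Java producer sketch. The broker address, topic name "demo-topic", and the key/value payload are illustrative assumptions, not details from this posting.

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    import java.util.Properties;

    public class DemoProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Placeholder broker address
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            // acks=all: acknowledge only after all in-sync replicas have the record
            props.put(ProducerConfig.ACKS_CONFIG, "all");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // The key ("order-42") determines the partition, preserving per-key ordering
                ProducerRecord<String, String> record =
                        new ProducerRecord<>("demo-topic", "order-42", "{\"status\":\"shipped\"}");
                producer.send(record, (metadata, exception) -> {
                    if (exception != null) {
                        exception.printStackTrace();
                    } else {
                        System.out.printf("Wrote to %s-%d@%d%n",
                                metadata.topic(), metadata.partition(), metadata.offset());
                    }
                });
            } // close() flushes any buffered records
        }
    }

The acks=all setting is the durability-leaning end of the throughput/latency trade-off mentioned in the system design item.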

Desirable Skills:

Other skills required:
- Strong experience with Kafka Connect / KSQL architecture and the associated clustering model.
- Hands-on experience with Kafka DB connectors for Oracle and MySQL.
- Strong fundamentals and experience in Kafka administration, configuration, and troubleshooting.
- Understanding of and experience with Kafka clustering and its fault-tolerance model supporting HA and DR.
- Experience developing KStreams pipelines and deploying KStreams clusters (see the Kafka Streams sketch after this list).
- Strong problem-solving skills and a passion for debugging complex issues and writing mature code.
- Experience using agile methodologies for software development.
- Experience developing KSQL queries and best practices for choosing KSQL vs. Streams.
- Familiarity with Confluent Control Center, or experience with a Kafka monitoring tool (UI).
- Ability to work in a fast-paced and dynamically changing environment.
- Ability to lead the effort and work with minimum supervision.

Duties & Responsibilities: as listed under Role Description above.
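
For the KStreams item above, a minimal Kafka Streams pipeline might look like the following. The topic names "orders" and "shipped-orders", the filter condition, and the broker address are illustrative assumptions.

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    import java.util.Properties;

    public class DemoStreams {
        public static void main(String[] args) {
            Properties props = new Properties();
            // application.id doubles as the consumer group id and state-store namespace
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "demo-streams-app");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            // Read "orders", keep shipped ones, normalize, and write to "shipped-orders"
            KStream<String, String> orders = builder.stream("orders");
            orders.filter((key, value) -> value != null && value.contains("\"status\":\"shipped\""))
                  .mapValues(value -> value.toUpperCase())
                  .to("shipped-orders");

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            // Shut the topology down cleanly when the JVM exits
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }

Because Kafka Streams runs inside the application JVM, deploying a KStreams "cluster" amounts to running several instances of this program with the same application.id; the brokers rebalance partitions across the instances automatically.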

Country: United States

Branch | City | Location: TCS - Phoenix, AZ | Tempe | Tempe, AZ

Thanks & Regards,

Deepak Rajput | Technical Recruiter

Contact No: 214-393-8746, +91 89828 12503

Email: [email protected]


