
Kafka Architect - Raleigh, NC - Hybrid (EST time zone only) at Raleigh, North Carolina, USA
Email: [email protected]
From:

John,

Vysystems

[email protected]

Reply to:   [email protected]

Job description:

Seasoned messaging expert with an established track record in Kafka technology, with hands-on production experience and a deep understanding of Kafka's architecture and internals, along with the interplay of its architectural components: brokers, ZooKeeper, producers/consumers, Kafka Connect, and Kafka Streams

Knowledge of building a Kafka ecosystem by creating a framework for leveraging technologies such as Kafka Connect, KStreams/KSQL, Schema Registry, and other streaming-oriented technologies.

Strong fundamentals in Kafka administration, configuration, and troubleshooting

Mandatory experience implementing Kafka Multi-Region Cluster (MRC) architecture.

Experience enabling observability on a Kafka cluster through Datadog

Knowledge of Kafka clustering, and its fault-tolerance model supporting HA and DR

Practical experience with how to scale Kafka, Streams, and Connector infrastructures, with the motivation to build efficient platforms.

Best practices to optimize the Kafka ecosystem based on use case and workload, e.g. how to effectively use topics, partitions, and consumer groups to provide optimal routing and support of QoS.
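
To make the topic/partition/consumer-group point concrete, here is a toy sketch (not Kafka's real murmur2 partitioner) of why keyed records always land on the same partition, which is what preserves per-key ordering for a consumer group:

```python
# Illustrative sketch only -- NOT Kafka's actual murmur2 partitioner.
# Routing every record with the same key to the same partition is what
# gives consumers per-key ordering guarantees.

def assign_partition(key, num_partitions):
    """Route a record to a partition by hashing its key (toy hash)."""
    if key is None:
        # Real clients use a sticky/round-robin strategy for unkeyed
        # records; pinning to partition 0 keeps this sketch simple.
        return 0
    return sum(key) % num_partitions  # stand-in for murmur2(key) % N

# Two records for the same customer always route to the same partition:
p1 = assign_partition(b"customer-42", 6)
p2 = assign_partition(b"customer-42", 6)
```

Adding partitions changes the `key -> partition` mapping, which is one reason partition counts should be sized up front per use case.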

Experience with Kafka Streams / KSQL architecture and associated clustering mode.

Deep understanding of different messaging paradigms (pub/sub, queuing), as well as delivery models, quality-of-service, and fault-tolerance architectures
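
The paradigms above can be sketched in one picture: within a single consumer group, partitions are split among members (queue-style work sharing), while each distinct group independently reads the full stream (pub/sub fan-out). A simplified round-robin assignment, purely for illustration:

```python
# Sketch: how Kafka unifies queuing and pub/sub via consumer groups.
# Simplified round-robin assignment; real brokers use pluggable
# assignors (range, round-robin, sticky, cooperative-sticky).

def assign(partitions, members):
    """Round-robin a topic's partitions across one group's members."""
    return {
        member: [p for i, p in enumerate(partitions) if i % len(members) == m]
        for m, member in enumerate(members)
    }

# 6 partitions shared by a 2-member group: each member owns 3.
ownership = assign(list(range(6)), ["consumer-a", "consumer-b"])
```

A second group running the same `assign` over the same partitions gets its own full copy of the stream, which is the pub/sub half of the model.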

Experience analyzing production issues with authentication, consumer rebalancing, and latency variation, as well as any others that are encountered.

Experience with no-downtime Kafka infrastructure upgrades

Experience benchmarking existing and potential infrastructure options to provide a scale-out plan.

Strong knowledge of the Kafka Connect framework, with experience using several connector types (HTTP REST proxy, JMS, File, SFTP, JDBC, Splunk, Salesforce) and supporting wire-format translations. Knowledge of connectors available from Confluent and the community

Hands-on experience in designing, writing, and operationalizing new Kafka Connectors using the framework.
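
As a hedged illustration of the framework's declarative side, a connector is typically described by a JSON config submitted to the Connect REST API. Every name, URL, and connection detail below is a placeholder; the connector class shown is Confluent's JDBC source connector:

```python
import json

# Hypothetical JDBC source connector config -- all names, URLs, and
# table/column details are placeholders for illustration only.
connector = {
    "name": "jdbc-orders-source",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:postgresql://db.example:5432/orders",
        "mode": "incrementing",            # poll only rows with a new id
        "incrementing.column.name": "id",
        "topic.prefix": "pg-",             # topics are named pg-<table>
        "tasks.max": "2",                  # parallelism ceiling
    },
}

# Typically submitted with: POST http://<connect-host>:8083/connectors
payload = json.dumps(connector)
```

Writing a new connector means implementing the framework's Connector/Task Java interfaces; the config above is how operators then deploy and tune it.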

Strong familiarity with wire formats such as XML, JSON, Avro, and CSV, along with serialization/deserialization options. Knowledge of messaging protocols and associated APIs
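
A minimal serializer/deserializer pair for the JSON wire format, as a sketch of the producer/consumer serde contract (Avro, usually paired with Schema Registry, would substitute a compact binary encoding plus a schema id for this self-describing text form):

```python
import json

def serialize(record: dict) -> bytes:
    """Encode a record as UTF-8 JSON bytes for the topic."""
    return json.dumps(record).encode("utf-8")

def deserialize(payload: bytes) -> dict:
    """Decode consumer-side bytes back into a record."""
    return json.loads(payload.decode("utf-8"))

# The serde contract: what the producer serializes, the consumer
# must deserialize back unchanged.
event = {"order_id": 42, "status": "paid"}
round_tripped = deserialize(serialize(event))
```

Kafka itself treats keys and values as opaque bytes; the serde choice is entirely a client-side (and schema-governance) decision.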

Strong background in integration patterns

Keywords: information technology
[email protected]
View all
Sat Feb 10 01:10:00 UTC 2024


