
Kafka Architect (Remote, USA)
Email: [email protected]

From: Saeeda, Global IT Family ([email protected])

Reply to: [email protected]

Role: Kafka Architect

Duration: Long term

Location: Remote (Raleigh, NC)

We are looking for an Architect with 12+ years of experience in middleware and a data analytics background. He/she should have a minimum of 5 years of experience architecting and designing streaming solutions using Kafka, ZooKeeper, Confluent Control Center, and Confluent Platform Metadata Service (MDS), and should be an expert in upgrading and migrating Kafka infrastructure from on-premises to the AWS cloud. Hands-on experience is expected in upgrades, debugging errors and performance issues, capacity planning, roadmap definition, and leading proactive improvements.

Job description:

Seasoned messaging expert with an established track record with Kafka, including hands-on production experience and a deep understanding of Kafka architecture and internals, along with the interplay of its architectural components: brokers, ZooKeeper, producers/consumers, Kafka Connect, and Kafka Streams.

Knowledge of building out the Kafka ecosystem by creating a framework for leveraging technologies such as Kafka Connect, Kafka Streams/KSQL, Schema Registry, and other streaming-oriented technologies.

Strong fundamentals in Kafka administration, configuration, and troubleshooting.

Mandatory experience implementing Kafka Multi-Region Cluster (MRC) architecture.

Experience enabling observability on Kafka cluster through Datadog

Knowledge of Kafka clustering, and its fault-tolerance model supporting HA and DR

Practical experience with how to scale Kafka, Streams, and Connector infrastructures, with the motivation to build efficient platforms.

Best practices for optimizing the Kafka ecosystem based on use case and workload, e.g. how to effectively use topics, partitions, and consumer groups to provide optimal routing and support the required quality of service (QoS).
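
For illustration only, and not part of the requirements themselves, here is a minimal Java sketch of how partition count and consumer groups relate to parallelism and routing; the broker address, topic name, partition/replica counts, and group id are assumptions invented for the example:

```java
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.NewTopic;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class PartitioningSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical example values, not taken from the posting.
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");

        // A topic with 12 partitions caps effective parallelism at 12
        // consumers per consumer group; extra consumers in the group sit idle.
        try (AdminClient admin = AdminClient.create(props)) {
            admin.createTopics(List.of(new NewTopic("orders", 12, (short) 3))).all().get();
        }

        // All consumers sharing the same group.id split the 12 partitions
        // among themselves; a different group.id would re-read the full stream.
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "orders-fulfilment");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("orders"));
            ConsumerRecords<String, String> records = consumer.poll(java.time.Duration.ofSeconds(1));
            records.forEach(r -> System.out.printf("partition=%d key=%s%n", r.partition(), r.key()));
        }
    }
}
```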

Experience with Kafka Streams / KSQL architecture and associated clustering mode.
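
Again purely as an illustrative aside, a minimal Kafka Streams topology in Java showing the programming model referenced above; the application id, topic names, and word-count logic are invented for the sketch:

```java
import java.util.Arrays;
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

public class StreamsSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Instances that share this application.id form one Streams "cluster":
        // the input topic's partitions are balanced across them.
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-sketch");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        // Read lines, split into words, count per word, write counts back to Kafka.
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> lines = builder.stream("text-input");
        KTable<String, Long> counts = lines
                .flatMapValues(v -> Arrays.asList(v.toLowerCase().split("\\W+")))
                .groupBy((key, word) -> word)
                .count();
        counts.toStream().to("word-counts", Produced.with(Serdes.String(), Serdes.Long()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```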

Deep understanding of different messaging paradigms (pub/sub, queuing), as well as delivery models, quality-of-service, and fault-tolerance architectures

Experience analyzing production issues with authentication, consumer rebalancing, and latency variation, as well as any others that are encountered.

Experience with no-downtime Kafka infrastructure upgrades

Experience benchmarking existing and potential infrastructure options to provide a scale-out plan.

Strong knowledge of the Kafka Connect framework, with experience using several connector types (HTTP REST proxy, JMS, File, SFTP, JDBC, Splunk, Salesforce) and how to support wire-format translations. Knowledge of connectors available from Confluent and the community.

Hands-on experience in designing, writing, and operationalizing new Kafka Connectors using the framework.
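
As a hedged sketch of what writing a new connector against the Kafka Connect framework involves (the class names, config key, topic, and emitted record below are hypothetical, not from the posting):

```java
import java.util.Collections;
import java.util.List;
import java.util.Map;
import org.apache.kafka.common.config.ConfigDef;
import org.apache.kafka.connect.connector.Task;
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.source.SourceConnector;
import org.apache.kafka.connect.source.SourceRecord;
import org.apache.kafka.connect.source.SourceTask;

// Minimal source connector skeleton: the Connect runtime loads this class,
// calls taskConfigs() to fan work out to tasks, and handles offsets and delivery.
public class ExampleSourceConnector extends SourceConnector {
    private Map<String, String> config;

    @Override public void start(Map<String, String> props) { this.config = props; }
    @Override public Class<? extends Task> taskClass() { return ExampleSourceTask.class; }

    @Override public List<Map<String, String>> taskConfigs(int maxTasks) {
        // A real connector would split the source workload across up to maxTasks configs.
        return Collections.singletonList(config);
    }

    @Override public void stop() { }

    @Override public ConfigDef config() {
        return new ConfigDef().define("target.topic", ConfigDef.Type.STRING,
                ConfigDef.Importance.HIGH, "Topic to write source records to.");
    }

    @Override public String version() { return "0.1.0"; }

    // Minimal task: poll() is called repeatedly by the runtime, which also
    // tracks the source offsets returned with each record.
    public static class ExampleSourceTask extends SourceTask {
        private String topic;

        @Override public void start(Map<String, String> props) { this.topic = props.get("target.topic"); }

        @Override public List<SourceRecord> poll() throws InterruptedException {
            Thread.sleep(1000);
            SourceRecord record = new SourceRecord(
                    Collections.singletonMap("source", "example"),  // source partition
                    Collections.singletonMap("position", 0L),       // source offset
                    topic, Schema.STRING_SCHEMA, "hello from the sketch");
            return Collections.singletonList(record);
        }

        @Override public void stop() { }
        @Override public String version() { return "0.1.0"; }
    }
}
```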

Strong familiarity with wire formats such as XML, JSON, Avro, and CSV, along with serialization/deserialization options. Knowledge of messaging protocols and associated APIs.
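
For illustration, a minimal producer configured with explicit serializers, showing where the wire format is decided; the broker address, topic, and JSON payload are assumptions for the example, and an Avro deployment would substitute a schema-aware serializer plus Schema Registry configuration:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class SerializationSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        // The configured serializers determine the bytes on the wire: here plain
        // strings carrying a JSON payload; swapping the value serializer (e.g. to
        // an Avro serializer) changes the wire format without touching the topology.
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            String jsonPayload = "{\"orderId\": 42, \"status\": \"SHIPPED\"}";
            producer.send(new ProducerRecord<>("orders", "42", jsonPayload));
        }
    }
}
```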

Strong background in integration patterns.

Thanks & Regards,

Saeeda Shaikh

Global IT Family LLC

Email: [email protected]   

www.globalitfamily.com

Keywords: information technology North Carolina
Fri Feb 09 22:59:00 UTC 2024
