
Kafka Lead - Chicago OR Rosemont, IL at Chicago, Illinois, USA
Email: [email protected]
From:

Gowri Shankar,

Teamware Solutions

[email protected]

Reply to: [email protected]

Posted On: Aug-09-2024
Role Name: Developer

Role Description:
- 8+ years of experience in enterprise middleware, microservices, and designing and implementing APIs.
- 5+ years of experience architecting, designing, implementing, and operating a streaming platform based on Confluent Kafka.
- 5+ years of experience with Confluent Platform or Confluent Cloud.
- 5+ years of Java and Spring Boot experience.
- Strong fundamentals in distributed systems design and operations, microservice architecture, and integration patterns.
- Deep understanding of different messaging paradigms (pub/sub, queuing), as well as delivery models, quality of service, and fault-tolerance architectures.
- Established track record with Kafka technology, with hands-on production experience and a deep understanding of Kafka architecture and internals, including the interplay of its architectural components: brokers, ZooKeeper, producers/consumers, Kafka Connect, and Kafka Streams.
- Practical experience scaling Kafka, Kafka Streams, and Kafka Connect infrastructures.
- Experience with Kafka Streams / KSQL architecture and the associated clustering model.
- Hands-on experience as a developer using the Kafka API to build producer and consumer applications, along with expertise implementing Kafka Streams components; has developed Kafka Streams pipelines and deployed Kafka Streams clusters (illustrative sketches of both follow the posting details below).
- Experience developing KSQL queries and knowledge of best practices for choosing KSQL vs. Kafka Streams.
- Strong knowledge of the Kafka Connect framework, with experience using several connector types: HTTP REST proxy, JMS, File, SFTP, and JDBC. Experience using source/sink connectors for RDBMS and NoSQL data stores. Hands-on experience designing, writing, and operationalizing new Kafka connectors using the framework.
- Strong familiarity with data formats such as XML, JSON, Avro, and CSV, along with serialization/deserialization options. Familiarity with the Schema Registry.
- Experience monitoring Kafka infrastructure and related components (connectors, Kafka Streams, and other producer/consumer applications). Familiarity with Confluent Control Center.
- Firm understanding of the SDLC (systems development life cycle).
- Excellent written and verbal communication skills. Excellent analytical and troubleshooting abilities.
- Prior experience in the banking / financial services industry and a firm understanding of the banking data landscape.
- 12+ years of experience in the software industry.
- Bachelor's degree (or equivalent) in a relevant technical field.
- Proven experience with messaging systems, API integrations, streaming platforms, and Kafka.

Goal of this engagement: Commercial Banking Enhanced Reporting and API productionization.

Objectives of this role: Hands-on technical role, responsible for the design, development, and deployment of an end-to-end implementation, including Confluent Kafka and integrations with source/target systems and enterprise middleware systems.

Responsibilities: Architecting and designing a ground-up build of a Confluent Kafka platform for the enterprise. Design, build, and configure Ka…

Competencies: Data Warehouse, Digital : Kafka
Experience (Years): 6-8
Essential Skills: Same as the Role Description above.
Desirable Skills: Same as the Role Description above.
Country: United States
Branch | City | Location: TCS - Chicago (Downtown), IL or Rosemont, IL
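To illustrate the kind of hands-on Kafka client work the posting asks for, here is a minimal Java sketch of a producer and a consumer built with the Apache Kafka client API. The broker address (localhost:9092), topic name (payments), group id (reporting-group), and record contents are placeholder assumptions for illustration only; they are not details from the posting.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class PaymentsClientSketch {

    static final String BOOTSTRAP = "localhost:9092"; // placeholder broker address
    static final String TOPIC = "payments";           // placeholder topic name

    public static void main(String[] args) {
        // Produce a single record, then read it back with a consumer.
        Properties producerProps = new Properties();
        producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, BOOTSTRAP);
        producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
            // Keying by account keeps records for one account on the same partition.
            producer.send(new ProducerRecord<>(TOPIC, "account-42", "{\"amount\": 125.00}"));
            producer.flush();
        }

        Properties consumerProps = new Properties();
        consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, BOOTSTRAP);
        consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "reporting-group"); // placeholder group id
        consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
            consumer.subscribe(List.of(TOPIC));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("%s -> %s%n", record.key(), record.value());
            }
        }
    }
}
```

In a Spring Boot service of the kind the posting mentions, this configuration would typically live in application properties and be wired through Spring Kafka's KafkaTemplate and @KafkaListener rather than constructed by hand.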

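Similarly, a minimal sketch of the kind of Kafka Streams pipeline the posting refers to: a KStreams topology that counts records per key and writes the running totals to an output topic. The application id, broker address, and topic names are again placeholder assumptions.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

public class PaymentCountsTopology {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "payment-counts");    // placeholder application id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker address
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Read the input topic, count records per key, and emit the running counts.
        KStream<String, String> payments = builder.stream("payments");
        KTable<String, Long> countsPerAccount = payments
                .groupByKey()
                .count();
        countsPerAccount.toStream()
                .to("payment-counts", Produced.with(Serdes.String(), Serdes.Long()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        streams.start();
    }
}
```

The same aggregation could also be expressed as a ksqlDB query (roughly SELECT key, COUNT(*) FROM a stream over the input topic, GROUP BY key, EMIT CHANGES), which is the kind of KSQL-vs-KStreams trade-off the posting asks candidates to understand.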



