
Looking for an Apache Kafka Technical Architect in Richmond, VA || Day 1 Onsite at Richmond, Virginia, USA
Email: [email protected]
Need candidates in the EST time zone

Please mention your visa status and current location

Title: Apache Kafka Technical Architect

Location: 2035 Maywill Street, Suite 100, Richmond, Virginia 23230

Terms: Contract

Job Details:

Required Skills:

Kafka Cluster Management: Design, deploy, and manage Apache Kafka clusters, ensuring high availability, scalability, and fault tolerance.

Data Streaming Architecture: Develop and maintain real-time data streaming solutions using Kafka, Kafka Streams, and related technologies.

Performance Optimization: Monitor and optimize Kafka clusters for performance, including tuning brokers, topics, partitions, and configurations.

Security and Compliance: Implement and manage Kafka security measures, including encryption, authentication, and authorization, to ensure data integrity and compliance with industry standards.

Integration: Work closely with application developers, data engineers, and DevOps teams to integrate Kafka with other systems and services.

Monitoring and Alerts: Use tools such as Prometheus, Grafana, and Kafka Manager to set up monitoring, logging, and alerting for Kafka clusters.

Troubleshooting and Support: Diagnose and promptly resolve issues related to Kafka performance, connectivity, and data processing.

Documentation: Create and maintain detailed documentation for Kafka configurations, processes, and best practices.

Innovation and Improvement: Stay up-to-date with the latest developments in Kafka and related technologies, proposing improvements and new solutions as appropriate.

Proven experience with distributed systems, data streaming, and event-driven architectures.

Experience with Kafka Streams, KSQL, and other Kafka ecosystem tools.

Hands-on experience with cloud platforms (AWS, Azure, Google Cloud) is a plus.

Technical Skills:

Strong proficiency in Apache Kafka, including broker setup, topic management, and partitioning strategies.

Knowledge of data serialization formats such as Avro, Protobuf, and JSON.

Experience with Linux/Unix systems and scripting (Bash, Python, etc.).

Familiarity with DevOps practices and tools like Docker, Kubernetes, CI/CD pipelines, and Terraform.

Experience with monitoring tools (Prometheus, Grafana) and logging tools (Elasticsearch, Logstash, Kibana).

Regards,

Srijan Roy

Cynet Systems

--
