Financial Domain experience :: Senior Kafka Developers & Kafka Lead Developers with Strong Java || Local Candidates only
Email: shubham@visionaryz.com
https://jobs.nvoids.com/job_details.jsp?id=1939787&uid=

Hi Everyone,

Please read the JD carefully; all the information is given in the body of this email. The priority is for Senior Kafka Developers and Kafka Lead Developers with strong Java.

Location: Hybrid for NYC, Charlotte, and Atlanta (3 days minimum in office)
Hours: 40-hour work week
Rate: $70/hour - $90/hour on C2C
Positions to close: 5 (3 Lead Developers and 2 Senior Developers)
Experience: Minimum 10+ years required

Initiative:
The successful candidate will be part of a team responsible for building a foundational data hub to support near real-time risk reporting, analytics, and business reporting. The hub will encompass the CIB systems of record and Reference Data to facilitate Regulatory Trade Reporting and Risk Reporting. Business units covered include Derivatives (interest rates, credit, commodities), Foreign Exchange, Equities, Fixed Income, Loans, and Investment Banking / Capital Markets.

Job Description: Senior Developer and Development Lead
We are seeking motivated and proactive Software Engineers with expertise in Java, Kafka / real-time messaging, Apache Flink, and MQ technologies to design, develop, and maintain a foundational data hub. This framework will enable seamless, event-driven communication and data streaming to support near real-time risk reporting, analytics, and business reporting.

Role Includes:
- Technical design, analysis, and support of the foundational data hub across CIB Technology.
- Development and customization of integration tools and solutions using Kafka, Flink, MQ Series, or other event-driven message transmission systems.
- Use of integration products to customize or generate solutions that facilitate seamless communication between systems, ensuring high reliability and performance.

Responsibilities:
- Develop and deploy real-time messaging applications using Kafka.
- Design / implement Kafka producers and consumers for high-throughput, low-latency data processing in a trading environment.
- Integrate Kafka with various trading platforms and financial systems.
- Troubleshoot Kafka-related issues and optimize performance for high-frequency trading scenarios.
- Leverage Apache Flink for real-time stream processing, including event-driven data transformations and aggregation.
- Collaborate with DevOps and SecOps on Kafka and Flink cluster deployment, monitoring, and maintenance.
- Stay updated on best practices for real-time messaging, Kafka, Apache Flink, and MQ technologies.
- Design, develop, and integrate Apache messaging frameworks to ensure high-performance and reliable messaging in critical systems.
- Work with MQ systems for message queuing, ensuring that data is reliably processed and communicated between distributed systems in real time.
- Assist in the migration and integration of Apache Kafka, Flink, MQ, and other messaging solutions as needed.

Qualifications:
- Bachelor's degree in Computer Science, Software Engineering, or a related field.
- Strong Java development experience with expertise in real-time messaging and Kafka.
- Experience integrating Kafka with trading systems and managing high-volume, low-latency data streams.
- Proficiency in Apache Flink for stream processing and real-time data analytics.
- Familiarity with event-driven architecture, distributed systems, and fault-tolerance principles.
- Proficiency with Apache messaging technologies (e.g., Apache ActiveMQ or Apache Kafka) and MQ systems (e.g., IBM MQ, TIBCO EMS).
- Experience with Docker, Kubernetes, and microservices architecture is a plus.
- Strong understanding of message queuing, reliability, and fault-tolerant systems.
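As context for the producer tuning called out above, the fragment below sketches what a low-latency Kafka producer configuration might look like for this kind of trading workload. Every value and the broker address are illustrative assumptions, not details from this posting.

```properties
# Hypothetical broker address -- replace with the actual cluster endpoints
bootstrap.servers=localhost:9092
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=org.apache.kafka.common.serialization.StringSerializer
# Send records immediately instead of batching (favors latency over throughput)
linger.ms=0
# Wait only for the partition leader's acknowledgment
acks=1
# lz4 trades a little CPU for smaller payloads on the wire
compression.type=lz4
# Cap memory spent buffering records not yet sent to the broker
buffer.memory=33554432
```

In practice these knobs are a latency/throughput/durability trade-off: a risk-reporting pipeline that tolerates slightly higher latency might instead raise linger.ms and use acks=all for stronger delivery guarantees.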
Regards,

Shubham Rawat
SME Talent Acquisition
Visionaryz Management Team
Posted: 06:43 PM, 18-Nov-24