Fully Remote Kafka Developer (Remote, USA)
From: Sonali, KPG99 [email protected]
Reply to: [email protected]
Job Title: Fully Remote Kafka Developer
Visa: GC only
Duration: 6 months
Location: Remote

About the role:
- Jointly develops practical implementation plans with other accountable parties.
- Proactively develops and maintains technical knowledge in specialized area(s), staying up to date on current trends and best practices.
- Performs assessments and listens to internal clients to understand and anticipate their needs and determine their priorities in the context of the overall enterprise.
- Self-directed person who can identify priorities.
- Detail-oriented person who takes pride in keeping data correct and always having a backup plan.
- Problem-solver who will write a script or find a tool to get things done when there isn't an established solution.
- You love Kafka! When you hear terms like "event-driven" or "real-time streaming," you're ready to dive in.

Accountabilities in this role:
- Develop and implement solutions using Kafka.
- Administer and improve use of Kafka across the organization, including Kafka Producers, Kafka Consumers, Kafka Connect, ksqlDB, KStreams, and custom implementations.
- Work with multiple teams to ensure best use of Kafka and data-safe event streaming.
- Understand and apply event-driven architecture patterns and Kafka best practices, and enable development teams to do the same.
- Assist developers in choosing correct patterns and event modeling, and in ensuring data integrity.
- Learn continuously to become a Confluent/Kafka subject-matter expert.
- Work with Kafka APIs (e.g., metadata, metrics, admin) to provide proactive insights and automation.
- Perform regular reviews of performance data to ensure efficiency and resiliency.
- Contribute regularly to event-driven patterns, best practices, and guidance.
- Review feature releases and change logs for Kafka and other related components to ensure best use of these systems across the organization; work with the lead to ensure all teams are aware of technology changes and their impact.
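As an illustration of the "data-safe event streaming" this role is accountable for, a producer configured for durable, idempotent delivery might look like the following sketch. The broker address and specific values are placeholder assumptions for illustration, not details from the posting:

```properties
# Hypothetical producer settings for durable, idempotent delivery.
bootstrap.servers=localhost:9092        # placeholder broker address
acks=all                                # wait for all in-sync replicas to acknowledge
enable.idempotence=true                 # prevent duplicate writes on producer retries
retries=2147483647                      # retry transient failures rather than drop records
max.in.flight.requests.per.connection=5 # ordering remains safe with idempotence enabled
```

With `acks=all` and idempotence enabled, retried sends cannot create duplicates or reorder records, which is the usual baseline for "data-safe" Kafka pipelines.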
- Acquire a deep understanding of source and sink connector technical details for a variety of platforms, including SAP, SAP HANA, Salesforce, PostgreSQL, MS SQL Server, and others as required.

Requirements (these must be in the resume):
- Knowledge of the primary components of Kafka and their function (topics, partitions, connectors, schema registry, KStreams, and ksqlDB).
- At least two years of experience supporting Kafka implementations in a production environment.
- Proficiency in at least one programming language and one scripting language; Java (Spring Boot) and Python preferred.
- Proficiency with GKE and Docker or another container solution is a plus.
- Ability to participate in and contribute to code management in GitHub, including actively collaborating in peer reviews, working in feature branches, and resolving conflicts and commits.
- Excellent written and verbal communication skills.
- Strong sense of responsibility with a bias toward action.
- Comfortable self-directing and prioritizing your own work.
- Microservices experience is a plus.
- Knowledge of Spring Boot or an equivalent framework.
- An understanding of any cloud infrastructure and components (GCP preferred).
- Experience with streaming data from SAP (especially S/4HANA) is a plus.
- Experience with Java coding is a plus.
- Experience with code coverage and quality tools is a plus.
- Experience working with CI/CD processes is a plus.

Skills: Kafka, Java, Microservices, Spring Boot

Thanks & Regards,
Sonali Kumari
Technical Recruiter
KPG99, INC
Posted: Fri Sep 15 02:19:00 UTC 2023