JD - HADOOP || BIG DATA WITH KAFKA (INITIALLY REMOTE) at El Segundo, California, USA
Email: [email protected]
Title: Kafka Lead
Location: El Segundo, CA (initially remote OK)
Duration: 12+ months
Mandatory Skills: Confluent Kafka, Node.js

Job Description:
- Confluent Kafka platform installation, configuration, administration, and support.
- Troubleshoot the Kafka platform in multiple types of environments.
- Own problem isolation, resolution, and bug reporting.
- Design and implement streaming solutions on the Kafka platform.
- Integrate Kafka connectors with various data sources.

Desired Candidate Profile:
- Installation, configuration, and administration of Confluent Kafka in Hortonworks Data Platform.
- Installation and integration of Kafka connectors with various data sources.
- Experience with containerization tools such as Kubernetes, Docker, OpenShift, etc.
- Hands-on experience in Linux and/or Unix environments.
- Experience working in Agile environments.
- Strong understanding of network configuration, devices, protocols, speeds, and optimizations.
- Working knowledge of and experience with DevOps and automation tools.
- Ability to compile and install Linux applications from source.
- Experience with enterprise storage, databases, or high-end server solutions.
- Distributed file system experience.
- Certifications in Confluent Kafka and GCP technologies are a plus.

Thanks & Regards,
Ayush Sharma
IT Recruiter
HMG America LLC
Email: [email protected]
www.hmgamerica.com
Posted: Thu Jul 18 23:47:00 UTC 2024