Urgent Requirement: Kafka Architect, LA, CA (initial remote OK) - Contract Position with HCL (Remote, USA)
Email: [email protected]
From: Prince Kumar, IDC Technologies Inc [email protected]
Reply to: [email protected]

Hello,

Hope you are doing well. Please find the job description below and let me know your interest ASAP.

Client: HCL (end-client details can be shared at the time of submission)
Job Title: Kafka Architect
Work Location: LA, CA (initial remote OK)

JOB DESCRIPTION

Responsibilities:
- Provide integration architecture support to design enterprise architecture (server, storage, networks, etc.) for selected solutions or applications (e.g., Snowflake, Data Lake, Salesforce) for reliability, scalability, and performance.
- The Integration Architect identifies, publishes, and communicates strategic technology standards, frameworks, principles, and roadmaps to be used throughout the IT organization to guide technology decisions and leverage efficiency opportunities.
- Drives design for a secure, efficient, and adaptable future-state model.
- Ensures that new projects and migrations align with the IT capital budget, project portfolio, and IT/enterprise strategic goals.
- Researches and recommends opportunities to adopt new technologies and analyzes the impact of implementing them in the IT infrastructure.
- Proactively researches IT architecture best practices and methodologies and determines their relevance to the business organization.
- Participates in the life-cycle planning of existing IT assets.
- Assists Technical Architects and Project Managers in matching technology services to specific business service and application development projects to ensure consistent use throughout the enterprise.
- Identifies and leverages opportunities across IT departments to ensure a consistent and efficient infrastructure framework.
- Educates and guides engineers on the vision and use cases of specific solutions within the Architecture portfolio.
- Develops and participates in the governance of Architecture principles and frameworks to ensure compliance with the strategy and that exceptions are well justified and documented through a formal waiver process. Analyzes the impact of exceptions and their effect on future IT and enterprise goals.
- Uses business requirements to identify, evaluate, and present alternative design solutions that meet customer needs.

Skills/Knowledge Required:
- Advanced understanding of, and exposure to, overall API-management concepts.
- Proficient in API Management features and the multiple topologies used to implement API runtimes.
- Experience with CI/CD tools such as Azure DevOps, Git, Jenkins, and Jira, and with automated testing of APIs.
- Proficient in API security practices; should know how to configure API security (OAuth, JWT, 2-way SSL, etc.) on Kafka (a minimal client configuration sketch follows this job description).
- Contribute to our evolving DevOps practice for hosting and managing our microservices in the cloud; must have a deep understanding of Istio, Kubernetes, and Docker architecture and the associated tools.
- Experience in Kafka capacity planning, installation, administration/platform management, and deep knowledge of Kafka internals.
- Experience with Kafka clusters, partitions, security, disaster recovery, data pipelines, data replication, and performance optimization.
- Experience with the Confluent and Apache Kafka frameworks, KSQL (Kafka SQL), Kafka Connect, and the Kafka Streams APIs.
- Experience with Kafka producer/consumer microservices concepts and Kafka's distributed architecture.
- Experience with Confluent Replicator configuration to perform replication between clusters in a multi-region environment; knowledge of ZooKeeper.
- Certifications in Confluent Kafka and cloud technologies.
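For context on the Kafka security item above (2-way SSL, i.e., mutual TLS, on a client), below is a minimal sketch of a producer configured with the standard Apache Kafka Java client. The broker address, topic name, keystore/truststore paths, and passwords are placeholders for illustration only and are not taken from this requirement.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class SecureProducerExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder broker list; replace with the cluster's TLS listener.
        props.put("bootstrap.servers", "broker1.example.com:9093");
        // Mutual TLS (2-way SSL): truststore holds the broker CA, keystore holds the client certificate.
        props.put("security.protocol", "SSL");
        props.put("ssl.truststore.location", "/etc/kafka/secrets/client.truststore.jks");
        props.put("ssl.truststore.password", "changeit");
        props.put("ssl.keystore.location", "/etc/kafka/secrets/client.keystore.jks");
        props.put("ssl.keystore.password", "changeit");
        props.put("ssl.key.password", "changeit");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Send one record to a hypothetical topic and block until the broker acknowledges it.
            producer.send(new ProducerRecord<>("example-topic", "key", "value")).get();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
```

OAuth/JWT-based authentication would instead use the SASL_SSL protocol with the OAUTHBEARER mechanism; the exact settings depend on the identity provider and cluster setup.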
If you are interested, kindly reply with your updated resume, a copy of your work authorization, and an ID proof, along with the information below, so we can proceed further.

Candidate full legal name (as per visa copy):
Current location (city, state or ZIP):
DOB (MM/DD):
Mobile number:
Alternate number:
E-mail ID:
Skype ID:
Master's degree (university name, location, and year of completion):
Bachelor's degree (university name, location, and year of completion):
Work authorization, with validity:
Available time slots for interview:
Total years of experience:
Available to start the project:
Relocation to client location:
Does the candidate have prior experience with HCL? (Yes/No; if yes, please mention the client/duration):
Passport number (not required for GC holders and citizens):
LinkedIn ID:
Employer details (company name, POC, phone number, email ID):

Please fill in the skill matrix below:

Required Skills | Candidate Self Rating (Scale 1-10) | Work Experience (Years)
Kafka | |
Snowflake, Data Lake, Salesforce with CI/CD tools (Azure DevOps, Git, Jenkins, Jira) and automated testing | |
Kafka clusters, partitions, security, disaster recovery, data pipelines, data replication, performance optimization | |
Kafka producer/consumer microservices concepts and Kafka distributed architecture | |
Confluent/Apache Kafka framework, KSQL, Kafka Connect, and the Kafka Streams APIs | |
Confluent Replicator configuration for replication between clusters in a multi-region environment; ZooKeeper | |

Thanks & Regards,

Prince Kumar
IDC Technologies Inc
Direct: 408-429-2715 | Secondary: 408-877-5897
920 Hillview Court, Suite 250, Milpitas, CA 95035
[email protected] | www.idctechnologies.com

EMPOWERING TECHNOLOGIES SERVICES
IT Services | Remote Services | IT Consulting | Staffing Solutions | BPO
Fri Sep 15 04:53:00 UTC 2023