100% Remote Senior NLP Engineer | Document Processing Expert | OCR Expert | LayoutLM at Remote, Remote, USA
Email: [email protected]
From: Akhilesh, DMS VISIONS [email protected]
Reply to: [email protected]

Hi,

Hope you are doing well.

Title: 100% Remote Senior NLP Engineer | Document Processing Expert | OCR Expert | LayoutLM
Duration: 6+ Months
Visa: Open
Must have: LayoutLM

About the job

Uses deep subject-matter/functional expertise, influence, and process skills to help internal/external customers and stakeholders identify and meet their high-priority needs while considering cultural and diversity implications. Jointly develops practical implementation plans with other accountable parties. Proactively develops and maintains technical knowledge in specialized areas, remaining up to date on current trends and best practices. Performs assessments and listens to internal clients to understand and anticipate their needs and determine their priorities in the context of the overall enterprise.

We are looking for:
- A self-directed person who can identify priorities.
- A detail-oriented person who takes pride in keeping data correct and always having a backup plan.
- A problem-solver who might write a script or find a tool to get things done when there isn't an established solution.
- Someone who loves Kafka! When you hear terms like "event-driven" or "real-time streaming," you're ready to dive in.

Accountabilities in this role
- Develop and implement solutions using Kafka.
- Administer and improve use of Kafka across the organization, including Kafka Producers, Kafka Consumers, Kafka Connect, ksqlDB, KStreams, and custom implementations.
- Work with multiple teams to ensure best use of Kafka and data-safe event streaming.
- Understand and apply event-driven architecture patterns and Kafka best practices, and enable development teams to do the same.
- Assist developers in choosing correct patterns, modeling events, and ensuring data integrity.
- Learn continuously to become a Confluent/Kafka subject matter expert.
- Work with Kafka APIs (e.g. metadata, metrics, admin) to provide proactive insights and automation.
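As context for the partitioning and per-key ordering work described above (an illustration added here, not part of the job description): Kafka's default partitioner maps a record key to a partition by taking a murmur2 hash of the key bytes, masking it to a non-negative value, and reducing it modulo the partition count. A minimal Python sketch follows; the constants mirror those in Kafka's `Utils.murmur2`, but treat exact hash values as an assumption and verify against your client library.

```python
def murmur2(data: bytes) -> int:
    """32-bit murmur2 hash, using the same constants as Kafka's default partitioner."""
    mask = 0xFFFFFFFF
    m = 0x5BD1E995
    r = 24
    h = (0x9747B28C ^ len(data)) & mask
    # Mix the input four bytes at a time (little-endian words).
    n4 = len(data) // 4 * 4
    for i in range(0, n4, 4):
        k = int.from_bytes(data[i:i + 4], "little")
        k = (k * m) & mask
        k ^= k >> r
        k = (k * m) & mask
        h = (h * m) & mask
        h ^= k
    # Fold in the trailing 1-3 bytes, if any.
    rem = len(data) - n4
    if rem == 3:
        h ^= data[n4 + 2] << 16
    if rem >= 2:
        h ^= data[n4 + 1] << 8
    if rem >= 1:
        h ^= data[n4]
        h = (h * m) & mask
    # Final avalanche.
    h ^= h >> 13
    h = (h * m) & mask
    h ^= h >> 15
    return h

def partition_for_key(key: bytes, num_partitions: int) -> int:
    """Map a record key to a partition: positive hash modulo partition count."""
    return (murmur2(key) & 0x7FFFFFFF) % num_partitions

# Records with the same key always land on the same partition,
# which is what preserves per-key ordering in a topic.
print(partition_for_key(b"order-1042", 12))
```

Because the mapping depends only on the key bytes and the partition count, adding partitions to a topic changes where new keys land, which is one reason event modeling and key design matter for data integrity.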
- Perform regular reviews of performance data to ensure efficiency and resiliency.
- Contribute regularly to event-driven patterns, best practices, and guidance.
- Review feature releases and change logs for Kafka and other related components to ensure best use of these systems across the organization.
- Work with the lead to ensure all teams are aware of technology changes and their impact.
- Acquire a deep understanding of source and sink connector technical details for a variety of platforms, including SAP, SAP HANA, Salesforce, PostgreSQL, MS SQL Server, and others as required.

Required Skills
- Knowledge of the primary components of Kafka and their function (topics, partitions, connectors, schema registry, KStreams, and ksqlDB).
- Microservices experience is a MUST.
- Knowledge of Spring Boot or equivalent.
- At least two years of experience supporting Kafka implementations in a production environment.
- Proficiency in at least one programming language and one scripting language; Java (Spring Boot) and Python are a must.

Desired Skills
- Proficiency with GKE and Docker or another container solution is a plus.
- Ability to participate in and contribute to code management in GitHub, including actively collaborating in peer reviews, feature branches, and resolving conflicts and commits.
- Strong sense of responsibility with a bias toward action; comfortable self-directing and prioritizing your own work.
- An understanding of any cloud infrastructure and components (GCP preferred).
- Experience with streaming data from SAP (especially S/4HANA) is a plus.
- Experience with Java coding is a plus.
- Experience with code coverage and quality tools is a plus.
- Experience working with CI/CD processes is a plus.

Thank you,
Akhilesh
Sat Sep 16 00:44:00 UTC 2023