Data Engineer with Flink - Remote, USA
Email: [email protected]
From: Sunil, ICS ([email protected])
Reply to: [email protected]

Role: Data Engineer with Flink
Location: Addison, Dallas (Hybrid Remote)
Duration: Long Term

Job Description:

Who are we looking for?
A Flink developer with excellent hands-on development experience in Java, Kafka, and Apache Flink, along with good interpersonal skills, capable of working on highly critical transformation projects involving high-throughput, real-time streaming pipelines that are available 24x7. The client is looking for Apache Flink specifically as the primary skill set for this role; candidates need to demonstrate sufficient work experience with Flink to take on the workload of the client initiative.

About the Role - Your responsibilities:
- Develop Flink data processing applications that handle streaming data with high throughput.
- Act as a senior developer who can guide the Flink development team and help them implement custom solutions with Flink.
- Develop applications with good usability and scalability principles that read from various sources and write to various sinks.
- Integrate other technologies with Flink, e.g. Kafka, MongoDB, etc.
- Collaborate with the team to design, develop, test, and refine deliverables that meet the objectives.
- Provide design and architectural solutions to business problems.
- Conduct frequent brainstorming sessions, motivate the team, and drive innovation.
- Experience in messaging and data processing, preferably with Flink on any cloud platform (Azure, GCP, AWS).

Mandatory Skills:
- 10+ years of Java development with expertise in transforming data.
- 5+ years of experience consuming streaming data from Kafka with Flink.
- 5+ years of experience building high-throughput pipelines using Flink.
- Hands-on experience with Continuous Integration & Deployment (CI/CD).
- Product and design knowledge: experience with large enterprise-scale integrations (preferably in design/development of customer-facing large enterprise applications).
- Experience in digital banking, e-commerce, or other complex customer-facing applications.
- Excellent business communication skills.
- Seasoned Java developer familiar with all aspects of the SDLC.

Desired Skills:
- Experience with Apache Flink (Stream, Batch, and Table APIs).
- Experience with Apache Spark (Structured Streaming, batch processing).
- Experience working with document databases, preferably MongoDB.
- Experience working with Kafka and Flink together.
- Working experience with Agile methodologies.
- Knowledge of a cloud platform, preferably Azure.
- Experience working across one or more geographic territories or regions.
Posted: Thu Feb 01 20:32:00 UTC 2024