Looking for an Apache Flink and IBM Streams engineer with 10+ years of experience (H1B accepted); local candidates only | Apache, Oklahoma, USA
Email: [email protected] |
From: Chaitanya, RuriSoft LLC [email protected]
Reply to: [email protected]

Role: Apache Flink and IBM Streams
Location: Tampa / Atlanta / Dallas / NJ (Hybrid). Local profiles only.

IBM Streams
- Understanding of the IBM Streams Processing Language (SPL)
- Hands-on experience with IBM Domain Manager, Streams Console, and Streams Studio to edit and understand the current pipelines
- Understanding of tuples, data streams, operators, processing elements (PEs), and jobs
- Hands-on experience creating streaming Dataflow jobs on GCP using custom templates
- Building Apache Beam based templates for deployment to GCP Dataflow
- Hands-on experience building high-volume pipelines on GCP and tuning them to reach maximum throughput (millions of records per minute)
- Able to create CI/CD pipelines that deploy to GCP Composer
- Knowledge of GCS, BigQuery, and Cloud Functions

Flink
- Per the latest discussion, deep Flink knowledge is not strictly required (more details to follow after discussion with the technical team)
- Hands-on experience building real-time event data consumption and transformation pipelines on GCP that consume data from Pub/Sub
- Hands-on experience creating streaming Dataflow jobs on GCP using custom templates
- Able to build Java-based data pipelines using Apache Beam (Python knowledge also required)
- Building Apache Beam based templates for deployment to GCP Dataflow
- Knowledge of Apache Flink is an added advantage
- Capturing metrics to measure pipeline throughput and peak capacity
- Able to create CI/CD pipelines that deploy to GCP Composer (Airflow)
- Knowledge of GCS, BigQuery, and Cloud Functions

Keywords: continuous integration, continuous deployment, information technology, New Jersey
[email protected] View all |
Posted: Tue Sep 26 20:30:00 UTC 2023