Hiring - Data Engineer with Kafka Experience | Remote, USA
Email: [email protected]
From: Sri, VDart Inc [email protected]
Reply to: [email protected]

Position: Data Engineer with asset management and Kafka experience
Location: Dallas / Denver / Remote

Job description:
The candidate is expected to profile data, understand its characteristics, identify outliers and anomalies, and work with stakeholders to define cleansing and transformation rules. Based on the rules defined, build data pipelines (ETL/ELT) in Python.

Requirements:
Must have strong experience handling streaming data with Kafka. (Must have)
Must have strong experience and thorough knowledge of Spark, Python, Airflow, Snowflake or BigQuery, and Google Cloud in general.
Must possess good design skills for processing data in motion and data at rest, and for ETL/ELT frameworks (error and warning handling, audit and reconciliation, process controls).
Must have strong experience analyzing data (Spark, Spark SQL, and ANSI SQL) and building data pipelines (Python, Spark, Kafka).
Must have good knowledge of all aspects of data management; prior experience in asset management is mandatory.
Must have a good understanding of the asset management business domain.

Best Regards,
Srijayanth L
Technical Recruiter
VDart Inc
Direct Number: (470) 607 2475
Email: [email protected]
Posted: Fri Feb 17 13:25:00 UTC 2023