Hot C2C opening for Data Engineer - Remote role (Remote, USA)
Email: [email protected] |
From: Gobi, PiplNow LLC <[email protected]>
Reply to: [email protected]

Hi,

We have an urgent C2C opening for a Data Engineer (remote role). Our client is looking to fill this position immediately. If you are interested, please share your updated resume, the completed skill matrix, consultant details, and copies of your visa and driver's license ASAP.

Skill Matrix (please fill in years of experience for each):
- Overall experience
- Total years of work experience in the US
- As a Data Engineer
- Data Analysis
- Data Pipelines
- SQL
- Spark
- Hadoop
- Data Quality
- Kafka
- Analyzing Large Datasets
- Data Warehousing Concepts
- Relational databases (e.g., PostgreSQL, MySQL)
- Columnar databases (e.g., Redshift, BigQuery, HBase, ClickHouse)

Consultant Details (please fill in each item):
- Full Name
- Primary Phone
- Primary Email
- Education Details: Graduation
- Education Details: Masters
- Certifications, if any
- LinkedIn Profile
- US work authorization and expiration
- Passport Number
- Expected pay rate on C2C
- Current Company Name
- Current location (City/State)
- Willing to relocate (Yes/No)
- Availability to join new project / notice period
- Have you ever worked for or interviewed with this client in the past?
- If yes, as a consultant or as an employee?
- Last 5 digits of Social Security Number
- Birth month and day (NOT year)

Position: Data Engineer
Location: Remote

Responsibilities:
- Design, build, and maintain robust and efficient data pipelines that collect, process, and store data from various sources, including user interactions, listing details, and external data feeds.
- Develop data models that enable efficient analysis and manipulation of data for merchandising optimization.
- Ensure data quality, consistency, and accuracy.
- Build scalable data pipelines (Spark, SQL, and Scala) leveraging the Airflow scheduler/executor framework.
- Collaborate with cross-functional teams, including Data Scientists, Product Managers, and Software Engineers, to define data requirements and deliver data solutions that drive merchandising and sales improvements.
- Improve code and data quality by leveraging and contributing to internal tools that automatically detect and mitigate issues.

Requirements:
- 9+ years of relevant industry experience with a BS/Master's, or 2+ years with a PhD
- Experience with distributed processing technologies and frameworks such as Hadoop, Spark, and Kafka, and with distributed storage systems (e.g., HDFS, S3)
- Demonstrated ability to analyze large data sets to identify gaps and inconsistencies, provide data insights, and advance effective product solutions
- Expertise with ETL schedulers such as Apache Airflow, Luigi, Oozie, AWS Glue, or similar frameworks
- Solid understanding of data warehousing concepts and hands-on experience with relational databases (e.g., PostgreSQL, MySQL) and columnar databases (e.g., Redshift, BigQuery, HBase, ClickHouse)
- Excellent written and verbal communication skills
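The responsibilities above call for ensuring data quality, consistency, and accuracy within data pipelines. As a rough illustration of what such a validation step might look like inside a pipeline task, here is a minimal sketch in plain Python; the record fields (listing_id, price, event_time) and the rules are hypothetical examples, not taken from the client's actual schema:

```python
# Minimal sketch of a data-quality validation step for a pipeline task.
# Field names and rules are hypothetical examples, not the client's schema.

def validate_records(records, required_fields=("listing_id", "price", "event_time")):
    """Split records into (valid, rejected) using simple quality rules."""
    valid, rejected = [], []
    for rec in records:
        # Completeness: every required field must be present and non-null.
        missing = [f for f in required_fields if rec.get(f) is None]
        # Accuracy: price must be a non-negative number.
        bad_price = (not isinstance(rec.get("price"), (int, float))
                     or rec.get("price", 0) < 0)
        if missing or bad_price:
            rejected.append(rec)
        else:
            valid.append(rec)
    return valid, rejected

records = [
    {"listing_id": 1, "price": 19.99, "event_time": "2024-08-01T00:00:00Z"},
    {"listing_id": 2, "price": -5.0, "event_time": "2024-08-01T00:01:00Z"},   # bad price
    {"listing_id": 3, "price": None, "event_time": "2024-08-01T00:02:00Z"},   # missing price
]
valid, rejected = validate_records(records)
print(len(valid), len(rejected))  # → 1 2
```

In a production setting this kind of check would typically run as a task inside an Airflow-scheduled pipeline, with rejected records routed to a quarantine table for inspection rather than silently dropped.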
[email protected] View all |
Thu Aug 01 01:29:00 UTC 2024