
Data Engineer (Remote, USA)
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=1058861&uid=

From:

Vivek,

vyzeinc

[email protected]

Reply to: [email protected]

Data Engineer

Location: Remote

12-month contract position

As a Senior Data Engineer, you will be part of a cross-functional development team that is focused on creating a forecasting platform. Using the agile framework, you will build end-to-end pipelines based on rigorous engineering standards and coding practices to deliver data that is accessible and of the highest quality.

What you'll do

Design and develop highly scalable and extensible data pipelines that enable collection, storage, distribution, modeling, and analysis of large data sets from many channels.

The ideal candidate will have strong data warehousing and API integration experience, along with the ability to develop scalable data pipelines that make data management and analytics/reporting faster, more insightful, and more efficient.

Establish and follow data governance processes and guidelines to ensure data availability, usability, consistency, integrity, and security.

Collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.

Design, implement, and automate deployment of our distributed system for collecting and processing streaming events from multiple sources.

What you'll need

Education: a 4-year college degree or an equivalent combination of education and experience; an academic background in Computer Science, Mathematics, Statistics, or a related technical field is preferred.

5+ years of relevant work experience in analytics, data engineering, business intelligence, or a related field.

Skilled in object-oriented programming (Python in particular).

Strong experience with Python, PySpark, and SQL.

Strong experience with Databricks.

Experience with cloud-based databases, specifically Azure technologies (e.g., Azure Data Lake, ADF, Azure DevOps, and Azure Functions).

Experience writing and tuning SQL queries in a business environment with large-scale, complex datasets.

Experience with data warehouse technologies and with creating ETL and/or ELT jobs.

Experience with Kafka, Flink, Fivetran, and Matillion is nice to have.

04:33 AM 27-Jan-24



