Data Engineer (Remote, USA)
Email: [email protected]
From: vivek, vyzeinc [email protected]
Reply to: [email protected]

Data Engineer
Location: Remote
12-month contract position

As a Senior Data Engineer, you will join a cross-functional development team focused on building a forecasting platform. Working within an agile framework, you will build end-to-end pipelines to rigorous engineering standards and coding practices, delivering data that is accessible and of the highest quality.

What you'll do
- Design and develop highly scalable, extensible data pipelines that enable the collection, storage, distribution, modeling, and analysis of large data sets from many channels. The ideal candidate will have strong data warehousing and API integration experience and the ability to develop scalable data pipelines that make data management and analytics/reporting faster, more insightful, and more efficient.
- Establish and follow data governance processes and guidelines to ensure data availability, usability, consistency, integrity, and security.
- Collaborate with analytics and business teams to improve the data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.
- Design, implement, and automate deployment of our distributed system for collecting and processing streaming events from multiple sources.

What you'll need
- Education: 4-year college degree or equivalent combination of education and experience. An academic background in Computer Science, Mathematics, Statistics, or a related technical field is preferred.
- 5+ years of relevant work experience in analytics, data engineering, business intelligence, or a related field.
- Skilled in object-oriented programming (Python in particular).
- Strong experience with Python, PySpark, and SQL.
- Strong experience with Databricks.
- Experience with cloud-based databases, specifically Azure technologies (e.g., Azure Data Lake, ADF, Azure DevOps, and Azure Functions).
- Experience writing and tuning SQL queries in a business environment with large-scale, complex datasets.
- Experience with data warehouse technologies and creating ETL and/or ELT jobs.
- Experience with Kafka, Flink, Fivetran, and Matillion is nice to have.
[email protected] View all |
Sat Jan 27 04:33:00 UTC 2024