Big Data Engineer - REMOTE ROLE - 10 YEARS EXPERIENCE REQUIRED at Remote, USA
Email: [email protected]
From: Niki, Parin Technologies [email protected]
Reply to: [email protected]
Title: Big Data Engineer
Location: North Dakota (Remote)
Duration: 7 Months
Interview: Webcam Only

Job Description
The Big Data Engineer is a vital member of a collaborative team, responsible for designing, engineering, maintaining, testing, evaluating, and implementing big data infrastructure, tools, projects, and solutions for the client, working with the latest database technologies to get results from vast amounts of data quickly. This role involves working closely with the team to leverage cutting-edge database technologies for the swift retrieval of results from large datasets. The engineer will select and integrate big data frameworks and tools to meet specific needs and manage the entire lifecycle of large datasets to extract valuable insights.

Key Responsibilities:
- Design and implement scalable big data solutions tailored to the client's needs.
- Maintain and enhance existing big data infrastructure to meet unique requirements.
- Test and evaluate new big data technologies and frameworks for compatibility with NDUS systems and goals.
- Collect, store, process, manage, analyze, and visualize large datasets to derive actionable insights.
- Collaborate with team members to integrate big data solutions with existing systems.
- Ensure data integrity and security across all platforms used within the client's environment.
- Develop and optimize data pipelines for ETL/ELT processes specific to the client's data needs.
- Document technical solutions and maintain comprehensive records in line with standards and protocols.
- Stay current with the latest trends and advancements in big data technology relevant to strategic initiatives.

Required Qualifications:
- Thorough understanding of cloud computing technologies, including IaaS, PaaS, and SaaS implementations.
- Skilled in exploratory data analysis (EDA) to support ETL/ELT processes.
- Proficiency with Microsoft cloud products, including Azure and Fabric.
- Experience with tools such as Data Factory and Databricks.
- Ability to script in multiple languages, with a strong emphasis on Python and SQL.

Preferred Qualifications:
- Experience with data visualization tools.
- Proficiency with Excel and Power BI.
- Familiarity with Delta Lake.
- Knowledge of Lakehouse Medallion Architecture.

Keywords: business intelligence
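To illustrate the kind of Python, Databricks, Delta Lake, and Medallion Architecture work named in the qualifications above, the following is a minimal sketch of a bronze-to-silver promotion step, assuming a Databricks environment with Delta Lake; the table and column names are hypothetical and not part of the posting.

# Minimal sketch: promote raw "bronze" records to a cleaned "silver" Delta table.
# Assumes Databricks with Delta Lake; bronze.orders_raw, silver.orders, order_id,
# and order_date are hypothetical names used only for illustration.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze layer: raw ingested records, stored as-is.
bronze = spark.read.table("bronze.orders_raw")

# Silver layer: deduplicated, typed, and quality-checked records.
silver = (
    bronze
    .dropDuplicates(["order_id"])                       # remove duplicate ingests
    .withColumn("order_date", F.to_date("order_date"))  # normalize the date type
    .filter(F.col("order_id").isNotNull())              # basic data-quality rule
)

# Write the conformed data back as a Delta table for downstream analysis.
silver.write.format("delta").mode("overwrite").saveAsTable("silver.orders")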
Sat Nov 09 02:40:00 UTC 2024