
Data Ops Engineer || Need local candidates with LinkedIn and recent work experience in MN, with DL (Remote, USA)
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=1401935&uid=

From:

hanshika,

vyze

[email protected]

Reply to: [email protected]

Job Description -

Need local candidates with LinkedIn and recent work experience in MN.

Data Ops Engineer

Hybrid ( Arden Hills, MN)

Skype

Job Description:

The Data Ops Engineer understands that data is the foundation of any information-driven operating environment and is critical to the success of business outcomes. The Data Ops Engineer will work with the Data Architecture and Data Engineering teams to learn and assist in the design of solutions that deliver consistent, reliable, efficient, and sustainable data using best practices, standards, and processes. The result of these efforts is highly accurate and trusted data assets for use in the Enterprise Data Platform and across the organization. The Data Ops Engineer will also learn and assist in supporting the tools used to build the data platform for DevOps processes, and will work to enhance the platform and its utilities to drive efficiencies in existing data pipelines as well as in the process of engineering new ones.

Technical Platform Operations

Automate administration tasks using Python and SQL against APIs and CLIs to nurture and develop the data platform

Work closely with technical staff to learn and optimize their technical environment needs

Administer cloud native platforms including Snowflake, Databricks, Qlik, Event Hubs, Power BI and other Azure services

Build notification solutions, using methods such as pub/sub, to ensure cloud platforms are operational and alerts are sent when they are not

Install patches and upgrades

Support issue resolution and escalation

Facilitate reporting of business metrics for the platform (adoption, alignment to business goals, alignment to product team needs, etc.)

Build custom DevOps capabilities where tools or platforms do not have them built-in

Engage with software vendors to understand new capabilities of their platforms and tools

Establish platform-wide standards

Measure data pipeline alignment to standards

Measure uptime of platforms

Participate in MVP and PoC activities to understand new technical capabilities of the technology platform

Perform root cause analysis when there are issues

Product Development Operations

Ability to effectively communicate with technical and non-technical personnel

Build automation routines to support releases of incremental functionality to data applications owned and managed by a product

Implement data ingestion frameworks to support ingestion of varied data

Solve unforeseen technical and process problems

Document processes and onboard new data engineering resources

Measure and report consumption of cloud native platforms

Improve Interoperability

Communicate status of platform to technical product members

Build regression testing solutions to ensure the various teams are not implementing conflicting solutions in the platform

Ability to work closely with cross-functional IT stakeholders to build DevOps solutions

Ability to proactively communicate any potential risks cross-functionally

Required Experience/Education

College or vocational training in Computer Science, programming, or a similar technical area.

Or 1-2 years of work experience in data processing, programming languages (SQL, Python, SAS, R, etc.), infrastructure, DevOps, etc. This experience could be gained in school or in a work environment.

Required Competencies/Skills

Basic technical knowledge and understanding of data and processing

Ability to learn quickly from formal and informal training

Ability to identify potential technical issues/risks

Ability to identify potential business solutions

Ability to collaborate across all levels and areas of the data platform team

Ability to participate in PoC efforts to research emerging technologies

Ability to work closely with technical and business people, as well as perform hands-on implementation

Preferred Experience/Education

Post-high-school college or technical training

Preferred Competencies/Skills

Knowledge of modern data platform tools (Snowflake, Redshift/AWS, Big Query/Google, etc.)

Knowledge of Python, SQL, Linux, Docker, NPM (package installation) or similar tools

Strong communication skills

Interest in a technology focused career

Infrastructure as Code (IaC)

CI/CD

Desire to work in a fast-paced, changing environment

Keywords: business intelligence, R language, information technology, Minnesota
10:14 PM 16-May-24


To remove this job post, send "job_kill 1401935" as the subject from [email protected] to [email protected]. Do not write anything extra in the subject line, as this is an automatic system that will not work otherwise.

If pages are not loading, taking too long to load, timing out or unavailable, or for any other issues, please contact the admin at [email protected]

