
Immediate job opening: Data Engineer (Remote, USA; local to OR candidates needed)
From: Khursheed, RHG ([email protected])

Reply to: [email protected]

Hi Professional, 

Hope you're doing great! 

Data Engineer

Onsite from day one; the candidate must work from the office in Oregon.

Local to OR candidates only.

Responsibilities:

(The primary tasks, functions and deliverables of the role)

Design and build reusable components, frameworks and libraries at scale to support analytics products

Design and implement product features in collaboration with business and technology stakeholders

Identify and solve issues concerning data management to improve data quality

Clean, prepare and optimize data for ingestion and consumption

Collaborate on the implementation of new data management projects and the restructuring of the current data architecture

Implement automated workflows and routines using workflow scheduling tools

Build continuous integration, test-driven development and production deployment frameworks

Analyze and profile data for designing scalable solutions

Troubleshoot data issues and perform root cause analysis to proactively resolve product and operational issues

Requirements:

Experience:

Strong understanding of data structures and algorithms

Strong understanding of solution and technical design

A strong problem-solving and analytical mindset

Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders

Able to quickly pick up new programming languages, technologies, and frameworks

Experience building cloud-scalable, real-time, high-performance data lake solutions

Fair understanding of developing complex data solutions

Experience working on end-to-end solution design

Willing to learn new skills and technologies

Has a passion for data solutions

Required and Preferred Skill Sets:

Hands-on experience with AWS (EMR [Hive, PySpark], S3, Athena) or an equivalent cloud platform

Familiarity with Spark Structured Streaming

Working experience with the Hadoop stack, dealing with huge volumes of data in a scalable fashion

Hands-on experience with SQL, ETL, data transformation, and analytics functions

Hands-on Python experience, including batch scripting, data manipulation, and distributable packages

Experience with batch orchestration tools such as Apache Airflow or equivalent (Airflow preferred)

Experience with code versioning tools such as GitHub or Bitbucket; expert-level understanding of repo design and best practices

Familiarity with deployment automation tools such as Jenkins

Hands-on experience designing and building ETL pipelines; expertise in data ingestion, change data capture, and data quality; hands-on experience with API development

Experience designing and developing relational database objects; knowledge of logical and physical data modeling concepts; some experience with Snowflake

Familiarity with Tableau or Cognos use cases

Familiarity with Agile; working experience preferred

Best Regards, 

Khursheed

Fri Sep 08 21:57:00 UTC 2023



