
Requirement for Senior Big Data Engineer with Databricks. 100% Remote in MST Time Zone at Remote, Remote, USA
Email: [email protected]
From: SAPNA, ITECS ([email protected])

Reply to: [email protected]

Key Skills: Big Data, Databricks, Python, Google Pub/Sub, Kafka, Snowflake, MongoDB, ThoughtSpot

We worked on a similar requirement for Dollar General and submitted a few good candidates last week. Please check which of those candidates have the strongest Big Data experience; we can consider and submit them for this position as well.

We need to submit some very high-quality resumes for this requirement within the next 4 hours.

-----------------------------------------------------------------------------------------

Position: Senior Big Data Engineer with Databricks

Location: 100% Remote in AZ  (MST) Time Zone.

Description:

Purpose of the Position:

We are looking for a Senior Big Data Engineer to help design, build, and enable new use cases leveraging our Big Data Platform. 

In this role you will perform a broad range of data engineering activities, including the design, creation, and support of new bespoke data sets leveraging big data tools to provide new and innovative insights to our business. You will also assist in the creation of ingestion pipelines and the refinement of data ingested into our environment.

Responsibilities include working with customers to define the data assets that are needed, and aligning with peers across data engineering (the data ingestion and cloud platform teams) to ensure those data assets exist. You will also be a key partner to our Data Science team, enabling pipelines that allow model outputs to be consumed at large scale and in real time.

This is an extremely hands-on engineering role that values an engineering mindset and a belief in first-principles engineering. At times you may also be expected to lead teams of engineers based on the product backlog(s).

Key Responsibilities:

Minimum of 10 years of experience as a Senior Engineer on the Big Data - Data Engineering team. Design, develop, implement, and maintain code that enables data and analytics use cases across the enterprise.

Design and build complex applications with an emphasis on performance, scalability, and high reliability (DevOps model): you build it, you own it!

Code, test, and document new or modified data systems to create robust and scalable applications for data analytics.

Write concise code utilizing complex algorithms that can run at scale to build data-driven features.

Dive deep into performance, scalability, capacity, and reliability problems to resolve issues.

Develop models within distributed computing frameworks.

Take on research projects to improve data processing and implement machine learning frameworks.

Work and Technical Experience:

Must-Have Skills:

Big Data Proficiency: Experience with big data technologies: Databricks, Google Pub/Sub, Kafka, MongoDB.

Data Modelling: Strong understanding of data modelling principles and experience in designing efficient data models for ThoughtSpot or a similar tool.

SQL and Data Manipulation: Proficiency in SQL and data manipulation skills to prepare data for visualization.

Problem-Solving: Strong problem-solving skills and the ability to troubleshoot and resolve technical issues.

Communication: Excellent communication and collaboration skills to work effectively with cross-functional teams and clients.

Team Leadership: Experience in providing guidance and mentorship to junior developers.

Good-to-Have Skills:

Scripting Languages: Familiarity with scripting languages such as Python or JavaScript for customizations and automation.

ETL: Familiarity with ETL processes and tools for data extraction, transformation, and loading.

Cloud Platforms: Knowledge of cloud platforms like Databricks, Snowflake, and Google Cloud for ThoughtSpot deployments.

Data Warehousing: Understanding of data warehousing concepts and technologies.

Data Governance: Familiarity with data governance and data quality best practices.

Certifications: Relevant industry certifications.

Data Security: Knowledge of PII handling and data masking is desirable.

Agile: Working knowledge of Agile/SAFe.

Domain Expertise: Knowledge of retail domain KPIs.

Qualifications:

Bachelor's degree in Computer Science, Information Technology, or a related field.

10+ years of experience in Big Data technologies; must be able to articulate the use cases supported and the outcomes driven.

Large-scale data engineering and business intelligence delivery experience.

Design of large-scale, enterprise-level big data platforms supporting 100s or 1000s of concurrent users.

Experience working with and performing analysis using large data sets (10-1000 TB)

Qualities:

Excellent verbal and written communication

Collaboration skills to work in self-organized, cross-functional teams.

Strong troubleshooting and problem-solving abilities.

Excellent analytical, presentation, reporting, documentation, and interactive skills.

Thu Nov 09 13:19:00 UTC 2023
