
Databricks + AWS combination (Remote) at Remote, USA
Email: [email protected]
From: Rajiv, testingxperts ([email protected])

Reply to: [email protected]

Only candidates with combined Databricks + AWS experience will be considered; Azure Databricks experience alone does not qualify.

Databricks Data Engineer

Remote (EST time zone only)

Long term

8+ years of experience in architecture, design, and implementation using Databricks

Experience in designing and implementing scalable, fault-tolerant systems

Deep understanding of one or more big data computing technologies, such as Databricks or Snowflake

Demonstrated experience with the deployment of Databricks on cloud platforms, including advanced configurations

In-depth knowledge of Spark internals, the Catalyst optimizer, and the Databricks runtime environment

Must have experience in implementing solutions using Databricks

Experience in Insurance (P&C) is a plus

Programming Languages: SQL, Python

Technologies: Databricks, Delta Lake storage, Spark (PySpark, Spark SQL)

Good to have: Airflow, Splunk, Kubernetes, Power BI, Git, Azure DevOps

Project Management using Agile, Scrum

B.S. degree in a data-centric field such as Mathematics, Economics, Computer Science, Engineering, Information Systems, Information Processing, or another science field

Excellent communication & leadership skills, with the ability to lead and motivate team members

Ability to work independently with some level of ambiguity and juggle multiple demands

Responsibilities:

Lead and/or assist in designing and developing data systems, tailoring solutions to meet client-specific requirements

Design and implement Databricks-based solutions with a focus on distributed data processing, data partitioning, and optimization for parallelism

Engage with clients to evaluate their current and future needs, crafting bespoke solution architectures and providing strategic recommendations

Develop comprehensive architecture solution roadmaps integrating client business processes and technologies

Define and enforce coding standards for ETL processes, ensuring maintainability, reusability, and adherence to best practices

Architect and implement CI/CD pipelines for Databricks notebooks and jobs, ensuring testing, versioning, and deployment

Develop disaster recovery strategies for Databricks environments, ensuring data resilience and minimal downtime in case of failure

Innovate and expand solution offerings to address data challenges

Advise stakeholders on data cloud platform architecture optimization, focusing on performance

Apply Scrum and Agile methodologies to coordinate global delivery teams, run scrum ceremonies, manage backlog items, and handle escalations

Integrate data across different systems and platforms

Use strong verbal and written communication skills to manage client discussions
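The distributed-processing responsibility above (data partitioning and optimization for parallelism) can be illustrated with a minimal sketch. This is plain Python, not the Spark API; the `partition_by_key` helper and the sample policy records are hypothetical, showing only the hash-partitioning idea that Spark applies during shuffles so that parallel workers receive balanced, key-consistent shards:

```python
from collections import defaultdict

def partition_by_key(records, key_fn, num_partitions):
    """Hash-partition records into num_partitions buckets.

    Illustrative sketch of the partitioning idea behind Spark shuffles:
    records sharing a key always land in the same bucket, so a downstream
    per-key aggregation can run on each bucket independently, in parallel.
    """
    partitions = defaultdict(list)
    for rec in records:
        # hash(key) % num_partitions deterministically maps a key to a bucket
        partitions[hash(key_fn(rec)) % num_partitions].append(rec)
    return [partitions[i] for i in range(num_partitions)]

# Hypothetical P&C-style sample data: policies keyed by state
policies = [{"policy_id": i, "state": s}
            for i, s in enumerate(["NY", "PA", "NJ", "NY", "CT", "PA"])]
shards = partition_by_key(policies, lambda r: r["state"], 3)

# No records are lost, and every record with the same state shares a shard
assert sum(len(s) for s in shards) == len(policies)
```

In Spark itself the analogous operations are `DataFrame.repartition(n, col)` or writing a Delta table partitioned by a column; the sketch only demonstrates why a good partition key spreads work evenly across executors.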

Posted: Fri Sep 27 23:29:00 UTC 2024
