Need Local to Chicago -- Data Engineer - Databricks & DevOps at Chicago, Illinois, USA
Email: [email protected]
$50/hr on C2C

1. Mid-level - 4+ years of data engineering experience within an enterprise-level environment

2. Databricks

3. Snowflake Cloud

4. Azure

5. Strong soft skills - looking to grow their career and a willingness to learn

Local to Chicago

Previous Financial Services Experience

Job Description

The Risk Data Analytics team (business side) is looking to expand and needs support with its workload as it migrates from Azure Databricks to Snowflake Cloud.

The ideal candidate will be customer-centric and deeply passionate about using modern cloud architectures and technology to drive substantial business impact across organizational boundaries. The Data Engineer will support the Risk Data Analytics team, specifically for Snowflake and Databricks, catering to the Business Analytics Service (BAS) team and other analytics teams that come on board. Responsibilities will encompass the administration, maintenance, and provisioning of the infrastructure elements within Northern Trust's Risk Data Analytics. A key focus of this role is Infrastructure as Code (IaC): the candidate will implement technical solutions to automate the provisioning of compute cluster capacity, manage compute and storage resources, monitor resource utilization, and assist in migrating production interactive notebooks to job-based compute.
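To give a concrete sense of the IaC work described above, the sketch below provisions an autoscaling Databricks cluster and a job that runs a notebook on ephemeral job compute (the notebook-to-job migration pattern) using the Databricks Terraform provider. This is a minimal illustration, not the team's actual configuration: the cluster name, job name, and notebook path are hypothetical, and real deployments would add access controls, policies, and tagging.

```hcl
terraform {
  required_providers {
    databricks = {
      source = "databricks/databricks"
    }
  }
}

# Pick a long-term-support Spark runtime and a small local-disk node type
data "databricks_spark_version" "lts" {
  long_term_support = true
}

data "databricks_node_type" "smallest" {
  local_disk = true
}

# Autoscaling interactive cluster with auto-termination to control cost
resource "databricks_cluster" "risk_analytics" {
  cluster_name            = "risk-data-analytics" # hypothetical name
  spark_version           = data.databricks_spark_version.lts.id
  node_type_id            = data.databricks_node_type.smallest.id
  autotermination_minutes = 30
  autoscale {
    min_workers = 1
    max_workers = 4
  }
}

# Scheduled job running a notebook on ephemeral job compute instead of
# an always-on interactive cluster
resource "databricks_job" "nightly_refresh" {
  name = "nightly-risk-refresh" # hypothetical name
  task {
    task_key = "run_notebook"
    notebook_task {
      notebook_path = "/Shared/risk/refresh" # hypothetical path
    }
    new_cluster {
      spark_version = data.databricks_spark_version.lts.id
      node_type_id  = data.databricks_node_type.smallest.id
      num_workers   = 2
    }
  }
}
```

Moving notebooks from interactive clusters to job clusters as shown here is a common cost and reliability lever, since job compute spins up per run and terminates when the run completes.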

Responsibilities:

Maintain and evolve AWB (Analytics Workbench) cloud-based platform capabilities (including the ETL platform that will be selected to ingest data)

Collaborate with multiple internal teams (e.g. Cloud Foundation, Info Sec, Network Teams) to manage and maintain Analytics Workbench (e.g. billing, subscription levels, workspaces, Azure account hosting, audit logs, and high-level usage monitoring of platform
users)

Research and prototype solutions for data connectivity and operations across various Microsoft Azure Platforms. Responsible for creating and maintaining materials covering data ingestion methods, troubleshooting, and addressing related queries.

Implement Databricks delegated account administration across business divisions and functions, ensuring effective management and support.

Act as the key Databricks workspace administrator, facilitating discussions, addressing partner needs, and providing support and planning for new implementations.

Develop and enhance the platform in line with vendor release cycles and internal requirements (e.g., data protection standards). Regularly review systems and propose improvements.

Collaborate proactively with software engineers and site reliability engineers to troubleshoot Databricks performance, connectivity, and security issues within the platform engineering organization and user community.

Explore, adopt, and apply new technologies to resolve issues while adhering to company security protocols and standards.

Required

Mid level / 4+ years of experience in developing and deploying large-scale data solutions in an enterprise and/or cloud environment. Hands-on staging and production experience in each of the following areas:

Microsoft Azure Databricks

ADLS

Data Ingestion and ETL (e.g. ADF/Azure Synapse)

Snowflake Cloud

Documentation and Troubleshooting

Understanding of the following areas:

Cloud architecture best practices around operational excellence, security, reliability, performance efficiency, and cost optimization (e.g. Cloud Well Architected Framework)

Best practices and IT operations for dynamic, always-up, always-available services

Bachelor's degree in Computer Science or related discipline.

Preferred Specific Skills:

Knowledge of data analytics technology and methodology, such as advanced analytics or machine learning

Experience with DevOps, DataOps, and/or MLOps using Azure DevOps (ADO)

Working knowledge of data security tools such as SecuPi and Azure native capabilities

Knowledge of IaC and automation tools such as Terraform and PowerShell

Working experience with Azure and AWS cloud data and service offerings

Additional Skills & Qualifications

With NT's RTO policy and this being a contract-to-hire role, candidates should ideally already be local to Chicago. Previous experience within Financial Services is a plus (but not a must-have).

Wed May 08 20:01:00 UTC 2024