
Only USC & GC :: Data Engineer with AWS :: Locals to Chicago, IL (Face-to-face interview is required) at Chicago, Illinois, USA
Email: [email protected]
From:

Avinash Kumar,

ZealHire Inc

[email protected]

Reply to:   [email protected]

Position: Sr. Data Engineer with AWS Cloud

Location: Chicago, IL - Locals to Chicago (Face to Face interview is required)

Duration: Long-Term Contract

Visa Status: Only USC/GC

Desired Years of Experience: 8+ Years

Job Description:   

USC/GC

LinkedIn profile must be up to date with a photo; newly created LinkedIn profiles are not accepted

Visa and driver's license (DL) copies required

Local to Chicago only. Onsite interview required. Hybrid, 3 days per week onsite

Top 5 Skill Sets

1. DevOps

2. AWS Cloud

3. Terraform

4. Python

5. CI/CD pipelines

1. Blue-Green deployments

2. Kubernetes

3. Ansible Playbooks

OVERVIEW/SUMMARY

The Product Analytics team at United Airlines is on a transformational journey to unlock the full potential of enterprise data, build a dynamic, diverse, and inclusive culture, and develop a modern cloud-based data lake architecture to scale our applications and drive growth using data and machine learning.

Our objective is to enable the enterprise to unleash the potential of data through innovation and agile thinking, and to execute on an effective data strategy to transform business processes, rapidly accelerate time to market and enable insightful decision making.

JOB OVERVIEW AND RESPONSIBILITIES

In this role you will partner with various teams to define and execute data acquisition, storage, transformation, and processing, and to make data actionable for operational and analytics initiatives that create sustainable revenue and share growth.

This role requires expertise in United's data sources and technology, business intuition, and a working knowledge of data transformation and analytical tools.

Support large scale data pipelines in a distributed and scalable environment

Enable and optimize production AWS environment for data infrastructure and frameworks

Expertise in creating Terraform modules to automate deployments

Knowledge of Databricks and data lake technologies

Partner with development teams and other department leaders/stakeholders to provide cutting edge technical solutions that enable business capabilities

Participate in and lead the design and development of innovative batch and streaming data applications using AWS technologies

Provide the team with technical direction and the approach to be taken, and guide them in resolving queries and issues

AWS Certification

Knowledge: Python, Bash scripting, PySpark, AWS Services (Airflow, Glue, Lambda, others), Terraform, Databricks

Skills: thorough troubleshooter, hands-on AWS technology leader, people person, and the ability to see an undertaking through to completion

Abilities: solving problems under pressure and in constrained scenarios, leadership, and sound judgment

Must be fluent in English (written and spoken)

Coordinate and guide cross-functional projects that involve team members across all areas of the enterprise, vendors, external agencies, and partners

Ability to manage multiple short- and long-term deliverables in a busy, fast-paced environment while staying flexible to changing needs and priority levels

Manage agile development and delivery by collaborating with project manager, product owner and development leads

REQUIRED

Bachelor's degree in a quantitative field (statistics, software engineering, business analytics, information systems, aviation management, or a related degree)

5+ years of experience in data engineering or ETL development role

Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.

Strong analytical skills related to working with structured, semi-structured, and unstructured datasets.

Experience with BigQuery, SQL Server, etc.

Experience with AWS cloud services: Redshift, S3, Athena, etc.

Experience with SQL and various database interface tools: SSMS, Oracle SQL Developer, etc.

Passionate about solving problems through data and analytics, and creating data products including data models

Strong initiative to take ownership of data-focused projects, get involved in the details of validation and testing, as well as provide a business user perspective to their work

Ability to communicate complex quantitative concepts in a clear, precise, and actionable manner

Proven proficiency with Microsoft Excel and PowerPoint

Strong problem-solving skills, using data to tackle problems

Outstanding writing, communication, and presentation skills

PREFERRED

Master's degree

Experience with Quantum Metric and Akamai

Experience with languages: Python, R, etc.

Strong experience with continuous integration & delivery using Agile methodologies

Data engineering experience in the transportation/airline industry

Strong problem-solving skills
