
Senior Data Engineer (BigQuery, Python, Airflow), Remote, W2 role, USA
Email: [email protected]
From: Shubhra Soni, Sonitalent ([email protected])

Reply to: [email protected]

Hi,

I hope you are doing well.

We are looking for a Senior Data Engineer. If you are interested, kindly send me your resume.

Job Role: Senior Data Engineer (BigQuery, Python, Airflow)

Duration: 6 months+

Location: Remote

Visa: USC, GC only

Note: I need the last 4 of the SSN # and the month and day of birth.


Please pay attention to the MUST HAVEs (this is a smaller team, so communication skills are key).

MUST HAVES:

7+ years of experience in Data Engineering, Analytics, and Data Modeling

Experience with BigQuery, Python, and Airflow; experience with ETL & ELT (a minimal pipeline sketch appears after this list)

Able to write SQL queries to perform common types of analysis and transformations

Experience building and deploying applications on the GCP and AWS cloud platforms

Docker container deployment experience

Worked at an Entertainment, Digital Media, Streaming, or Gaming company (Disney, FOX, Warner Brothers, NBC, ESPN, Sony, Universal, ABC, any TV network, HBO Max, NETFLIX, HULU, PLUTO TV, Discovery Channel, AMC) or a video game company (Activision, Electronic Arts, Riot Games, etc.) within the last 4 jobs
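Since BigQuery, Python, Airflow, and ETL/ELT are the core of this stack, here is a minimal, hedged sketch of what such a pipeline can look like: an Airflow DAG that runs a daily ELT MERGE in BigQuery. It assumes Airflow 2.4+ (TaskFlow API) and the google-cloud-bigquery client; the project, dataset, table, and column names are hypothetical placeholders, not the client's actual schema.

```python
# Illustrative only: a daily ELT step in BigQuery orchestrated by Airflow.
# Assumes Airflow 2.4+ (TaskFlow API) and the google-cloud-bigquery client library.
# The project, dataset, and table names are hypothetical placeholders.
import datetime

from airflow.decorators import dag, task
from airflow.operators.python import get_current_context


@dag(schedule="@daily", start_date=datetime.datetime(2024, 1, 1), catchup=False)
def daily_viewership_elt():

    @task
    def load_raw_to_curated():
        """Aggregate one day of raw playback events into a curated table."""
        from google.cloud import bigquery

        ds = get_current_context()["ds"]  # logical date, e.g. "2024-02-13"
        client = bigquery.Client(project="example-project")  # hypothetical project
        sql = """
            MERGE `example-project.curated.daily_viewership` AS t
            USING (
                SELECT user_id, DATE(event_ts) AS view_date,
                       SUM(watch_seconds) AS watch_seconds
                FROM `example-project.raw.playback_events`
                WHERE DATE(event_ts) = @ds
                GROUP BY user_id, view_date
            ) AS s
            ON t.user_id = s.user_id AND t.view_date = s.view_date
            WHEN MATCHED THEN UPDATE SET t.watch_seconds = s.watch_seconds
            WHEN NOT MATCHED THEN INSERT ROW
        """
        job_config = bigquery.QueryJobConfig(
            query_parameters=[
                bigquery.ScalarQueryParameter("ds", "DATE", datetime.date.fromisoformat(ds))
            ]
        )
        client.query(sql, job_config=job_config).result()  # block until the job finishes

    load_raw_to_curated()


daily_viewership_elt()
```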


Senior Data Engineer with GCP, Airflow, and Python experience. Media/streaming data experience strongly preferred.

The Senior Data Engineer should possess a deep sense of curiosity, a passion for building smart data pipelines, data structures, and data products, and the ability to communicate data structures and tools throughout the Paramount Streaming organization.

The candidate for this role will use their skills in reverse engineering, analytics, and creative experimentation to devise data and BI solutions. This engineer supports data pipeline development, including machine learning algorithms that use disparate data sources.

The ideal candidate will work closely with BI, Research, Engineering, Marketing, Finance, and Product teams to implement data-driven plans that drive the business.

They will have good communication skills and possess the ability to convey knowledge of data structures and tools throughout the Paramount Digital Media organization.

This candidate will be expected to lead a project from inception to completion as well as help mentor junior members of the team on best practices and approaches around data.

Your Day-to-Day:

Work with large volumes of traffic data and user behavior to build pipelines that enhance raw data.

Able to break down and communicate highly complex data problems into simple, feasible solutions.

Extract patterns from large datasets and transform data into an informational advantage.

Find answers to business questions through hands-on exploration of data sets using Jupyter, SQL, dashboards, statistical analysis, and data visualizations (see the exploration sketch after this list).

Partner with the internal product and business intelligence teams to determine the best approach around data ingestion, structure, and storage. Then, work with the team to ensure these are implemented correctly.

Contribute ideas on how to make our data more effective and work with other members of the engineering and BI teams and business units to implement changes.

Develop ongoing technical solutions while developing and maintaining documentation, at times training impacted teams.

Collaborate early with the team on internal initiatives to create strategies that improve company processes.

Look for ways to improve efficiency by staying current on the latest technologies and trends and introducing them to team members.

Develop prototypes to prove out strategies for data pipelines and products.

Mentor members of the team and department on best practices and approaches.

Lead initiatives to improve the quality and effectiveness of our data, working with other members of engineering, BI teams, and business units to implement changes.

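As a hedged illustration of the hands-on exploration described above, the sketch below pulls a day of (hypothetical) playback events into pandas, summarizes watch time and viewers per platform, and plots the result. The table and column names are assumptions for illustration only, not the team's actual schema.

```python
# Illustrative exploration sketch: aggregate raw playback events by platform.
# The table and column names (platform, watch_seconds, user_id) are hypothetical.
import matplotlib.pyplot as plt
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project
events = client.query(
    """
    SELECT platform, user_id, watch_seconds
    FROM `example-project.raw.playback_events`
    WHERE DATE(event_ts) = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
    """
).to_dataframe()

# Summarize engagement: total hours watched and distinct viewers per platform.
summary = (
    events.groupby("platform")
    .agg(hours_watched=("watch_seconds", lambda s: s.sum() / 3600),
         viewers=("user_id", "nunique"))
    .sort_values("hours_watched", ascending=False)
)
print(summary)

summary["hours_watched"].plot(kind="bar", title="Hours watched by platform")
plt.tight_layout()
plt.show()
```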

Qualifications:

What you bring to the team: You have

Bachelor's degree and 7+ years of work experience in Data Engineering and Analytics fields or consulting roles with a focus on digital analytics implementations.

3+ years of experience with large-scale data warehouse management systems such as BigQuery, with an advanced understanding of warehouse cost management and query optimization (a cost-estimation sketch appears after this list).

Proficient in Python.

Experience with Apache Airflow or equivalent tools for orchestration of pipelines.

Experience with Data Modeling of performant table structures.

Able to write SQL to perform common types of analysis and transformations.

Strong problem-solving and creative-thinking skills.

Demonstrated development of ongoing technical solutions while developing and maintaining documentation, at times training impacted teams.

Experience developing solutions to business requirements via hands-on discovery and exploration of data.

Exceptional written and verbal communication skills, including the ability to communicate technical concepts to non-technical audiences and to translate business requirements into data solutions.

Strong experience with ETL & ELT.

Experience building and deploying applications on the GCP and AWS cloud platforms.

Influences and applies data standards, policies, and procedures

Builds strong commitment within the team to support the appropriate team priorities

Stays current with new and evolving technologies via formal training and self-directed education
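On the warehouse cost-management and query-optimization point above, one common BigQuery practice is to dry-run a query to estimate the bytes it would scan before executing it, and to bound scans with partition filters. A minimal sketch, assuming hypothetical project and table names:

```python
# Illustrative BigQuery cost check: dry-run a query to estimate bytes scanned
# before running it. Project and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

sql = """
    SELECT title_id, COUNT(DISTINCT user_id) AS viewers
    FROM `example-project.curated.daily_viewership`
    WHERE view_date BETWEEN '2024-01-01' AND '2024-01-31'  -- date filter bounds the scan
    GROUP BY title_id
"""

# Dry run: BigQuery validates the query and reports the bytes it would process,
# without executing it or incurring query cost.
dry = client.query(sql, job_config=bigquery.QueryJobConfig(dry_run=True, use_query_cache=False))
print(f"Estimated scan: {dry.total_bytes_processed / 1e9:.2f} GB")

# Only execute if the estimated scan stays within an acceptable budget.
if dry.total_bytes_processed < 50 * 1e9:  # arbitrary 50 GB threshold for illustration
    for row in client.query(sql).result():
        print(row.title_id, row.viewers)
```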

You might also have (nice-to-haves):

Experience with Snowflake, Redshift and other AWS technologies.

Experience with Docker and container deployment.

Experience with marketing tools like Kochava, Braze, Branch, and Salesforce Marketing Cloud is a plus.

Experience with exploratory data analysis using tools like iPython Notebook, Pandas & matplotlib, etc.

Familiarity with Hadoop-ecosystem pipelines using Spark and Kafka (a brief streaming sketch appears after this list).

Familiar with Git.

Familiar with Adobe Analytics (Omniture) or Google Analytics.

Digital marketing strategy including site, video, social media, SEM, SEO, and display advertising.
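For the Spark/Kafka item above, here is a brief, hedged PySpark Structured Streaming sketch that counts playback events per platform from a hypothetical Kafka topic. The broker address, topic name, and event schema are assumptions, and the spark-sql-kafka connector package must be available on the Spark classpath.

```python
# Illustrative PySpark Structured Streaming job: count playback events per platform
# from a hypothetical Kafka topic. Requires the spark-sql-kafka-0-10 package.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StringType, StructField, StructType

spark = SparkSession.builder.appName("playback-event-counts").getOrCreate()

# Assumed JSON event schema; real payloads would be richer.
schema = StructType([
    StructField("user_id", StringType()),
    StructField("platform", StringType()),
    StructField("title_id", StringType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "playback-events")            # hypothetical topic
    .load()
)

events = raw.select(F.from_json(F.col("value").cast("string"), schema).alias("e")).select("e.*")

counts = events.groupBy("platform").count()

# Write the running counts to the console; a real pipeline would target BigQuery or GCS.
query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```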

Thanks & Regards

Shubhra Soni

Technical Recruiter | Sonitalent Corp.

Email: [email protected]

Desk: (859) 659-3432 ext. 211

Address: 5404 Merribrook Lane, Prospect, KY, USA
