Data Engineer - 12 months - Local to or near Los Angeles

Any visa (no OPT/CPT)

Hybrid, with 3 days in the Fox LA or NYC office

Client: 20th Century.

Top Skills

1. Must have extensive experience building real-time data processing solutions.

2. Experience building highly optimized data pipelines and data models for big data using Python and PySpark.

3. Advanced knowledge of building data warehouses/data platforms on Redshift/Snowflake (cloud-based data warehousing services).

4. Experience working with Apache Airflow (fully automated workflow scheduling).

Secondary Skills - Nice to Haves

Data

ETL

Job Description

The Fox Technology Data Products team is seeking an experienced Sr. Data Engineer with a passion for building robust, scalable, efficient, and high-quality data engineering solutions to join our Engineering team. If you enjoy designing and building innovative data engineering solutions using the latest tech stack in a fast-paced environment, this role is for you.

Primary Job Duties and Responsibilities

Collaborate with and across Agile teams to design and develop data engineering solutions that rapidly deliver value to our customers.

Build distributed, low-latency, reliable data pipelines that ensure high availability and timely delivery of data.

Design and develop highly optimized data engineering solutions for big data workloads to efficiently handle continuous increases in data volume and complexity.

Build high-performing real-time data ingestion solutions for streaming workloads.

Adhere to best practices and agreed-upon design patterns across all data engineering solutions.

Ensure the code is elegantly designed, efficiently coded, and effectively tuned for performance

Focus on data quality and consistency; implement processes and systems to monitor data quality, ensuring production data is always accurate and available for the key stakeholders and business processes that depend on it.

Create design documentation (data flow diagrams, technical design specs, source-to-target mapping documents) and test documentation (unit/integration tests).

Perform the data analysis required to troubleshoot data-related issues and assist in their resolution.

Focus on end-to-end automation of data engineering pipelines and data validations (audit and balance controls) without any manual intervention.

Focus on data security and privacy by implementing proper access controls, key management, and encryption techniques.

Take a proactive approach to learning new technologies: stay on top of tech trends, experiment with new tools and technologies, and educate other team members.

Collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.

Communicate clearly and effectively to technical and non-technical leadership.

Additional Skills & Qualifications

WHAT YOU WILL NEED

Education: Bachelor's degree in Computer Science, Computer Engineering, or a relevant field

Work Experience: 7+ years of experience architecting, designing, and building data engineering solutions and data platforms

Experience in building Data Warehouses/Data Platforms on Redshift/Snowflake

Extensive experience building real-time data processing solutions.

Extensive experience building highly optimized data pipelines and data models for big data processing.

Experience working with data acquisition and transformation tools such as Fivetran and dbt

Experience building highly optimized & efficient data engineering pipelines using Python, PySpark, Snowpark

Experience working with distributed data processing frameworks such as Apache Hadoop, Apache Spark, or Apache Flink

Experience with real-time stream processing using Apache Kafka, Kinesis, or Flink

Experience working with various AWS Services (S3, EC2, EMR, Lambda, RDS, DynamoDB, Redshift, Glue Catalog)

Expertise in Advanced SQL programming and SQL Performance Tuning

Experience with version control tools such as GitHub or Bitbucket.

Expert-level understanding of dimensional modeling techniques

Excellent communication, adaptability, and collaboration skills

Excellent analytical skills, strong attention to detail with emphasis on accuracy, consistency, and quality

Strong logical and problem-solving skills with critical thinking
