
Need: Sr Data Engineer || Hybrid (Seattle, WA) || Only Locals
From: Vikas, DVG Tech Solutions
Email: [email protected]
Reply to: [email protected]

Hi,

Hope you are doing well!

Role: Sr Data Engineer

Location: Hybrid (Seattle, WA) - local candidates only

Length: 12+ Months

Type: C2C

Job description:

This position contributes to Client success by building enterprise data services for analytic solutions. The position is responsible for the design, development, testing, and support of data pipelines that enable continuous data processing for data exploration, data preparation, and real-time business analytics. Models and acts in accordance with Client guiding principles.

Top 3 Skills Needed:

1. Data Engineering - 5+ years
2. Databricks - 5+ years
3. PySpark - 5+ years

Years of Experience: 12+ years

Technology requirements:
Proficiency in Apache Spark, including Spark Core, Spark SQL and Spark Streaming.
Proficiency in languages such as Python for data processing and scripting.

Basic Qualifications / Experience:
Experience in designing and implementing ETL processes using Databricks notebooks for efficient data extraction, transformation, and loading.
In-depth knowledge of the Databricks Unified Analytics Platform and its features for collaborative big data analytics.
Understanding of data modeling concepts for designing database structures.
Proficiency in working with both relational databases and NoSQL databases.
Integration of data from diverse sources, including APIs, databases, streaming data, and external data feeds.
Implementation of processes to ensure data quality, including data validation, cleansing, and error handling.
Knowledge of cluster management, optimization, and scaling for efficient data processing.
Optimization of Spark jobs and Databricks clusters for better performance.
Proficiency in cloud platforms such as Azure for building scalable and flexible data architectures.
Use of tools like Apache Airflow, ADF, or Databricks to orchestrate and schedule data workflows.

Degree or certifications required:
Education (minimum education level, degree, or certification necessary): Bachelor's degree in computer science, management information systems, or a related discipline

Skills (minimum skills required):
5+ years architecting and designing large-scale, high-performance distributed systems
5-7+ years with SQL platforms
2+ years of exposure to NoSQL platforms (a plus)
5+ years with Hadoop, YARN, MapReduce, Pig or Hive, and Spark
2+ years of data platform implementation on Azure or AWS (a plus)

Key Responsibilities:
Responsibilities and essential job functions include but are not limited to the following:
Demonstrate deep knowledge and the ability to lead others on the data engineering team to build and support non-interactive (batch, distributed) and real-time, highly available data, data pipeline, and technology capabilities
Translate strategic requirements into business requirements to ensure solutions meet business needs
Work with infrastructure provisioning and configuration tools to develop scripts that automate deployment of physical and virtual environments, and to develop tools that monitor usage of virtual resources
Assist in the definition of architecture that ensures solutions are built within a consistent framework
Lead resolution activities for complex data issues
Define & implement data retention policies and procedures
Define & implement data governance policies and procedures
Identify improvements in team coding standards and help in implementation of the improvements
Leverage subject matter expertise to coordinate issue resolution efforts across peer support groups, technical support teams, and vendors
Develop and maintain documentation relating to all assigned systems and projects
Perform systems and applications performance characterization and trade-off studies through analysis and simulation
Perform root cause analysis to identify permanent resolutions to software or business process issues
Lead by example by demonstrating the Client mission and values

Nice-to-Haves:
Knowledge of data security best practices and the implementation of measures to ensure data privacy and compliance.
Implementation of monitoring and logging solutions to track the health and performance of pipelines.
Familiarity with monitoring platforms such as Datadog and New Relic.
Azure

Thanks and regards,

Vikas

Sr. US IT Recruiter


