
Lead Data Engineer :: Remote, must support PST zone (Remote, USA)
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=2114412&uid=

Hi,

I hope you are well.

Please go through the requirements below and help me with your updated resume.

Lead Data Engineer

Location: Remote (must support PST time zone)

Overall Experience level:

12+ years in IT, with a minimum of 8 years of data engineering and analyst experience.

Must-have skills:

Spark, PySpark, Python, Kubernetes, Docker, SQL, GCP, Big Data experience

Optional skills:

Kubernetes, Hadoop, SQL

Domain experience (mandatory if applicable): Retail

Job Description:

Assembling large, complex data sets that meet functional and non-functional business requirements

Identifying, designing, and implementing internal process improvements, including redesigning infrastructure for greater scalability, optimizing data delivery, and automating manual processes

Building the infrastructure required for optimal extraction, transformation, and loading of data from various data sources using GCP/Azure and SQL technologies

Building analytical tools to utilize the data pipeline, providing actionable
insight into key business performance metrics including operational efficiency
and customer acquisition

Working with stakeholders, including the Executive, Product, Data, and Design teams, to support their data infrastructure needs and assist with data-related technical issues

Strong background in data warehouse design

Overseeing the integration of new technologies and initiatives into data
standards and structures

Strong knowledge of Spark, PySpark, SQL, and PL/SQL (procedures, functions, triggers, packages, and troubleshooting)

Experience with cloud platform (GCP/Azure) data migration: source/sink mapping, pipeline builds, workflow implementation, ETL, and data validation processing

Strong verbal and written communication skills to effectively share findings with stakeholders

Experience in data analytics, optimization, machine learning techniques, or Python is an added advantage

Good understanding of web-based application development tech stacks such as Java, AngularJS, and Node.js is a plus

Key Responsibilities

20% Requirements and design

60% coding and testing

10% reviewing code written by developers, analyzing and helping to solve problems

5% deployments and release planning

5% customer relations

You bring:

Bachelor's degree in Computer Science, Computer Engineering, or a software-related discipline. A master's degree in a related field is a plus

6+ years of experience in Data Warehouse and Hadoop/Big Data

3+ years of experience in strategic data planning, standards, procedures, and
governance

4+ years of hands-on experience in Python or Scala

4+ years of experience in writing and tuning SQL and Spark queries

3+ years of experience working as a member of an Agile team

Experience with Kubernetes and containers is a plus

Experience in understanding and managing Hadoop log files

Experience with Hadoop's multiple data processing engines, such as interactive SQL, real-time streaming, data science, and batch processing, to handle data stored in a single platform on YARN

Experience in data analysis, data cleaning (scrubbing), data validation and verification, data conversion, data migrations, and data mining

Experience in all phases of the data warehouse life cycle, including requirements analysis, design, coding, testing, deployment, and ETL flow

Experience in architecting, designing, installation, configuration and
management of Apache Hadoop Clusters

Experience in analyzing data in HDFS through MapReduce, Hive, and Pig

Experience building and optimizing big data pipelines, architectures, and data sets

Strong analytic skills related to working with unstructured datasets

Experience in Migrating Big Data Workloads

Experience with data pipeline and workflow management tools: Airflow

Experience with scripting languages: Python, Scala, etc.

Cloud Administration

For this role, we value:

The ability to adapt quickly to a fast-paced environment

Excellent written and oral communication skills

A critical thinker who challenges assumptions and seeks new ideas

Proactive sharing of accomplishments, knowledge, lessons, and updates across
the organization

Experience designing, building, testing and releasing software solutions in a
complex, large organization

Demonstrated functional and technical leadership

Demonstrated analytical and problem-solving skills (ability to identify, formulate,
and solve engineering problems)

--

Thanks and Regards,

S Siva Prasad | Technical Recruiter

HCL Global Systems, Inc

24543 Indoplex Circle, Suite 220,

Farmington Hills, MI 48335

Phone: 248-473-0720, Ext. 222

Email: [email protected] |
[email protected]

LinkedIn: https://www.linkedin.com/in/siva-prasad-somavarapu-2a4727228/

--

07:58 PM 27-Jan-25

