
Senior GCP Data Engineer, Bentonville, AR (non-locals are also fine); need candidates with 12+ years of experience. Bentonville, Arkansas, USA
Email: [email protected]
Processing description:
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=2068824&uid=

Senior Data Engineer

Location: This role is based out of Bentonville, AR, and the candidate is expected to be in the office from day 1

Onsite Requirement

Mandatory Areas

Must Have Skills

Overall Experience level:

8+ years in IT, with a minimum of 6+ years of data engineering and analyst experience.

6+ years of Hadoop/Big Data experience

4+ years of hands-on experience in Scala

4+ years of experience writing and tuning SQL and Spark queries

Hudi, Scala, GCP, Python

UST Global is looking for a highly energetic and collaborative Senior Data Engineer with experience leading enterprise data projects around business and IT operations. The ideal candidate is an expert at leading projects that develop and test data pipelines, drive data analytics efforts, proactively identify and resolve issues, and build alerting mechanisms using traditional, new, and emerging technologies. Excellent written and verbal communication skills and the ability to liaise with everyone from technologists to executives are key to success in this role.

As a Senior Data Engineer at UST Global, your responsibilities will include:

Assembling large, complex data sets that meet functional and non-functional business requirements

Identifying, designing and implementing internal process improvements including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes

Building required infrastructure for optimal extraction, transformation and loading of data from various data sources using GCP/Azure and SQL technologies

Building analytical tools to utilize the data pipeline, providing actionable insight into key business performance metrics including operational efficiency and customer acquisition

Working with stakeholders, including the Executive, Product, Data, and Design teams, to support their data infrastructure needs and assist with data-related technical issues

Strong background in data warehouse design

Overseeing the integration of new technologies and initiatives into data standards and structures

Strong knowledge of Scala, Spark, PySpark, Python, and SQL

Experience in cloud platform (GCP/Azure) data migration: source/sink mapping, pipeline builds, workflow implementation, ETL, and data validation processing

Strong verbal and written communication skills to effectively share findings with stakeholders

Experience in data analytics, optimization, and machine learning techniques is an added advantage

Understanding of web-based application development tech stacks such as Java, React.js, and Node.js is a plus

Key Responsibilities

20% requirements and design

60% coding and testing

10% reviewing code written by developers, analyzing and helping to solve problems

10% deployments and release planning

You bring:

Bachelor's degree in Computer Science, Computer Engineering, or a software-related discipline. A Master's degree in a related field is a plus.

6+ years of experience in Data Warehouse and Hadoop/Big Data

3+ years of experience in strategic data planning, standards, procedures, and governance

4+ years of hands-on experience in Scala

4+ years of experience writing and tuning SQL and Spark queries

3+ years of experience working as a member of an Agile team

Experience with Kubernetes and containers is a plus

Experience in understanding and managing Hadoop log files

Experience with Hadoop's multiple data processing engines, such as interactive SQL, real-time streaming, data science, and batch processing, handling data stored in a single platform on YARN

Experience in Data Analysis, Data Cleaning (Scrubbing), Data Validation and Verification, Data Conversion, Data Migration, and Data Mining

Experience in all phases of the data warehouse life cycle: requirement analysis, design, coding, testing, deployment, and ETL flow

Experience in architecting, designing, installing, configuring, and managing Apache Hadoop clusters

Experience analyzing data in HDFS through MapReduce, Hive, and Pig is a plus

Experience building and optimizing big data pipelines, architectures, and data sets

Strong analytic skills related to working with unstructured datasets

Experience in Migrating Big Data Workloads

Experience with data pipeline and workflow management tools such as Airflow

Cloud Administration

Thanks & Regards,

Arundeep Rayabandi

Senior Talent Acquisition Specialist

American IT Systems

1116 S Walton Blvd, Suite 113 Bentonville, AR 72712

https://americanitsystems.com/

[email protected]

linkedin.com/in/arundeep-r-713a421b8

Phone: (479) 265-8511, Ext. 106

Posted: 02:21 AM, 10-Jan-25

