
Data Engineer with AWS, Azure, and GCP Certifications @ Remote || 10+ years, Remote, USA
Email: [email protected]
Hello,

Please share Data Engineer profiles with AWS, Azure, and GCP experience and a notice period of one week or less. Certification in a hyperscaler is a big plus. If any consultant holds AWS, Azure, and GCP certifications, we will place them directly.

Job Summary:

Job Title:
Data Engineer

Location: Remote

Department: IT / Insurance Technology

As a Data Engineer, your primary responsibilities will include:

Reviewing and understanding all required data sources, both structured and unstructured.

Developing and testing data pipelines for data ingestion, acquisition, and storage.

Automating ingestion pipelines and orchestrating them with standard scheduling tools.

Deploying data engineering jobs on cloud platforms and ensuring proficiency with major hyperscalers (AWS, Azure, or GCP).

Collaborating with source systems to build metadata and create source-to-target mappings.

Working with data storage and warehouse teams to ensure optimal performance and tuning of data engineering jobs.

Required Education & Experience:

Bachelor's degree or higher in fields such as Finance, Economics, Mathematics, Computer Science, Statistics, Process and Mechanical Engineering, Operations Research, Data Science, Accounting, Business Administration, or related areas.

5+ years of relevant work experience.

Required Soft Skills:

Strong ownership and accountability in delivering high-quality work while effectively managing priorities and deadlines.

Ability to recommend and implement improvements to processes.

Excellent written and verbal communication skills, including the ability to create and deliver presentations.

Ability to communicate concisely, tailoring messages to the topic, audience, and competing priorities.

Strong analytical thinking and the ability to ask probing questions to drive clarity and make informed decisions.

Technical Skills:

Expert knowledge of major hyperscaler data engineering tools (e.g., AWS Glue, Spark, Azure Data Factory, Informatica, Talend).

Preferred experience with Databricks, Snowflake, and related tools in the data ecosystem.

Experience in building data pipelines optimized for large datasets, including integration, storage, cleansing, and transformation.

Familiarity with a wide range of data storage solutions and an understanding of their efficient utilization.

Ability to translate data requirements into technical designs, understanding the differences between various storage solutions.

Experience with the Software Development Lifecycle (SDLC).

Thanks & Regards

Daud Khan

Aroha Technologies

5000 Hopyard Rd, Suite 415

Pleasanton, CA 94568

Services round the World, around the Clock

Posted: Fri Nov 08 00:47:00 UTC 2024
