
AWS Data Engineer - Remote (Remote, USA)

From: Sana, Vrddhi Solutions, LLC
Email: [email protected]
Reply to: [email protected]

Position: AWS Data Engineer

Duration: 12 months+

Location: Remote

Visa: No OPT/CPT

Experience: 7+ years

What's most important:

Tech skills:

1) Integration skills with Databricks

2) Hands-on experience with Spark and AWS

3) Database skills: aggregates, joins, etc. (see the sketch after this list)
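
As a rough illustration of the database skills called out in item 3, here is a minimal PySpark sketch of a join plus aggregation, the kind of query that would run in a Databricks notebook. The table and column names (raw.orders, raw.customers, customer_id, and so on) are hypothetical and not taken from this posting.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("join-aggregate-sketch").getOrCreate()

# Hypothetical source tables; names are illustrative only.
orders = spark.table("raw.orders")
customers = spark.table("raw.customers")

# Join orders to customers, then aggregate spend per customer segment.
summary = (
    orders.join(customers, on="customer_id", how="inner")
          .groupBy("segment")
          .agg(
              F.count("order_id").alias("order_count"),
              F.sum("order_total").alias("total_spend"),
          )
)

summary.write.mode("overwrite").saveAsTable("curated.spend_by_segment")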

Nice to haves:

Fivetran (using this at the bank)

Experience with Talend or Informatica would work too.

Strong experience with Databricks is very important.
AWS is preferred, but strong Azure experience is good too. There are a lot of similarities between the two for this process, so they can quickly pick up the AWS pieces they need if they're strong enough technically.
Technical expertise is most important. They need to be an SME and be able to help guide the technical part of the project.
Communication is very important. Body language was also emphasized for the video interview.

Job Description

Position Summary:

This position is responsible for being part of a growing team that supports a data lake repository and related ETL processes using Fivetran, Databricks, and other technologies. In addition to the creation of data lake systems, this job is also responsible for data ETL, data curation, and data quality check processes. This position supports the Data Engineering team and works in an Agile environment.

Duties:

The ideal candidate will have a solid background in database and data lake development using technologies such as Fivetran, Databricks, Spark, AWS data lake, and traditional technologies like ETL & MS SQL Server. The position will be part of a team building an end-to-end data lake and data pipelines on AWS using Agile project management methodologies.
Design and implement end-to-end Data Lake solutions using the technologies listed above, as well as:

Data integrity process using Databricks and related tools/processes (a brief sketch follows this list)
Integrating Data Lake with 3rd party application APIs for downstream queries
Access design and integrations for Data Lake users and applications
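
The data integrity item above is the kind of check that is often scripted directly in PySpark on Databricks. The sketch below is a hypothetical example (table and column names such as landing.transactions and transaction_id are assumptions, not details from this posting): it verifies that the key column has no nulls or duplicates before promoting the data to a curated table.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("data-quality-sketch").getOrCreate()

# Hypothetical landing-zone table; the name is illustrative only.
df = spark.table("landing.transactions")

# Basic integrity checks: the key column must be non-null and unique.
null_count = df.filter(F.col("transaction_id").isNull()).count()
dup_count = df.count() - df.dropDuplicates(["transaction_id"]).count()

if null_count == 0 and dup_count == 0:
    # Promote to the curated layer only when the checks pass.
    df.write.mode("overwrite").saveAsTable("curated.transactions")
else:
    raise ValueError(
        f"Data quality check failed: {null_count} null keys, {dup_count} duplicates"
    )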

Experience with AWS and related data technologies and concepts in the following areas is required (a brief sketch of how a few of these fit together follows the list):

Python
Athena
Glue/Crawlers
Lake Formation
ETL
Spark (PySpark a plus)
Workflows & automation
Triggers
Lambda Functions
IAM data lake security concepts/permissions
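
To show how a few of these pieces commonly fit together (Lambda functions, triggers, Glue crawlers, and automation), here is a minimal, hypothetical sketch of a Lambda handler that starts a Glue crawler whenever a new object lands in a raw S3 zone; the crawler then refreshes the catalog tables that Athena queries. The crawler name and event wiring are assumptions, not requirements taken from this posting.

import boto3

glue = boto3.client("glue")

# Hypothetical crawler name; in practice this would come from configuration.
CRAWLER_NAME = "raw-zone-crawler"

def lambda_handler(event, context):
    """Triggered by an S3 object-created event; refreshes the Glue Data Catalog."""
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        print(f"New object s3://{bucket}/{key}; starting crawler {CRAWLER_NAME}")
    glue.start_crawler(Name=CRAWLER_NAME)
    return {"status": "crawler started"}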

Data Onboarding: define onboarding procedures and work with business stakeholders to onboard new data sources.

Requirements:

5+ years of database and data management experience.
Fivetran, Databricks, AWS or other related data certifications are highly preferred.
Understanding of cloud technologies such as IaaS and SaaS.
Experience working in an Agile environment.
4-year college degree in information technology or equivalent experience.

Knowledge, Skills, Abilities and Behaviors:

Outstanding communication (verbal, written, visualization and listening) skills.
Self-starter who can work independently as well as in a team setting.
Hands on technologist with ability to help drive the strategy and mentor others.
Giving and receiving effective feedback across all interactions. 
Ability to address conflict with peers and others in the organization.
Interest in understanding customer perspective to aid in development of the right solution.
Interest in understanding business needs to aid in developing solutions that are right for the broader organization.   

Posted: Thu Jan 05 00:49:00 UTC 2023
