
AWS Data Engineer || McLean VA || Need Locals || F2F mandatory at McLean, Virginia, USA
Email: [email protected]

From: Ram, Atlantis ([email protected])

Reply to: [email protected]

Role: AWS Data Engineer

Location: McLean, VA (Hybrid) - only local candidates from VA/MD/DC

Duration: 9+ months

Interview Mode: the 2nd round will be in-person at McLean, VA

Must Haves


Looking for a Senior Cloud/Data Engineer with expertise in Python, Spark, PySpark and SQL.

Need someone who is an expert in Snowflake (mainly administration).

Need someone with expertise in AWS Services such as Lambda, S3, EC2, CloudWatch, SSM, and EMR.

Candidates with previous mortgage/financial industry experience will be preferred.

Develop data filtering, transformation, and loading requirements

Define and execute ETL jobs using Apache Spark on Hadoop, among other data technologies

Determine appropriate translations and validations between source data and target databases

Implement business logic to cleanse & transform data

Design and implement appropriate error handling procedures

Develop project, documentation and storage standards in conjunction with data architects

Monitor performance, troubleshoot, and tune ETL processes as appropriate using tools in the AWS ecosystem.

Create and automate ETL mappings to consume loan-level data from source applications into target applications

Execute end-to-end implementation of the underlying data ingestion workflow.

Qualifications

At least 10 years of overall experience and 5+ years of experience developing in Python and SQL (Postgres/Snowflake preferred); strong SQL experience is preferred.

Bachelor's degree or equivalent work experience in computer science, data science, or a related field.

Experience working with different databases and an understanding of data concepts (including data warehousing, data lake patterns, and structured and unstructured data)

3+ years of experience implementing data storage/big data platforms, preferably with hands-on experience implementing and performance-tuning Hadoop/Spark.

Implementation and tuning experience specifically with Amazon Elastic MapReduce (EMR).

Experience implementing AWS services in a variety of distributed computing and enterprise environments.

Experience writing automated unit, integration, regression, performance and acceptance tests. 

Solid understanding of software design principles

Top Personal Competencies to possess

Seek and Embrace Change: Continuously improve work processes rather than accepting the status quo

Growth and Development: Know or learn what is needed to deliver results and compete successfully

Preferred Skills

Strong SQL skills with a strong understanding of Snowflake.

Understanding of Apache Hadoop and the Hadoop ecosystem. Experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, Zookeeper, HCatalog, Solr, Avro).

Deep knowledge of Extract, Transform, Load (ETL) and distributed processing techniques such as MapReduce

Experience with columnar databases such as Snowflake and Redshift

Experience in building and deploying applications in AWS (EC2, S3, Hive, Glue, EMR, RDS, ELB, Lambda, etc.)

Experience with building production web services

Experience with cloud computing and storage services

Knowledge of Mortgage industry

Ram

[email protected]


