Hot C2C opening for AWS Data Engineer, Day 1 Onsite Role at Newark, NJ
Email: [email protected]
From: Gobi, PiplNow ([email protected])
Reply to: [email protected]

Hello,

We have an urgent C2C opening for an AWS Data Engineer, a Day 1 onsite role at Newark, NJ (hybrid). Our client is looking to fill this role immediately. Locals are highly preferred.

If you are interested in this role, please share your updated resume, the filled-in skill matrix, consultant details, and copies of your visa and driver's license as soon as possible.

Skill Matrix (please provide years of experience for each):
- Overall experience
- Total years of work experience in the US
- As AWS Data Engineer
- Data lakes
- Data warehouses
- Python, shell scripting, and SQL
- Lakehouse/data cloud architecture
- SQL
- CloudFormation, S3, Athena, Glue, EMR/Spark, RDS, Redshift, DynamoDB, Elasticsearch, API Gateway, Step Functions, IAM, KMS, SM
- AWS Lambda
- ETL/ELT
- CI/CD
- EDP
- Amazon Kinesis, SQS
- Kafka
- Data ingestion pipelines

Consultant Details (please provide the consultant's data for each criterion):
- Full name
- Primary phone
- Primary email
- Education details: graduation
- Education details: masters
- Certifications, if any
- Passport number
- LinkedIn profile
- US work authorization and expiration
- Expected pay rate on C2C
- Current company name
- Current location (city/state)
- Willing to relocate (yes/no)
- Availability to join a new project / notice period
- Have you ever worked for or interviewed with this client in the past? If yes, as a consultant or as an employee?
- Last 5 digits of Social Security Number
- Birth month and day (NOT year)

Role: AWS Data Engineer
Location: Onsite role at Newark, NJ (hybrid)

Job Description:
- Experience implementing and supporting data lakes, data warehouses, and data applications on AWS for large enterprises
- Programming experience with Python, shell scripting, and SQL
- Solid experience with AWS services such as CloudFormation, S3, Athena, Glue, EMR/Spark, RDS, Redshift, DynamoDB, Lambda, Elasticsearch, API Gateway, Step Functions, IAM, KMS, SM, etc.
- Serverless application development using AWS Lambda
- Experience in system analysis, design, development, and implementation of data ingestion pipelines in AWS
- Knowledge of ETL/ELT
- End-to-end data solutions (ingest, storage, integration, processing, access) on AWS
- Architect and implement a CI/CD strategy for EDP
- Implement high-velocity streaming solutions using Amazon Kinesis, SQS, and Kafka (preferred)
- Migrate data from traditional relational database systems, file systems, and NAS shares to AWS relational databases such as Amazon RDS, Aurora, and Redshift
- Migrate data from APIs to the AWS data lake (S3) and to relational databases such as Amazon RDS, Aurora, and Redshift
- Implement POCs on any new technology or tool to be adopted on EDP and onboard it for a real use case
- AWS Solutions Architect or AWS Developer certification preferred
- Good understanding of lakehouse/data cloud architecture
- Build reliable and robust data ingestion pipelines (within AWS, on-prem to AWS, etc.)
- Participate in architecture and system design discussions
- Independently perform hands-on development and unit testing of the applications
- Collaborate with the development team to build individual components into complex enterprise web systems
- Work in a team environment with product, production operations, QE/QA, and cross-functional teams to deliver a project throughout the whole software development cycle
- Participate in code reviews to make sure standards and best practices are met
Posted: Wed Jun 05 00:24:00 UTC 2024