
Lead Data Engineer at Remote, USA
Email: [email protected]
From: Saloni Chaurasia, TEK Inspirations
[email protected]

Reply to: [email protected]

Hello,

I hope you are doing great. Please find the position below; if you have a matching candidate per the requirements, please send me the candidate's updated resume along with their information.

Lead Data Engineer

Location: Hybrid 2-3 days a week in Newark, NJ

Duration: 6-month contract to hire

Candidates must be local to NY/NJ, within 45-50 minutes of driving distance

The Data Engineer Lead role needs AWS Architect-level experience with familiarity with development.

Must Have: 

Candidates really need to be heavy on AWS

Need 3 to 5 years of experience as a Lead

Need a profile with 12-15 years of overall experience

Qualifications 

Bachelor's degree in Computer Science, Software Engineering, MIS or equivalent combination of education and experience

Experience implementing and supporting data lakes, data warehouses, and data applications on AWS for large enterprises

Programming experience with Java, Python/Scala, and shell scripting

Solid experience with AWS services such as CloudFormation, S3, Glue, EMR/Spark, RDS, Redshift, DynamoDB, Lambda, Step Functions, IAM, KMS, SM, etc.

Solid experience implementing solutions on AWS-based data lakes

Experience implementing metadata solutions leveraging AWS non-relational data solutions such as ElastiCache and DynamoDB

AWS Solutions Architect or AWS Big Data Certification 

Experience in AWS data lake/data warehouse/business analytics

Experience with and understanding of various core AWS services such as IAM, CloudFormation, EC2, S3, EMR/Spark, Glue, DataSync, CloudHealth, CloudWatch, Lambda, Athena, and Redshift

Experience in system analysis, design, development, and implementation of data ingestion pipeline in AWS

Experience with DevOps and Continuous Integration/Delivery (CI/CD) concepts and tools

Experience with business intelligence tools such as Tableau, Power BI or equivalent

Knowledge of ETL/ELT

Experience in production support from Level 1 to Level 3

Awareness of Data Management & Governance tools

Working experience with Hadoop, HDFS, Sqoop, Hive, Python, and Spark

Experience working on Agile projects

Implement and support:

end-to-end data lake/warehousing/mart/business intelligence/analytics/services solutions (ingest, storage, integration, processing, services, access) in AWS

data lake data intake/request/onboarding services and service documentation

data lake ingestion services for batch/real time data ingest and service documentation

data processing services (ETL/ELT) for batch/real time (Glue/Kinesis/EMR) and service documentation

data storage services for data lake (S3)/data warehouses (RDS/Redshift)/data marts and service documentation

data services layer including Athena, Redshift, RDS, microservices and APIs

pipeline orchestration services including Lambda, Step Functions, MWAA (optional)

data security services (IAM/KMS/SM/encryption/anonymization/RBAC) and service documentation

data access provisioning services (accounts, IAM roles, RBAC), processes, documentation, and education

data provisioning services for data consumption patterns including microservices, APIs and extracts

metadata capture and catalog services for data lake (S3/Athena), data warehouses (RDS/Redshift), microservices/APIs

metadata capture and catalog services for pipeline/log data for monitoring/support
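As a rough illustration of the batch processing services listed above, here is a minimal Python sketch of a transform step of the kind a Glue or EMR job might run before landing data in the lake. The record shape, the field names (`id`, `event_ts`, `payload`), and the `dt` partition-key convention are assumptions for illustration, not part of this posting.

```python
from datetime import datetime, timezone

def transform_batch(raw_records):
    """Clean a batch of raw ingest records before landing them in the lake.

    Drops records missing an id, normalizes epoch timestamps to UTC
    ISO-8601, and derives a dt partition key of the kind commonly used
    for S3 prefixes (e.g. s3://bucket/table/dt=2024-04-19/).
    """
    cleaned = []
    for rec in raw_records:
        if not rec.get("id"):
            continue  # would be routed to a quarantine path in a real job
        ts = datetime.fromtimestamp(rec["event_ts"], tz=timezone.utc)
        cleaned.append({
            "id": rec["id"],
            "event_time": ts.isoformat(),
            "dt": ts.strftime("%Y-%m-%d"),  # hypothetical partition key
            "payload": rec.get("payload", {}),
        })
    return cleaned
```

In a Glue/EMR context the same logic would typically operate on a DynamicFrame or Spark DataFrame rather than plain dicts; the plain-Python version just shows the shape of the cleansing step.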

Implement CI/CD pipelines 

Prepare documentation for data projects utilizing the AWS-based enterprise data platform

Implement high-velocity streaming solutions using Amazon Kinesis, SQS, and SNS

Migrate data from traditional relational database systems to AWS relational databases such as Amazon RDS, Aurora, and Redshift

Migrate data from traditional file systems and NAS shares to AWS data lake (S3) and relational databases such as Amazon RDS, Aurora, and Redshift

Migrate data from APIs to AWS data lake (S3) and relational databases such as Amazon RDS, Aurora, and Redshift

Provide cost/spend monitoring and reporting for AWS-based data platform initiatives

Provide governance/audit reporting for access to the AWS-based data platform

Lead the implementation of a data lake strategy to enable LOBs and Corporate Functions with a robust, holistic view of data-driven decision making

Serve as delivery lead for EDP data initiatives alongside the product owner

Partner with the immediate engineering team, product owner, IT, and partners on the EDP agenda

Provide technology thought leadership, consulting, and coaching/mentoring

Establish development, QA, stage, and production migration/support processes

Establish best practices for development and support teams

Deliver end-to-end data initiatives from ingest to consume via microservices/APIs, JDBC/ODBC, file extracts, etc.

Work with the scrum master to develop and own the backlog, stories, epics, and sprints
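The cost/spend monitoring responsibility above could be prototyped as a small aggregation over Cost-Explorer-style records. This is a hedged sketch only: the record shape and the `Project` tag key are hypothetical choices for illustration, not anything specified in this posting.

```python
from collections import defaultdict

def spend_by_tag(cost_records, tag_key="Project"):
    """Aggregate AWS spend by a resource tag.

    cost_records is a list of dicts shaped like simplified Cost Explorer
    output: {"tags": {...}, "amount_usd": float}. Untagged spend is
    grouped under "(untagged)" so it stays visible in governance reports.
    """
    totals = defaultdict(float)
    for rec in cost_records:
        key = rec.get("tags", {}).get(tag_key, "(untagged)")
        totals[key] += rec["amount_usd"]
    return dict(totals)
```

A production version would pull the records from the Cost Explorer API and publish the totals per LOB; the aggregation logic would be the same.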

Regards,

Saloni Chaurasia

Technical Recruiter

TEK Inspirations LLC Pvt. Ltd.

13573 Tabasco Cat Trail, Frisco, TX 75035, United States

E-Mail: [email protected]

Keywords: continuous integration continuous deployment quality analyst business intelligence sthree information technology New Jersey New York Texas
Fri Apr 19 02:07:00 UTC 2024


