
Cloud Data Engineer || Remote (USA)
Email: [email protected]
Hi,

Hope you are doing great today.

Please find the requirement below; if you are comfortable with it, share your updated resume with contact details as soon as possible.

Job Title: Cloud Data Engineer

Location: Remote

Duration: 6-12+ Months

Visa: USC & GC only

Job Description:

We are looking for a Cloud Data Engineer who will be responsible for planning, designing, developing, and maintaining the data architecture, data models, data pipelines, and standards for various Data Integration, Data Lake, and Data Warehouse projects in the AWS Cloud. Ensure new features and subject areas are modeled to integrate with existing structures and provide a consistent view. Develop and maintain documentation of the data architecture, data flow, and data models of the data warehouse appropriate for various audiences. Provide direction on the adoption of cloud technologies (Redshift, Postgres, Snowflake) and industry best practices in the field of data lake and data warehouse architecture and modeling. Provide technical leadership to large enterprise-scale projects.

This role requires a broad range of skills and the ability to step into
different roles depending on the size and scope of the project.

Responsibilities:

Demonstrated ability to successfully complete multiple complex technical data engineering projects and to create the high-level design and architecture of the solution.

Able to present the solution architecture strategy and define its value proposition to key stakeholders.

Able to communicate the capabilities and improvements offered by solution
architecture with both the development teams and business user teams.

Develop POCs and conduct show-and-tell sessions.

Design, build, and deploy data pipelines using EMR, EMR Studio, Databricks, Spark, PySpark, and shell scripting.

Design data replication and integration systems by evaluating various products and selecting the technology that best meets corporate objectives.

Perform Data Analysis, Data Modeling, Data Management, and Data lineage.

Design data security framework considering both internal and external sharing
aspects.

Design data pipelines and automate data ingestion, cleansing, and transformation activities using DevOps methodologies.

Required Skills:

Must have 7+ total years of experience in IT, working as a Data Engineer.

Must have 5+ years of experience with EMR, Databricks, Spark, PySpark programming, Jupyter Notebook, and shell scripting.

3+ years in data warehouse and ETL projects.

Must have experience with end-to-end implementation of a cloud data warehouse (Redshift, Postgres, or Snowflake).

Hands-on experience with PostgreSQL, Redshift, Redshift Spectrum, AWS Glue, Athena, Snowflake utilities, SnowSQL, Snowpipe, and AWS Lambda modeling techniques using Python.

Experience in data migration from RDBMS to a Redshift/Snowflake cloud data warehouse.

Expertise in data modeling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL/ELT concepts.

Expertise in advanced concepts such as setting up resource monitors, role-based access control (RBAC), virtual warehouse sizing, query performance tuning, zero-copy cloning, and Time Travel, with an understanding of how to use these features.

Expertise in deploying features such as data sharing, events, and lakehouse patterns.

Deep understanding of relational as well as NoSQL data stores, and of modeling methods and approaches (star and snowflake schemas, dimensional modeling).

Experience with data security and data access controls and design.

Experience with AWS or Azure data storage and management technologies such as
S3 and ADLS.

Build processes supporting data transformation, data structures, metadata, dependency tracking, and workload management.

Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting.

Provide resolution to an extensive range of complicated data pipeline related
problems, proactively and as issues surface.

Must have expertise in AWS and Azure Platform as a Service (PaaS).

Should be able to troubleshoot problems across infrastructure, platform, and
application domains.

Must have experience with Agile development methodologies.

Strong communication skills; effective and persuasive in both written and oral communication.

Amit Vikal (AV)

Thoth IT LLC

--

Fri Sep 01 19:30:00 UTC 2023
