Sr. GCP Data Engineer (Remote, EST time zone) at Remote, USA

From: Saravanaraj, Smart IT Frame

Email: [email protected]

Reply to: [email protected]

Hi,

Greetings from Smart IT Frame.

Below is a great opportunity. Please take a look; if you are interested, send me your updated resume along with your contact information as soon as possible. If you are not available, please feel free to forward this email to friends or colleagues who might be interested.

Role: Sr. GCP Data Engineer

Location: Remote (EST time zone)

Duration: Contract

Job Description:

- Minimum 8 years of experience in Data Engineering projects.
- Minimum 5 years of experience in GCP.
- Minimum 8 years of experience in SQL/PLSQL scripting.
- Minimum 5 years of experience in ETL and data warehousing.
- Ability to build batching solutions.
- Exposure to project management tools such as JIRA, Confluence, and Git.
- Strong problem-solving and analytical skills.
- Good communication skills.
- Strong exposure to and experience with BigQuery, Composer, Python, and CI/CD pipelines.
- Good understanding of distributed data platforms.
- Should have worked as a Sr. Data Engineer on a medium/large-scale data warehouse solution.
- Experience migrating legacy data warehousing solutions to GCP.
- Deep exposure to and hands-on experience with GCP cloud-native ETL/ELT services, with a deep understanding of BigQuery, Looker, or another reporting platform.
- In-depth knowledge and hands-on development experience operationalizing large-scale ingestion, processing, and consumption using Dataproc, Dataflow, or Cloud Data Fusion.
- Strong understanding of and experience with storage infrastructure, event-based architecture using Cloud Functions, and GCP's Monitoring, Logging, and Auditing services.
- Strong experience with one or more MPP data warehouse platforms, preferably BigQuery, Cloud SQL, Cloud Spanner, Firestore, or similar.
- Strong development experience on at least one event-driven streaming platform, preferably Pub/Sub or Kafka.
- Strong data orchestration experience using tools such as Cloud Functions, Dataflow, Cloud Composer, Apache Airflow, or related.
- Familiarity with GCP data migration programs that enable identification of potential risks in time for technical interventions.

Value-add skills:

- GCP Professional Data Engineer certification is an added advantage.
- Understanding of Terraform scripts.
- Understanding of DevOps pipelines.

Keywords: information technology
[email protected]
View all
Tue Nov 07 05:24:00 UTC 2023
