GCP Data Engineer at Remote, Remote, USA
Email: [email protected]
Processing description:
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=2087250&uid=

Hi,

Hope you are doing well.

Please find attached the resume of my consultant, Aishwarya.

He is on .
Kindly let me know if it works for you.

Name: Aishwarya
Role: Data Engineer
Experience: 11+
Contact Number: 3413564890
Ready to go onsite: Yes

On Thu, Jan 16, 2025 at 11:44AM shalini lagishetty <[email protected]> wrote:

Hello Everyone,

I have the below-mentioned job description; kindly go through it:

The PySpark/GCP Data Engineer will create, deliver, and support custom data products, as well as enhance and expand team capabilities. They will analyze and manipulate large datasets, activating enterprise data assets to support Enabling Platforms and analytics. Google Cloud Data Engineers will be responsible for designing transformation and modernization efforts on Google Cloud Platform using GCP services.

Responsibilities:

Build frameworks and pipelines on GCP using Dataproc, PySpark, Kafka, and Pub/Sub

Implement schedules/workflows and tasks for Cloud Composer/Apache Airflow;

Create and manage data storage solutions using GCP services such as BigQuery, Cloud Storage, and Cloud SQL

Monitor and troubleshoot data pipelines and storage solutions using GCP's Cloud Monitoring and Cloud Logging (formerly Stackdriver)

Develop efficient ETL/ELT pipelines and orchestration using Cloud Dataprep and Google Cloud Composer

Develop and maintain data ingestion and transformation processes using Apache Spark (PySpark); see the PySpark sketch after this list

Automate data processing tasks using scripting languages such as Python or Bash

Ensure data security and compliance with industry standards by configuring IAM roles, service accounts, and access policies.

Automate cloud deployments and infrastructure management using Infrastructure as Code (IaC) tools such as Terraform or Google Cloud Deployment Manager.

Participate in code reviews, contribute to development best practices, and use developer-assist tools to create robust, fail-safe data pipelines.

Collaborate with Product Owners, Scrum Masters, and Data Analysts to deliver user stories and tasks and ensure deployment of pipelines.
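
To make the PySpark ingestion/transformation responsibilities above concrete, here is a minimal sketch of a Dataproc-style PySpark job that reads raw files from Cloud Storage, aggregates them, and writes the result to BigQuery. The bucket, dataset, and table names are hypothetical, and it assumes the spark-bigquery connector is available on the cluster.

# A minimal ETL sketch; bucket/table names are hypothetical and the
# spark-bigquery connector is assumed to be on the Dataproc cluster.
from pyspark.sql import SparkSession, functions as F

RAW_PATH = "gs://example-raw-bucket/orders/*.json"    # hypothetical source
BQ_TABLE = "example-project.analytics.orders_daily"   # hypothetical target

spark = SparkSession.builder.appName("orders-daily-aggregation").getOrCreate()

# Extract: read raw JSON files landed in Cloud Storage.
orders = spark.read.json(RAW_PATH)

# Transform: basic cleansing plus a daily aggregate.
daily = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date")
    .agg(F.count("*").alias("order_count"),
         F.sum("amount").alias("total_amount"))
)

# Load: write the aggregate to BigQuery via the spark-bigquery connector.
(daily.write
    .format("bigquery")
    .option("table", BQ_TABLE)
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("overwrite")
    .save())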

Experience required:

7+ years of application development experience required, using one of the core cloud platforms: AWS, Azure, or GCP

Minimum 1+ years of GCP experience, working in GCP-based big data deployments (batch/real-time) leveraging PySpark, BigQuery, Google Cloud Storage, Pub/Sub, Data Fusion, Dataproc, and Airflow;

Minimum 3+ years of coding skills in Python/PySpark and strong proficiency in SQL;

Extracting, loading, transforming, cleaning, and validating data, plus designing pipelines and architectures for data processing;

Architecting and implementing next-generation data and analytics platforms on GCP;

Experience in working with Agile and Lean methodologies;

Experience working with either a MapReduce or an MPP system at any size/scale;

Experience working in a CI/CD model to ensure automated orchestration of pipelines (a minimal Airflow DAG sketch follows this list).
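
As a rough illustration of the Cloud Composer/Airflow orchestration mentioned above, the following is a minimal DAG sketch that submits a PySpark job to a Dataproc cluster on a daily schedule. The project, cluster, and GCS paths are hypothetical placeholders.

# A minimal Cloud Composer / Airflow sketch; project, cluster, and paths are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

PYSPARK_JOB = {
    "reference": {"project_id": "example-project"},
    "placement": {"cluster_name": "example-etl-cluster"},
    "pyspark_job": {"main_python_file_uri": "gs://example-bucket/jobs/orders_pipeline.py"},
}

with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule_interval="0 6 * * *",   # run once a day at 06:00 UTC
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
) as dag:
    # Submit the PySpark ETL job to the Dataproc cluster.
    run_orders_etl = DataprocSubmitJobOperator(
        task_id="run_orders_etl",
        project_id="example-project",
        region="us-central1",
        job=PYSPARK_JOB,
    )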

Work Location:

No location constraints. Should work in EST/CST hours.

Thanks & Regards,

Shalini
Glenysys Technologies Inc.

https://www.linkedin.com/in/lagisetty-shalini-40877216a/
[email protected]

www.glenysys.com
75 Executive Centre, Suite 413, Aurora, IL 60504

--

Hari Priya

Bench sales recruiter

+1 (980) 888-1484

[email protected]

www.zavenit.com
1300 W Walnut Hill Ln, Suite 104,
Irving, TX 75038, USA

Attached File(s):
Aishwarya Gogula-DE_1737047371625.docx