
Position: Data Engineer (Hybrid Role; preferred locations below)
Location: Remote, USA
From: Raj, Techgene Solutions LLC
Email: [email protected]
Reply to: [email protected]

Hello,

I hope you are well. I am Raj with Techgene Solutions LLC. We are currently hiring for a

Data Engineer

position, and I think your experience is a great fit for this role. If you're interested in learning more, I'd love to connect with you. Please review the job description below and let me know if you are comfortable with it. I look forward to hearing from you soon. Thank you!

Please share your profile with [email protected], or you can reach me at 972-580-0247 Ext. 253 (Mobile: 469-436-7175).

This will be a Hybrid Role, with preferred locations as follows:

700 Hidden Ridge, Irving, TX 75038

7701 E Telecom Pkwy, Temple Terrace, FL 33637

5055 North Point Pkwy, Alpharetta, GA 30022

201 Centennial Ave, Piscataway, NJ 08854

One Verizon Way, Basking Ridge, NJ 07920

Candidates must have valid work authorization to work in the US and must work from within the US for US-based positions.

Role : Data Engineer

Visa : Any visa is fine

Vendor : Prodapt 

Client : Verizon

This is a Hybrid role: candidates must work on a hybrid basis from any one of the above locations.

We are looking for GCP Data Engineers in the US (8+ years of overall experience, including at least 1 year of GCP data engineering experience).

Following is the JD:

Build and maintain data management workflows and data ingestion pipelines for batch, micro-batch, and real-time streaming on BigQuery with Google Cloud

GCP Certified developer, or expertise in BigQuery and Dataproc

Experience in building data ingestion pipelines for batch, micro-batch, and real-time streaming on big data/Hadoop platforms

Hands-on experience with Hadoop big data tools: HDFS, Hive, Presto, Apache NiFi, Sqoop, Spark, Logstash, Elasticsearch, Kafka & Pulsar

Experience in collecting data from a Kafka/Pulsar message bus and transporting it to public/private cloud platforms using NiFi, Data Highway, and Logstash technologies

Experience in building CI/CD pipelines & DevOps is preferred

Development experience with Agile Scrum/SAFe methodology

Thu Dec 08 23:18:00 UTC 2022
