
GCP Data Engineer || Dallas, TX (Hybrid) || Long Term Contract
Email: [email protected]
GCP Certification is highly preferred; please mention if you have one.

Only local candidates or candidates near Dallas, TX (within a 50-mile radius).

Please mention your visa status and current location when sharing your resume.

Best rate: $60/hr on C2C

Job Title: GCP Data Engineer

Location: Dallas, TX (visit office biweekly)

Long Term Contract

Job Description:

Responsibilities:

1. Design, develop, and maintain scalable data pipelines and data processing systems on the Google Cloud Platform (GCP).

2. Collaborate with data scientists, analysts, and other stakeholders to understand their data requirements and implement solutions accordingly.

3. Develop and optimize ETL processes to ensure efficient data ingestion, transformation, and loading.

4. Implement data governance and security measures to ensure data quality, integrity, and privacy.

5. Monitor and troubleshoot data pipelines to identify and resolve issues in a timely manner.

6. Work with cross-functional teams to integrate data from various sources and systems.

7. Conduct performance tuning and optimization of data processing jobs.

8. Stay updated with the latest trends and technologies in the field of data engineering and GCP services.

Focus Areas:

Building scalable and efficient data pipelines on GCP.

Data governance and security.

Integration of data from multiple sources and systems.

Performance tuning and optimization.

Staying updated with emerging technologies and best practices in data engineering.

Key Skill Sets:

1. Data Engineering Work: The candidate should have experience building pipelines using Python/PySpark on GCP.

2. Dataproc Knowledge: The candidate should have working knowledge of serverless Dataproc and ephemeral Dataproc clusters.

3. Airflow Expertise: Proficiency in Airflow is required.

4. BigQuery: The candidate must be very strong in writing SQL.

5. ML Experience: Experience in machine learning will be an added advantage for this position.

6. Vertex AI: Knowledge of and working experience in model building using Vertex AI will be an added advantage.

Qualifications we seek in you!

Minimum Qualifications:

1. Bachelor's degree in computer science, information systems, or a related field.

2. Experience in data engineering or a similar role.

3. Demonstrated experience in designing and implementing data pipelines using GCP services.

4. Proficiency in Python, SQL, and data manipulation techniques.

5. Strong understanding of cloud computing concepts and distributed systems.

Preferred Qualifications/Skills:

1. Master's degree in computer science, information systems, or a related field.

2. Experience with other cloud platforms such as AWS or Azure.

3. Certification in GCP data engineering or a related field.

4. Familiarity with machine learning concepts and frameworks.

5. Experience with real-time data processing and streaming technologies.

Yours sincerely,

Ajay Sharma | Sr. Technical Recruiter.

Net2Source Inc.

Fax: (201) 221-8131 | Email: [email protected]

Global HQ Address: 270 Davidson Ave, Suite 704, Somerset, NJ 08873, USA

Web: www.net2source.com | Social: Facebook | Twitter | LinkedIn
