
Need Local Dallas, Texas candidates for GCP Data Engineer at Dallas, Texas, USA
Email: [email protected]
Hi,

Hope you are doing great.

Please review the requirement below and, if you are interested, send your updated resume in Word format.

Role: GCP Data Engineer

Location: Dallas, TX (Onsite, biweekly) - Local candidates only

Duration: 6+ Months

Rate: $58/hr on C2C

Responsibilities:

Design, develop, and maintain scalable data pipelines and data processing systems on the Google Cloud Platform (GCP).

Collaborate with data scientists, analysts, and other stakeholders to understand their data requirements and implement solutions accordingly.

Develop and optimize ETL processes to ensure efficient data ingestion, transformation, and loading.

Implement data governance and security measures to ensure data quality, integrity, and privacy.

Monitor and troubleshoot data pipelines to identify and resolve issues in a timely manner.

Work with cross-functional teams to integrate data from various sources and systems.

Conduct performance tuning and optimization of data processing jobs.

Stay updated with the latest trends and technologies in the field of data engineering and GCP services.

Focus Areas:

Building scalable and efficient data pipelines on the GCP.

Data governance and security.

Integration of data from multiple sources and systems.

Performance tuning and optimization.

Staying updated with emerging technologies and best practices in data engineering.

Key Skill Sets:

Data Engineering Work: The candidate should have experience in building data pipelines using Python/PySpark on GCP.

Dataproc Knowledge: The candidate should have working knowledge of Dataproc Serverless and ephemeral Dataproc clusters.

Airflow Expertise: Proficiency in Airflow is required.

BigQuery: The candidate must have strong SQL skills in BigQuery.

ML Experience: Experience in machine learning will be an added advantage.

Vertex AI: Knowledge of and hands-on experience with model building using Vertex AI will be an added advantage.

Qualifications we seek in you!

Minimum Qualifications:

Bachelor's degree in computer science, information systems, or a related field.

Experience in data engineering or a similar role.

Demonstrated experience in designing and implementing data pipelines using GCP services.

Proficiency in Python, SQL, and data manipulation techniques.

Strong understanding of cloud computing concepts and distributed systems.

Preferred Qualifications/skills:

Master's degree in computer science, information systems, or a related field.

Experience with other cloud platforms such as AWS or Azure.

Certification in GCP data engineering or related field.

Familiarity with machine learning concepts and frameworks.

Experience with real-time data processing and streaming technologies.

Thanks & Regards,

Mahesh 

Technical Recruiter

--

Fri Jul 19 02:48:00 UTC 2024
