
Remote Opening: Lead GCP Data Engineer, 10+ Years Experience, No GC (Remote, USA)
From: Deepanjan Chakraborty, Adame Services LLC

Email: [email protected]

Reply to: [email protected]

Job Title: Google Cloud Data Engineer

Location : Remote

Must Have: Python, Google Cloud, Dataproc, Dataflow, BigQuery

**Job Description:**

We are seeking a highly skilled Cloud Data Engineer with a minimum of 5 years of experience designing, building, and optimizing data pipelines. The ideal candidate will have a deep understanding of Google Cloud technologies including Cloud Composer, Dataflow, and Dataproc clusters, as well as proficiency in Apache Beam, Hadoop, and BigQuery.

**Key Responsibilities:**

- Design, develop, and optimize data pipelines on Google Cloud Platform.

- Implement process automation and application development using Java, Python, Unix, and PL/SQL.

- Apply hands-on experience with database CDC ingestion and streaming ingestion applications such as Striim and NiFi (a strong plus).

- Collaborate with cross-functional teams to gather requirements and translate them into technical solutions.

- Utilize Agile software development lifecycle disciplines including Analysis, Design, Coding, and Testing.

- Develop high-performance, low-latency pipelines using Big Data technologies such as Hadoop, PySpark, and Apache Beam.
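The pipeline work described above largely comes down to parallel map, group-by-key, and combine stages. As a purely illustrative sketch, here is that core shape using only the Python standard library with made-up sample records; in practice these stages would be Apache Beam transforms (beam.Map, beam.GroupByKey, beam.CombinePerKey) running on Dataflow:

```python
from collections import defaultdict

# Hypothetical sample records; a real pipeline would read these from
# Pub/Sub, Cloud Storage, or BigQuery via Apache Beam I/O connectors.
events = [
    {"user": "a", "bytes": 120},
    {"user": "b", "bytes": 300},
    {"user": "a", "bytes": 80},
]

def to_kv(event):
    # Map stage: emit (key, value) pairs, analogous to beam.Map.
    return event["user"], event["bytes"]

def group_by_key(pairs):
    # Shuffle stage: collect values per key, analogous to beam.GroupByKey.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def combine(grouped):
    # Combine stage: reduce each key's values, analogous to
    # beam.CombinePerKey(sum).
    return {key: sum(values) for key, values in grouped.items()}

totals = combine(group_by_key(map(to_kv, events)))
print(totals)  # {'a': 200, 'b': 300}
```

The same three-stage structure applies whether the source is batch (Dataproc/Hadoop) or streaming (Dataflow with windowing); only the runner and I/O change.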

**Requirements:**

- Bachelor's degree in Computer Science, Engineering, or related field.

- Minimum 5 years of experience in designing, building, and optimizing data pipelines.

- Proficiency in Google Cloud technologies such as Cloud Composer, Dataflow, and Dataproc clusters.

- Strong programming skills in Java, Python, Unix, and PL/SQL.

- Experience with database CDC ingestion and streaming ingestion applications is preferred.

- Familiarity with Agile software development practices.

- Experience with Big Data technologies and creating high-performance, low-latency pipelines using Hadoop, PySpark, and Apache Beam.

**Additional Skills (Preferred):**

- Knowledge of streaming data technologies such as Kafka, Spark Streaming, or Flink.

- Experience with containerization technologies like Docker and Kubernetes.

- Familiarity with version control systems such as Git.

- Excellent problem-solving and analytical skills.

- Strong communication and teamwork abilities.

Posted: Thu May 16 23:53:00 UTC 2024
