Remote Opening_Lead GCP Data Engineer_10+ exp_No GC (Remote, USA)
Email: [email protected]
From: Deepanjan Chakraborty, Adame Services LLC [email protected]
Reply to: [email protected]

Job Title: Google Cloud Data Engineer
Location: Remote
Must Have: Python, Google Cloud, Dataproc, Dataflow, BigQuery

**Job Description:**

We are seeking a highly skilled Cloud Data Engineer with a minimum of 5 years of experience designing, building, and optimizing data pipelines. The ideal candidate will have a deep understanding of Google Cloud technologies, including Composer, Dataflow, and Dataproc clusters, as well as proficiency in Apache Beam, Hadoop, and BigQuery.

**Key Responsibilities:**

- Design, develop, and optimize data pipelines on Google Cloud Platform.
- Implement process automation and application development using Java, Python, Unix, and PL/SQL.
- Hands-on experience with database CDC ingestion and streaming ingestion applications such as Striim and NiFi is a strong plus.
- Collaborate with cross-functional teams to gather requirements and translate them into technical solutions.
- Follow Agile software development lifecycle disciplines, including analysis, design, coding, and testing.
- Develop high-performance, low-latency pipelines using Big Data technologies such as Hadoop, PySpark, and Apache Beam.

**Requirements:**

- Bachelor's degree in Computer Science, Engineering, or a related field.
- Minimum of 5 years of experience designing, building, and optimizing data pipelines.
- Proficiency in Google Cloud technologies such as Composer, Dataflow, and Dataproc clusters.
- Strong programming skills in Java, Python, Unix, and PL/SQL.
- Experience with database CDC ingestion and streaming ingestion applications is preferred.
- Familiarity with Agile software development practices.
- Experience with Big Data technologies and building high-performance, low-latency pipelines using Hadoop, PySpark, and Apache Beam.

**Additional Skills (Preferred):**

- Knowledge of streaming data technologies such as Kafka, Spark Streaming, or Flink.
- Experience with containerization technologies like Docker and Kubernetes.
- Familiarity with version control systems such as Git.
- Excellent problem-solving and analytical skills.
- Strong communication and teamwork abilities.
Thu May 16 23:53:00 UTC 2024