Remote Lead GCP Data Engineer | Remote, USA
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=1260679&uid=

From: Utsav (IT Resource Manager), ChabezTech [email protected]
Reply to: [email protected]

Job Title: GCP Data Engineer
Location: Remote

Position Overview:
As a GCP Data Engineer, you will play a pivotal role in our data engineering and platform support team. You will design, develop, and maintain robust data pipelines using tools such as Airflow, Spark, and Dataflow. You will also provide platform support, assisting data analysts and data scientists with SQL fine-tuning and with setting up their Spark environments. The ideal candidate has a strong background in clustering hardware and infrastructure, along with expertise in GCP services and the Spark ecosystem.

Key Responsibilities:
- Design, develop, and maintain data pipelines using Airflow, Spark, and Dataflow.
- Use BigQuery for data warehousing and analysis.
- Integrate RDBMS, NoSQL databases, and AWS S3 with GCP services.
- Establish connections between on-premises distribution systems and Google Cloud.
- Implement dynamic parameters and task groups in Airflow for efficient workflow management.
- Monitor and fine-tune the performance of Airflow jobs and Spark processes.
- Execute complex SQL queries and optimize query performance.
- Perform data manipulation and analysis using PySpark DataFrame operations.
- Implement windowing and ranking functions for advanced data processing.
- Identify and handle null values in Spark SQL queries.
- Mentor junior team members and provide technical guidance.
- Assist platform users with pipeline maintenance and SQL fine-tuning.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Stay current with the latest trends and advancements in GCP and data engineering technologies.

Qualifications:
- Bachelor's degree in computer science or equivalent experience.
- 5+ years of experience as a Data Engineer with a focus on GCP.
- Proficiency in Airflow, Spark, SQL, Dataflow, and BigQuery.
- Strong understanding of RDBMS and NoSQL databases.
- Experience integrating AWS S3 with GCP services.
- Knowledge of Apache Hive, Apache Kafka, Scala, and Hadoop ecosystem components.
- Hands-on experience with clustering hardware and infrastructure.
- Excellent communication and collaboration skills.
- Ability to thrive in a fast-paced, dynamic environment.

Thanks & Regards,
Utsav
Manager, ChabezTech LLC
4 Lemoyne Dr #102, Lemoyne, PA 17043, USA
Office: 717-441-5440
Email: [email protected]
09:35 PM 28-Mar-24