GCP Data Engineer (Local to GA) at Remote, USA
Email: [email protected]
From: supriya, Nitya Software Solutions [email protected]
Reply to: [email protected]

Job Description: Data Engineer with GCP (Verizon), Onsite WI

The Artificial Intelligence and Data team is looking for BI engineers with expert-level experience in developing enterprise software applications on Google Cloud.

Google Cloud Dataflow Engineer (3-4 Years of Experience)

Position Summary:
Seeking a skilled Google Cloud Dataflow Engineer with a proven track record of leveraging Google Cloud Platform (GCP) tools to design, build, and maintain scalable and reliable data solutions. The ideal candidate will possess deep expertise in GCP tools, especially Dataflow, BigQuery, and Cloud Composer, and will work collaboratively with cross-functional teams to address data-related technical challenges and enhance our data infrastructure.

Key Responsibilities:
1. Design, build, and deploy scalable and robust data pipelines using Google Cloud Dataflow.
2. Harness the capabilities of BigQuery for data analytics, ensuring optimized performance and cost efficiency.
3. Manage workflow orchestration and automation using Cloud Composer.
4. Monitor, troubleshoot, and optimize data pipelines for performance, ensuring data quality and integrity.
5. Stay current with GCP's latest features and best practices so the company's data infrastructure remains cutting-edge.
6. Document data architectures, processes, and data lineage for transparency and maintainability.

Qualifications:
1. 3-4 years of experience as a data engineer with significant exposure to the Google Cloud Platform.
2. Strong expertise in GCP tools, particularly Dataflow, BigQuery, and Cloud Composer.
3. Strong GCP Dataflow, Java/Python, and Apache Beam skills; lead experience on delivery streams.
4. Experience working with streaming/messaging systems such as Kafka, Pulsar, GCP Pub/Sub, and RabbitMQ, including connectors for systems like Cassandra.
5. Familiarity with other GCP services such as Cloud Storage and Dataproc.
6. Strong analytical and problem-solving skills.
7. Familiarity with other cloud platforms (e.g., AWS, Azure) is a plus.
8. Excellent communication skills, both written and verbal.
9. Bachelor's degree in Computer Science, Engineering, or a related field.

Additional Requirements:
1. Demonstrated ability to work in a fast-paced environment, managing multiple projects simultaneously.
2. Commitment to continuous learning and adapting to the rapidly evolving data landscape.
3. Proven ability to collaborate effectively with both technical and non-technical stakeholders.

Keywords: business intelligence, Wisconsin
Thu Mar 28 21:16:00 UTC 2024