
ETL Lead -- Remote || Telecom Background required (Remote, USA)
Email: markal.saikiran@4iamericas.com
Hi,

Greetings for the day!

This is Saikiran from 4iamericas. We have a requirement that might be interesting for you. Please review the requirement and let me know your thoughts on it.

Job Title: ETL Lead

Location: Remote

Duration: 9+ Months

Job Description

We are seeking an experienced ETL Lead with expertise in Google Cloud Platform (GCP) and a strong background in designing and managing data pipelines using BigQuery, Dataflow, Pub/Sub, and Databricks. The ideal candidate will have a deep understanding of ETL architecture, data integration, and cloud-native tools, with a focus on scalable, efficient solutions for large-scale data processing. Previous experience in the telecommunications domain is highly preferred. Strong proficiency in SQL and Python is a must.

Key Responsibilities:

ETL Architecture and Design:

Lead the design and implementation of end-to-end ETL processes leveraging GCP services (BigQuery, Dataflow, Pub/Sub, Dataproc) for data extraction, transformation, and loading.

Architect scalable, efficient, and fault-tolerant ETL pipelines that can handle large volumes of data and complex transformations.

Ensure ETL workflows are optimized for performance, cost-efficiency, and reliability in a cloud-native environment.

Cloud Platform and Data Solutions:

Build and maintain robust ETL pipelines using BigQuery, Dataflow, Pub/Sub, and Databricks to move, transform, and store data in GCP.

Design data pipelines that integrate seamlessly with cloud storage, databases, and other data services on GCP.

Utilize Pub/Sub for real-time data streaming and event-driven architectures.
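As a minimal illustration of the event-driven work described above, the sketch below shows the kind of pure-Python transform a Pub/Sub subscriber might apply to each message before loading: decode, validate, enrich. The payload fields (`event_id`, `event_type`) are hypothetical examples, not part of this posting.

```python
import json
from datetime import datetime, timezone

def transform_event(message_data: bytes) -> dict:
    """Decode a Pub/Sub-style message payload and normalize it for
    downstream loading. Field names here are assumed for illustration."""
    event = json.loads(message_data.decode("utf-8"))
    # Basic validation: reject events missing required keys.
    for key in ("event_id", "event_type"):
        if key not in event:
            raise ValueError(f"missing required field: {key}")
    # Enrich with an ingestion timestamp, as a loader often would.
    event["ingested_at"] = datetime.now(timezone.utc).isoformat()
    return event

# Example payload as it might arrive on a subscription.
payload = json.dumps({"event_id": "e-1", "event_type": "signup"}).encode()
print(transform_event(payload)["event_type"])
```

In a real pipeline this function would sit inside a subscriber callback or a Dataflow `DoFn`; the validation-and-enrich pattern is the same either way.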

Data Modeling and Transformation:

Develop and optimize data models and data transformation strategies that support high-performance analytics and business intelligence.

Ensure the data is structured, consistent, and ready for downstream analytics using BigQuery and other platforms.

Automation and CI/CD:

Implement CI/CD pipelines for the automation of ETL workflows and ensure efficient deployment of ETL jobs using tools such as Cloud Build or
similar.

Design automated testing and monitoring processes to ensure data quality, integrity, and system reliability.
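The automated data-quality testing mentioned above can be sketched, under assumed field names, as a simple pre-load check: required fields present and key values unique. This is an illustrative minimum, not a substitute for a full framework.

```python
def check_quality(rows, required, unique_key):
    """Run illustrative pre-load data-quality checks over row dicts:
    flag missing required fields and duplicate key values."""
    errors = []
    seen = set()
    for i, row in enumerate(rows):
        for field in required:
            if row.get(field) in (None, ""):
                errors.append(f"row {i}: missing {field}")
        key = row.get(unique_key)
        if key in seen:
            errors.append(f"row {i}: duplicate {unique_key}={key}")
        seen.add(key)
    return errors
```

A CI/CD pipeline would run checks like this against a staging table and fail the deployment if the error list is non-empty.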

Telecommunications Domain Expertise:

Bring an understanding of the telecommunications industry to design and optimize ETL solutions specific to telecom data needs, such as large-scale
call data records (CDRs), network logs, and customer usage data.

Work with telecom-specific data sources and data formats to build optimized ETL pipelines that meet the unique requirements of the industry.
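For a concrete sense of the telecom data above, here is a small sketch of parsing CSV-formatted call detail records (CDRs) into typed dicts. The column layout is an assumed example; real CDR schemas vary by switch vendor.

```python
import csv
from io import StringIO

# Hypothetical CDR column layout for this sketch.
CDR_FIELDS = ["caller", "callee", "start_ts", "duration_sec", "cell_id"]

def parse_cdrs(raw: str):
    """Parse newline-delimited CSV call detail records into dicts,
    casting the duration field so it can be aggregated downstream."""
    reader = csv.reader(StringIO(raw))
    records = []
    for row in reader:
        rec = dict(zip(CDR_FIELDS, row))
        rec["duration_sec"] = int(rec["duration_sec"])
        records.append(rec)
    return records
```

At telecom scale this per-record logic would run inside a Dataflow or Databricks job rather than a single-process loop, but the transformation itself is the same.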

Collaboration and Stakeholder Management:

Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand business needs and translate them
into data solutions.

Provide technical leadership and mentor junior team members, ensuring best practices in ETL development and cloud technologies.

Data Governance and Security:

Implement data governance policies and ensure compliance with security standards for managing sensitive data within the telecom industry.

Collaborate with data privacy and security teams to ensure data protection, especially with PII and other sensitive telecom data.

Required Skills & Qualifications:

Experience:

Minimum of 8 years of hands-on experience in ETL architecture, data engineering, and cloud technologies.

Strong proficiency in Google Cloud Platform (GCP) services, particularly BigQuery, Dataflow, Pub/Sub, and Dataproc.

Solid experience in Databricks for large-scale data processing and analytics.

Extensive experience working with SQL and Python for data manipulation, transformation, and automation.

Previous experience working within the telecommunications domain is highly preferred.

Technical Expertise:

Strong understanding of ETL frameworks, data pipeline orchestration, and cloud-native data processing.

Hands-on experience with GCP's BigQuery, Dataflow, and Pub/Sub for building data pipelines and streaming solutions.

Familiarity with Databricks for processing and transforming large datasets in a cloud environment.

Expertise in SQL for database management, querying, and optimization.

Advanced Python skills for building automation scripts, data transformations, and pipeline management.

Thanks and regards,

Saikiran

813-398-1913

Markal.saikiran@4iamericas.com

www.4iamericas.com

https://jobs.nvoids.com/job_details.jsp?id=2238784
12:18 AM 08-Mar-25

