From: simran@urbench.com
Reply to: simran@urbench.com

Job Title: Big Data Engineer
Location: Phoenix, AZ

Summary:
We are seeking a highly skilled Big Data Engineer to join our dynamic team. This role requires expertise in big data technologies, cloud platforms (specifically GCP), and strong programming and database skills to process, analyze, and manage large-scale datasets. You will work with various data pipelines, automation tools like Airflow, and scripting languages such as Python and SQL to enable data-driven decision-making.
Key Responsibilities:
Design, develop, and optimize scalable data pipelines for processing large volumes of data from multiple sources.
Develop and manage cloud-based data infrastructure on Google Cloud Platform (GCP), including services like BigQuery, Cloud Storage, Pub/Sub, Dataflow, and Dataproc.
Create and manage data workflows and automation using Apache Airflow to ensure efficient, fault-tolerant, and scheduled data processing.
Write efficient, reusable, and scalable Python scripts to process, analyze, and transform data across different stages of the pipeline.
Develop complex SQL queries for data extraction, transformation, and aggregation from large datasets in cloud-based databases.
Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions that meet business needs.
Monitor and troubleshoot data pipeline performance and resolve any issues related to data flow, system failures, or data quality.
Ensure proper data governance, security, and compliance in line with industry best practices.
Collaborate on designing and implementing best practices for handling large datasets, including ETL (Extract, Transform, Load) processes, performance tuning, and data integrity checks.
Continuously optimize and improve the efficiency and reliability of the data architecture.
Skills and Qualifications:
Big Data Technologies: Experience with tools and frameworks such as Hadoop, Spark, and Kafka for processing and managing large datasets.
Programming Languages: Strong proficiency in Python for data processing, automation, and system integration.
SQL: Advanced skills in writing optimized SQL queries for relational and non-relational databases (experience with BigQuery is a plus).
Cloud Experience: Hands-on experience with Google Cloud Platform (GCP), including services such as BigQuery, Cloud Storage, Pub/Sub, Dataflow, and Dataproc.
Data Orchestration: Proficiency in using Apache Airflow for building, scheduling, and monitoring data workflows and pipelines.
Data Warehousing: Experience with cloud data warehousing solutions, particularly GCP BigQuery, including data modeling, partitioning, and optimizing queries.
ETL Processes: Strong understanding of ETL concepts and the ability to design, implement, and optimize complex ETL workflows.
Data Processing & Optimization: Ability to optimize data processing tasks for performance and scalability, ensuring high throughput and low latency.
Version Control & Collaboration: Familiarity with version control tools like Git for code management and collaboration with other team members.
Problem-Solving: Strong troubleshooting skills and the ability to debug and resolve issues related to data pipelines, processing, and infrastructure.
Preferred Qualifications:
Bachelor's degree in Computer Science, Engineering, Data Science, or a related field.
Experience with other GCP tools such as Cloud Functions, Cloud Dataprep, or Cloud Composer.
Familiarity with containerization technologies like Docker and Kubernetes.
Experience with additional programming languages (e.g., Java, Scala) or tools (e.g., Terraform, Ansible) for cloud infrastructure management.
What We Offer:
Competitive salary and benefits package.
Opportunity to work with cutting-edge big data technologies in the cloud.
Collaborative and innovative work environment.
Career growth and professional development opportunities.

Kind Regards,
Simran Rathol | Sr. Recruiter
Direct: 
Email: Simran@urbench.com
