Urgent Hiring: Lead Data Engineer with ETL (StreamSets), Scottsdale, AZ (Onsite), Pima, Arizona, USA
Email: [email protected]
From: Sarvan Kumar, Rivago Infotech ([email protected])
Reply to: [email protected]

Position: Data/ETL Engineer (Technical Lead)
Location: 5801 N. Pima Rd, Scottsdale, AZ 85250 (Day one onsite)

Job Description:

Overview:
We are seeking a talented and motivated Software Developer with expertise in ETL (Extract, Transform, Load) processes, particularly within cloud environments. The ideal candidate will have strong Python programming skills, experience with cloud platforms (such as AWS, Azure, or GCP), and proficiency in containerization technologies like Kubernetes and Docker. This role requires a proactive problem solver who can design, develop, and optimize ETL pipelines while ensuring scalability, reliability, and efficiency.

Responsibilities:
- Design, develop, and maintain ETL pipelines to extract data from various sources, transform it according to business requirements, and load it into target data warehouses or databases.
- Collaborate with cross-functional teams to gather requirements, understand data sources, and define data transformation logic.
- Implement scalable and efficient ETL solutions within cloud environments, leveraging platform services and infrastructure as code.
- Optimize ETL processes for performance, reliability, and cost-effectiveness, considering factors like data volume, frequency, and latency requirements.
- Monitor and troubleshoot ETL pipelines to ensure data integrity, timely execution, and error handling.
- Stay up to date with emerging technologies and best practices in ETL, cloud computing, and data engineering, and propose innovative solutions to enhance the overall data ecosystem.
- Document ETL workflows, data mappings, and system configurations to facilitate knowledge sharing and maintainability.
- Provide technical guidance and support to junior team members, fostering a collaborative and learning-oriented environment.

Requirements:
- Proven experience in software development with expertise in the Python programming language.
- Hands-on experience with cloud platforms such as AWS, Azure, or GCP, including services like S3, EC2, Azure Data Factory, Google BigQuery, etc.
- Strong understanding of ETL concepts, methodologies, and best practices.
- Proficiency in containerization technologies such as Kubernetes and Docker.
- Experience with scripting languages for automation and orchestration tasks (e.g., Bash, PowerShell).
- Familiarity with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra).
- Excellent problem-solving skills, with a focus on delivering high-quality solutions in a fast-paced environment.
- Effective communication skills, with the ability to collaborate with cross-functional teams and articulate technical concepts to non-technical stakeholders.

Preferred Qualifications:
- Master's degree in Computer Science, Engineering, or a related field.
- Experience with data warehousing technologies and concepts (e.g., Snowflake, Redshift, star schema).
- Knowledge of data governance, security, and compliance standards in cloud environments.
- Certification in relevant cloud platforms (e.g., AWS Certified Developer, Azure Developer Associate, Google Cloud Professional Developer).
- Familiarity with agile development methodologies and DevOps practices.
- Contributions to open-source projects or active participation in developer communities.
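For candidates unfamiliar with the term, the extract-transform-load pattern this role centers on can be sketched minimally in Python using only the standard library. This is an illustrative sketch, not anything from the actual role: the inline CSV source, the `orders` table, and the cents-conversion rule are all hypothetical stand-ins for real sources, targets, and business logic.

```python
# Minimal ETL sketch (illustrative only): extract rows from a CSV source,
# transform them per a simple business rule, load them into SQLite.
# All names, columns, and rules here are hypothetical.
import csv
import io
import sqlite3

RAW_CSV = "order_id,amount_usd\n1,10.50\n2,3.25\n"  # stands in for a real source

def extract(text):
    """Extract: read rows out of the CSV source as dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: cast types and apply a hypothetical rule (store cents)."""
    return [(int(r["order_id"]), round(float(r["amount_usd"]) * 100))
            for r in rows]

def load(records, conn):
    """Load: write transformed records into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, cents INTEGER)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", records)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
print(conn.execute("SELECT COUNT(*), SUM(cents) FROM orders").fetchone())
# (2, 1375)
```

A production pipeline adds what the responsibilities above list on top of this skeleton: scheduling, monitoring, error handling, and cloud-native sources and sinks.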
Posted: Fri May 10 20:06:00 UTC 2024