Data Engineer - Remote (Remote, USA)
Email: [email protected]
From: Vikas Rai, Samson Software Solution ([email protected])
Reply to: [email protected]
Position: Data Engineer
Location: Remote
Primary Skill: Data Fabric - Cloud Data Warehouse - Snowflake

Job Description:
Strong Snowflake data warehousing knowledge
Basic-to-medium-level Python development
ETL/ELT technologies (expertise in any ELT tool, preferably Matillion)
Basic understanding of the AWS cloud platform
Basic understanding of a BI tool

We are looking for a skilled Data Engineer to join our team. The ideal candidate will have experience in building and optimizing data pipelines, architectures, and data sets. You will be responsible for creating and maintaining scalable data architectures, ensuring data availability, and enabling data-driven decision-making. This role requires a deep understanding of data management practices, data warehousing, and ETL processes.

Key Responsibilities:
Data Pipeline Development: Design, construct, test, and maintain highly scalable data management systems. Develop and implement architectures that support the extraction, transformation, and loading (ETL) of data from various sources.
Data Integration: Integrate structured and unstructured data from multiple sources into a unified data system, ensuring data quality and consistency.
Data Warehousing: Build and maintain data warehouses and data lakes to store and retrieve vast amounts of data efficiently. Optimize the performance of databases and queries to meet business needs.
Data Processing: Implement data processing frameworks (e.g., Hadoop, Spark) to process large datasets in real time or in batch.
Automation and Monitoring: Automate manual processes, optimize data delivery, and develop data monitoring systems to ensure data integrity and accuracy.
Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data needs and provide technical solutions that meet business requirements.
Data Governance: Ensure data governance policies are followed, including data security, data privacy, and compliance with regulations.
Performance Tuning: Optimize the performance of ETL processes, databases, and data pipelines to handle large volumes of data and reduce processing times.

Required Qualifications:
Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
5+ years of experience as a Data Engineer or in a similar role.
Proficiency in Snowflake SQL and experience with relational databases (e.g., MySQL, PostgreSQL).
Knowledge of data warehousing solutions (e.g., Snowflake, Redshift, BigQuery).
Experience with cloud platforms (e.g., AWS, Google Cloud, Azure).
Hands-on experience with programming languages such as Python is good to have.
Strong understanding of the Matillion ETL tool and its processes.
Experience with data modeling and data architecture design.

Additional Responsibilities:
(1.) Provide technical guidance and solutions; define, advocate, and implement best practices and coding standards for the team.
(2.) Develop and guide team members in enhancing their technical capabilities and increasing productivity.
(3.) Ensure process compliance in the assigned module, and participate in technical discussions or reviews as a technical consultant for feasibility studies (technical alternatives, best packages, supporting architecture best practices, technical risks, breakdown into components, estimations).
(4.) Prepare and submit status reports to minimize exposure and risks on the project, or to close escalations.
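The role centers on basic-to-medium Python plus Snowflake-based ELT, so a rough sketch of that combination may be useful context for candidates. The snippet below stages a local CSV file and bulk-loads it into a Snowflake table using the snowflake-connector-python package; the account, credentials, file path, and ORDERS table are placeholder assumptions, not details taken from the posting.

# Illustrative only - not part of the job posting. A minimal ELT-style load of a
# local CSV file into a Snowflake table. All connection parameters, the file
# path, and the table name are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",    # placeholder Snowflake account identifier
    user="your_user",          # placeholder credentials
    password="your_password",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

cur = conn.cursor()
try:
    # Upload the file to the table's internal stage (@%ORDERS).
    cur.execute("PUT file:///tmp/orders.csv @%ORDERS AUTO_COMPRESS=TRUE")
    # Bulk-load from the table stage; COPY INTO defaults to @%ORDERS here.
    cur.execute(
        "COPY INTO ORDERS "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1) "
        "ON_ERROR = 'ABORT_STATEMENT'"
    )
    print(cur.fetchall())  # per-file load results returned by COPY INTO
finally:
    cur.close()
    conn.close()

In a Matillion-centered stack, the same PUT/COPY INTO pattern would typically sit behind an orchestration job rather than hand-written Python.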
Keywords: business intelligence, Data Engineer - Remote
Fri Sep 20 20:02:00 UTC 2024