Enterprise ETL Architect at Enterprise, Utah, USA
Email: [email protected]
From: Madhu, STM Consulting Inc [email protected]
Reply to: [email protected]
Position: Enterprise Technical Solutions Architect (ETL, Business Intelligence, Big Data)
Duration: 6-12 months
Location: Dallas/Remote
Engagement: C2C or W2 (either is fine)

Job Description:
We are seeking a highly skilled and experienced Technical Architect specializing in big data technologies within the supply chain domain. As a Technical Architect, you will play a crucial role in designing, implementing, and optimizing big data solutions tailored to the unique challenges and requirements of the supply chain industry. You will collaborate closely with cross-functional teams to ensure the successful delivery of innovative, scalable data solutions that drive operational efficiency and strategic insight.

Responsibilities:
- Architectural Design: Architect end-to-end ETL solutions, including data ingestion, transformation, storage, and consumption layers, aligned with business requirements and industry best practices, preferably in the supply chain domain (two illustrative sketches of such a pipeline follow the Qualifications section below).
- Technology Evaluation: Stay abreast of the latest advancements in big data technologies and evaluate their suitability for addressing specific supply chain challenges. Recommend tools, frameworks, and platforms to enhance data processing, analysis, and visualization capabilities.
- Data Integration: Design robust data integration strategies to consolidate and harmonize data from disparate sources across the supply chain ecosystem, including ERP systems, IoT devices, logistics providers, and external data feeds.
- Data Modeling: Define and implement appropriate data models, schemas, and structures to support efficient storage, retrieval, and analysis of supply chain data. Apply best practices for data modeling in distributed and real-time processing environments.
- Performance Optimization: Identify performance bottlenecks and optimization opportunities within big data pipelines and workflows. Collaborate with development teams to implement optimizations and fine-tune system configurations for improved throughput and latency.
- Security and Compliance: Ensure that big data solutions adhere to industry standards and regulatory requirements for data security, privacy, and compliance. Implement encryption, access controls, and auditing mechanisms to safeguard sensitive supply chain data.
- Scalability and Resilience: Design scalable, fault-tolerant architectures that can accommodate growing volumes of supply chain data and withstand system failures or disruptions. Implement clustering, replication, and data partitioning techniques to enhance scalability and resilience.
- Collaboration and Communication: Work closely with stakeholders across business, IT, and data science teams to understand their requirements, priorities, and pain points. Communicate technical concepts and architectural decisions effectively to both technical and non-technical audiences.

Qualifications:
- Education: Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
- 15+ years of overall IT experience with a demonstrated track record of major solution design/architecture contributions in a large enterprise, preferably in the supply chain domain.
- Minimum of 5 years in customer-facing roles with a strong focus on data/integration operations.
- Expertise in ETL (Extract, Transform, Load) tools such as Microsoft SSIS (SQL Server Integration Services) or Apache NiFi/Airflow, and the ability to design and implement complex transformations in Python and Microsoft SQL Server.
- Deep understanding of distributed computing principles, data management concepts, and big data ecosystem components (e.g., Hadoop, Spark, Kafka, Delta Lake). Experience collaborating with data science teams.
- Strong understanding of batch integration methodologies using various data formats (CSV, XML, JSON, and Parquet), and hands-on experience with REST APIs.
- Hands-on experience with cloud platforms (e.g., AWS, Azure, GCP), containerization technologies (e.g., Docker, Kubernetes), and code management tools and frameworks (e.g., CI/CD, Git) for building and managing big data solutions.
- Strong analytical and problem-solving skills, with the ability to translate business requirements into technical solutions.
- Excellent communication and interpersonal skills, with the ability to collaborate effectively across teams and influence decision-making.
- Certifications in relevant technologies (e.g., AWS Certified Solutions Architect, Cloudera Certified Professional) are a plus.

Note: We also require expertise in Spark (data processing technology); see the sketch directly below.
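For illustration of the Spark and Parquet expertise noted above, here is a minimal PySpark sketch of a batch transformation of the kind this role involves. The S3 paths, the "shipments" dataset, and every column name are hypothetical assumptions invented for this example, not artifacts of the actual engagement.

```python
# Minimal PySpark batch-ETL sketch: read raw CSV, normalize, write
# partitioned Parquet. All paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("supply-chain-etl").getOrCreate()

# Ingest: raw CSV drops from an assumed landing zone.
raw = (spark.read
       .option("header", True)
       .option("inferSchema", True)
       .csv("s3://landing/shipments/*.csv"))  # hypothetical path

# Transform: cast the date, derive a month partition key, drop bad rows.
shipments = (raw
             .withColumn("ship_date", F.to_date("ship_date", "yyyy-MM-dd"))
             .withColumn("ship_month", F.date_format("ship_date", "yyyy-MM"))
             .filter(F.col("order_id").isNotNull()))

# Load: columnar Parquet, partitioned by month for scalable reads.
(shipments.write
 .mode("overwrite")
 .partitionBy("ship_month")
 .parquet("s3://curated/shipments/"))  # hypothetical path
```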
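Similarly, a minimal Apache Airflow sketch of how the ingestion, transformation, and consumption layers from the Responsibilities section might be orchestrated. The DAG id, schedule, and placeholder callables are assumptions for illustration; the schedule= argument assumes Airflow 2.4+ (earlier releases use schedule_interval=).

```python
# Minimal Airflow DAG sketch: ingest -> transform -> load, run daily.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():      # placeholder: pull raw extracts from source systems
    print("ingesting raw extracts")

def transform():   # placeholder: e.g., trigger the Spark job sketched above
    print("running transformations")

def load():        # placeholder: publish curated tables for consumption
    print("loading curated layer")

with DAG(
    dag_id="supply_chain_etl",       # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_ingest = PythonOperator(task_id="ingest", python_callable=ingest)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency chain: each layer runs after the previous succeeds.
    t_ingest >> t_transform >> t_load
```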
Wed Apr 17 21:57:00 UTC 2024