Required: AWS Solution Architect (Remote) | Chicago, Illinois, USA
Email: adilmomentousa1@gmail.com
Hello,

Momento USA is a global technology consulting, talent acquisition, and creative development firm that addresses clients' most pressing needs and challenges. We are currently looking for an AWS Solution Architect (Remote). Please see the job description below for your reference.

Position: AWS Solution Architect
Location: Chicago, IL (Remote)
Note: The client is looking for someone who has worked on the SynXis Central Reservation System (Hotel CRS solutions).

PURPOSE:

At Client, we're working to Advance Care through data-driven decisions and automation. This mission serves as the foundation for every decision as we create the future of travel. We can't do that without the best talent: talent that is innovative, curious, and driven to create exceptional experiences for our guests, customers, owners, and colleagues.

Client seeks an experienced Solution Architect who will be an exceptional addition to our growing Data Platform Architecture team. The Solution Architect will work closely with data engineering, data product management, and data science teams to meet the data requirements of various initiatives. As a Solution Architect, you will work on the creation of an Enterprise Data Platform for Client. As a valued member of the Data Platform Architecture team, you will shape and build new analytical solutions to deploy in support of our internal clients. You'll help create solutions to drive the next wave of innovation. You'll recommend tools and capabilities based on your research of the current environment and knowledge of various on-premises, cloud-based, and hybrid resources. You'll work with our engineering, architecture, and migration teams to inform strategy and architecture design. This is an opportunity to stay on top of the latest cloud resources as you lead efforts to prototype using multiple techniques and new technologies.
You'll be able to broaden your skillset into areas like automation, scripting, and containerization while developing critical systems for several of our internal clients. Join our team as we transform the way we manage data and information by taking advantage of cloud technology. You will be part of a ground-floor, hands-on, highly visible team that is positioned for growth and is highly collaborative and passionate about data.

We are looking for a highly motivated AWS expert who is excited about leveraging their skills and strategic ideas to improve mission execution, and who is passionate about creating an amazing user experience that delights end users and makes their jobs easier. This candidate builds fantastic relationships across all levels of the organization and is recognized as a problem solver who looks to elevate the work of everyone around them.

- Provides expert guidance to projects to ensure that their processes and deliverables align with the Hyatt target-state architecture.
- Designs and develops enterprise data solutions based on architecture and standards, leveraging leading architecture practices and advanced data technologies.
- Implements solutions for data migration, data delivery, and ML pipelines.
- Implements Identity and Access Management (IAM) roles and policies.
- Builds a resilient, reliable, performant, and secure data platform and applications.
- Works closely with application development and data engineering teams on day-to-day tasks, along with project planning and implementation.
- Automates deployments of AWS services and BI applications.

The ideal candidate demonstrates a commitment to Hyatt's core values: respect, integrity, humility, empathy, creativity, and fun.

TECHNICAL QUALIFICATIONS:

- 6+ years of experience in application/platform engineering or related technical work, including business intelligence and analytics.
- 4+ years of experience with senior AWS cloud data engineering, management, maintenance, or architecture, implementing best practices and industry standards.
- Experience with data warehousing platforms such as Snowflake, Redshift, or similar.
- Strong knowledge of and established experience with AWS services including, but not limited to: S3, EC2, RDS, Lambda, CloudFormation, Kinesis, Data Pipeline, EMR, Step Functions, VPC, IAM, and security groups.
- Experience with database technologies (e.g., SQL, Python, PostgreSQL, AWS Aurora, AWS RDS, MongoDB, Redis).
- Experience with CI/CD tools, pipelines, and scripting for automation (GitHub Actions, Jenkins, AWS CodePipeline, CloudFormation, and Terraform).
- High degree of knowledge of IAM roles and policies.
- Strong knowledge of configuring AWS cloud monitoring and alerts for cloud resource availability.
- Strong scripting experience using PowerShell and/or Python.
- High degree of knowledge of PaaS and SaaS application performance.
- Ability to understand enterprise-level application architecture diagrams and IT security requirements.

ADDITIONAL QUALIFICATIONS:

- Experience and comfort solving problems in an ambiguous environment where there is constant change.
- Tenacity to thrive in a dynamic and fast-paced environment, inspire change, and collaborate with a variety of individuals and organizational partners.
- Experience designing and building scalable and robust data pipelines to enable data-driven decisions for the business.
- Effective problem-solving and analytical skills; ability to manage multiple projects and reports simultaneously across different stakeholders.
- Rigorous attention to detail and accuracy.
- Demonstrated ability to troubleshoot technical problems and issues.
- Passion for programming and learning new technologies.
- Experience planning and executing on-premises-to-AWS migrations.
- BA or BS degree in Computer Engineering, Computer Science, or related fields.
- Strong verbal and written communication.
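The qualifications above repeatedly emphasize hands-on work with IAM roles and policies alongside Python scripting. As a minimal illustrative sketch of that combination (the bucket name and statement IDs are hypothetical examples, not from this posting), a least-privilege S3 read-only policy document can be built and serialized in plain Python:

```python
import json


def s3_read_only_policy(bucket: str) -> dict:
    """Build a least-privilege IAM policy document granting read-only
    access to a single S3 bucket. The bucket name is a hypothetical
    example; attach the result via the AWS CLI or an IaC tool."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                # Listing objects applies to the bucket ARN itself.
                "Sid": "ListBucket",
                "Effect": "Allow",
                "Action": ["s3:ListBucket"],
                "Resource": [f"arn:aws:s3:::{bucket}"],
            },
            {
                # Reading object contents applies to objects under the bucket.
                "Sid": "GetObjects",
                "Effect": "Allow",
                "Action": ["s3:GetObject"],
                "Resource": [f"arn:aws:s3:::{bucket}/*"],
            },
        ],
    }


# Serialize the policy for use outside Python (CLI, Terraform, etc.).
policy_json = json.dumps(s3_read_only_policy("example-data-bucket"), indent=2)
print(policy_json)
```

Splitting bucket-level and object-level permissions into two statements is deliberate: `s3:ListBucket` applies to the bucket ARN, while `s3:GetObject` applies only to object ARNs under it.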
PREFERRED EXPERIENCE:

- Experience with large-scale enterprise streaming services such as Kafka.
- Experience with Kubernetes and Docker containers, or AWS Fargate.
- Experience implementing applications on both Windows and Linux server OSes.
- Experience with networking, security groups, or policy management in relation to cloud resources across multiple operating systems, including UNIX, Linux, or Windows.
- Advanced CS degree.

Position Responsibilities / Deliverables:

- Design and build efficient and reliable data pipelines to move data across different data systems, the enterprise data warehouse (Snowflake), Oracle, and external partner systems through APIs and AWS S3 buckets.
- Be an expert in cloud data warehouses and cloud ETL tools and capabilities to engineer solutions and automate large-scale data flows from varying systems/sources.
- Diligently work with data scientists, other internal data groups, and business partners to build data solutions that will support a variety of predictive and reporting applications.
- Infrastructure component setup: install infrastructure components and third-party software on-prem and establish connectivity with hosted apps for enterprise integrations and automation.
- Strong understanding of real-time data integration with tools like Snowpipe, Kafka, and the Snowflake sink connector.
- Knowledge of Oracle Cloud.
- Experience deploying software components, e.g., on-prem software agents and private links connecting to cloud apps like Snowflake, Kafka, and Jira.
- Experience with integration tools like SnapLogic, HVR, Matillion, etc.
- Continuous Integration/Continuous Deployment (CI/CD): set up pipelines for automated deployment, testing, and integration of applications using tools like GitHub, GitLab, and Jenkins.
- Experience in the hospitality industry would be a plus, with experience on the Sabre SynXis CRS.
- Knowledge of Infrastructure as Code (IaC) using Terraform, CloudFormation, etc.

Thanks,
Adil M
Sr. Technical Lead
Momento USA | Exceeding Customer Expectations
Email: adil@momentousa.com
12:40 AM 06-Mar-24