Solution Architect / Sr. Data Platform Engineer - Chicago, IL at Chicago, Illinois, USA
Email: [email protected]
From: Mathews Alexander, BLACKAPPLE SOLUTIONS LLC [email protected]
Reply to: [email protected]
Position: Solution Architect / Sr. Data Platform Engineer
Location: Chicago, IL (remote for candidates in other locations)
Duration: 6 months
Note: Should be a strong solution architect. Experience with the Sabre application is good to have (a candidate with Sabre experience gives us an edge). Must have strong experience with AWS, databases, and APIs. Hospitality industry experience preferred.

Job Description:

PURPOSE:
As a Solution Architect, you will work on the creation of the Enterprise Data Platform for Hyatt. As a valued member of the Data Platform Architecture team, you will shape and build new analytical solutions deployed in support of our internal clients.
- Provide expert guidance to projects to ensure that their processes and deliverables align with the Hyatt target-state architecture.
- Design and develop enterprise data solutions based on architecture and standards, leveraging leading architecture practices and advanced data technologies.
- Implement solutions for data migration, data delivery, and ML pipelines.
- Implement Identity and Access Management (IAM) roles and policies (a minimal sketch follows this section).
- Build a resilient, reliable, performant, and secure data platform and applications.
- Work closely with application development and data engineering teams on day-to-day tasks as well as project planning and implementation.
- Automate deployments of AWS services and BI applications.

TECHNICAL QUALIFICATIONS:
- 6+ years of experience in application/platform engineering or related technical work, including business intelligence and analytics.
- 4+ years of senior cloud data engineering experience with AWS: management, maintenance, or architecture, implementing best practices and industry standards.
- Experience with data warehousing platforms such as Snowflake, Redshift, or similar.
- Strong knowledge of and established experience with AWS services including, but not limited to: S3, EC2, RDS, Lambda, CloudFormation, Kinesis, Data Pipeline, EMR, Step Functions, VPC, IAM, and Security Groups.
- Experience with database technologies (e.g., SQL, Python, PostgreSQL, AWS Aurora, AWS RDS, MongoDB, Redis).
- Experience with CI/CD tools, pipelines, and scripting for automation (GitHub Actions, Jenkins, AWS CodePipeline, CloudFormation, and Terraform).
- High degree of knowledge of IAM roles and policies.
- Strong knowledge of configuring AWS cloud monitoring and alerts for cloud resource availability.
- Strong scripting experience using PowerShell and/or Python.
- High degree of knowledge of PaaS and SaaS application performance.
- Ability to understand enterprise-level application architecture diagrams and IT security requirements.

ADDITIONAL QUALIFICATIONS:
- Experience and comfort solving problems in an ambiguous environment where there is constant change.
- Tenacity to thrive in a dynamic, fast-paced environment, inspire change, and collaborate with a variety of individuals and organizational partners.
- Experience designing and building scalable, robust data pipelines that enable data-driven decisions for the business.
- Effective problem-solving and analytical skills.
- Ability to manage multiple projects and report simultaneously to different stakeholders.
- Rigorous attention to detail and accuracy.
- Demonstrated ability to troubleshoot technical problems and issues.
- Passion for programming and learning new technologies.
- Experience planning and executing on-premises to AWS migrations.
- BA or BS degree in Computer Engineering, Computer Science, or related fields.
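The sketch below illustrates the kind of IAM role-and-policy work named above, using Python with boto3 (both appear in the qualifications). The role name, bucket name, and Lambda trust relationship are illustrative assumptions, not details from the posting.

    # iam_role_sketch.py - illustrative only; names and scopes are assumptions.
    import json
    import boto3

    iam = boto3.client("iam")

    # Trust policy letting AWS Lambda assume the role (assumed service; adjust as needed).
    trust_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "lambda.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }],
    }

    # Least-privilege inline policy scoped to a hypothetical data-platform bucket.
    s3_read_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::example-data-platform-bucket",
                "arn:aws:s3:::example-data-platform-bucket/*",
            ],
        }],
    }

    role = iam.create_role(
        RoleName="example-data-platform-role",
        AssumeRolePolicyDocument=json.dumps(trust_policy),
        Description="Illustrative role for data-platform Lambda functions.",
    )

    iam.put_role_policy(
        RoleName="example-data-platform-role",
        PolicyName="example-s3-read-only",
        PolicyDocument=json.dumps(s3_read_policy),
    )

    print("Created role:", role["Role"]["Arn"])

In practice the same role and policy would more likely be declared in CloudFormation or Terraform (also listed above) so deployments stay automated and reviewable.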
- Strong verbal and written communication.

PREFERRED EXPERIENCE:
- Experience with large-scale enterprise streaming services such as Kafka.
- Experience with Kubernetes and Docker containers, or AWS Fargate.
- Experience implementing applications on both Windows and Linux server operating systems.
- Experience with networking, security groups, or policy management for cloud resources across multiple operating systems, including UNIX, Linux, or Windows.
- Advanced CS degree.

POSITION RESPONSIBILITIES:
- Design and build efficient and reliable data pipelines to move data across different data systems, the enterprise data warehouse (Snowflake), Oracle, and external partner systems through APIs and AWS S3 buckets (a minimal sketch follows this section).
- Be an expert in cloud data warehouse and cloud ETL tools and capabilities in order to engineer solutions and automate large-scale data flows from varying systems and sources.
- Work diligently with data scientists, other internal data groups, and business partners to build data solutions that support a variety of predictive and reporting applications.
- Infrastructure components setup: install infrastructure components and third-party software on-prem and establish connectivity with hosted apps for enterprise integrations and automation.
- Strong understanding of real-time data integration with tools like Snowpipe, Kafka, and the Snowflake sink connector.
- Knowledge of Oracle Cloud.
- Experience deploying software components, e.g., on-prem software agents and private links connecting to cloud apps like Snowflake, Kafka, and Jira.
- Experience with integration tools like SnapLogic, HVR, Matillion, etc.
- Continuous Integration/Continuous Deployment (CI/CD): set up pipelines for automated deployment, testing, and integration of applications using tools like GitHub, GitLab, and Jenkins.
- Experience in the hospitality industry would be a plus, ideally with the Sabre SynXis CRS.
- Knowledge of Infrastructure as Code (IaC) using Terraform, CloudFormation, etc.
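As a companion to the pipeline responsibility above, here is a minimal Python sketch of landing a partner extract in S3 and loading it into Snowflake with a COPY INTO, assuming the bucket, external stage, table, and credentials shown (all hypothetical). In a Snowpipe setup, the COPY would instead be defined once in a pipe and triggered by S3 event notifications.

    # s3_to_snowflake_sketch.py - illustrative only; bucket, stage, and table names are assumptions.
    import boto3
    import snowflake.connector

    BUCKET = "example-data-platform-bucket"
    KEY = "bookings/2024-03-25/bookings.csv"

    # 1. Land the extract in S3, the delivery point for partner data in this posting.
    boto3.client("s3").upload_file("bookings.csv", BUCKET, KEY)

    # 2. Load it into Snowflake. Assumes an external stage @raw_bookings_stage
    #    already points at the bucket.
    conn = snowflake.connector.connect(
        account="example_account",
        user="example_user",
        password="example_password",
        warehouse="LOAD_WH",
        database="RAW",
        schema="BOOKINGS",
    )
    try:
        conn.cursor().execute(
            "COPY INTO raw_bookings "
            "FROM @raw_bookings_stage/bookings/2024-03-25/ "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
    finally:
        conn.close()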
Mon Mar 25 21:17:00 UTC 2024