
Hafil - AWS Cloud / DevOps Engineer
[email protected]
Location: Austin, Texas, USA
Relocation: YES
Visa: H1-B
Profile Summary
Senior AWS DevOps Engineer with 16+ years of experience in IT, specializing in DevOps processes,
Cloud Computing, automation, real-time data streaming, and project/release management
across agile environments.
Hands-on experience in configuring and administering AWS accounts, roles, and permissions,
enhancing security and compliance across cloud environments.
Expertise in designing, deploying, and managing databases on AWS, including Amazon RDS and
DynamoDB.
Proven ability in performance tuning and optimizing database performance to meet the demands
of cloud-based applications, ensuring security, compliance, and data protection across AWS
environments.
Extensive experience in implementing high availability and disaster recovery strategies,
minimizing downtime and enhancing system reliability.
Hands-on experience configuring and managing AWS cloud infrastructure using tools like Docker,
Kubernetes, Terraform, and Ansible, ensuring scalability, repeatability, and recoverability.
Skilled in CI/CD automation using tools such as Jenkins, GitHub Actions, and Terraform,
streamlining deployments and minimizing errors.
Proficient in Python and shell scripting to develop automation solutions that improve operational
efficiency and system administration.

Strong collaboration and problem-solving skills, with the ability to work independently or in cross-functional teams to deliver end-to-end cloud solutions tailored to business requirements.
In addition to DevOps experience, strong skills in Java and Python, applied to building
automation scripts and data analysis tools that enhance operational efficiency.
Collaborated with clients across various regions (India, Malaysia, UK, US) in sectors like Credit Risk
Decisioning, Life Insurance, Pensions, and Banking.
Technical Skills
Operating Systems: Linux, Windows, z/OS Mainframe
Version Control Tools: Subversion, GitLab, GitHub, and Bitbucket
Build Tools and Application Servers: Apache Maven, Apache Tomcat, WebSphere, Nginx
Continuous Integration Tools: Jenkins, GitLab CI/CD
Configuration & Artifact Management Tools: Ansible, JFrog, AWS CodeArtifact
Bug Tracking Tools: Jira and Confluence
Monitoring Tools: Splunk, CloudWatch; Streaming: Kafka
Scripting Tools: Python, Bash, Groovy, PowerShell, Java
AWS: EC2, S3, IAM, Route 53, RDS, DynamoDB, ECS, Lambda, EKS, CloudFront
Databases and Data Warehousing: PostgreSQL, Apache Spark
Container Technologies: Docker, Kubernetes
Other Tools and Technologies: QFTest, RTC, xFramework, QTP, Selenium, Cucumber, Robot
Framework, TOSCA, Vision Plus, Mainframe, CardLink, COBOL, VSAM, CICS, JCL
Education
Master's degree in Computer Applications from Anna University in 2008
Bachelor's degree in Mathematics from Calicut University in 2005
Muhammad Hafil Abdul Rahiman
Senior AWS DevOps Engineer
LinkedIn: www.linkedin.com/in/hafil-m

Phone: +19452678804
Email: [email protected]

Certification
AWS Certified Solutions Architect – Associate (2024)
Professional Experience
Client: Bank of America (Experian Malaysia, US) Jun 2018 – Present
Role: Senior AWS DevOps Engineer
Developed and maintained CI/CD pipelines using AWS services and Jenkins, automating
infrastructure activities to streamline release cycles and enhance deployment efficiency.
Implemented Infrastructure as Code (IaC) with AWS CloudFormation, Terraform, and CDK,
customizing configurations for consistent, repeatable, and scalable deployments.
Designed and managed containerized infrastructures using Docker and Kubernetes, ensuring
secure, reliable, and scalable deployments across multiple environments.
Designed, deployed, and managed AWS database services, including RDS and DynamoDB,
ensuring high performance, security, and compliance with AWS best practices.
Implemented security best practices to protect data in AWS RDS and other managed database
services, ensuring compliance with industry standards and safeguarding against potential threats.
Engineered high availability and disaster recovery solutions, minimizing downtime and ensuring
business continuity across production and non-production environments.
Provided end-to-end database solutions to meet customer requirements, collaborating with cross-functional teams to ensure optimal design, implementation, and performance of cloud-native database services.
Developed and optimized CI/CD pipelines to streamline the deployment of AWS database
resources, automating updates and fixes across environments to reduce errors and improve
operational efficiency.
Maintained and monitored AWS database performance using CloudWatch, proactively
addressing latency and performance issues to maintain operational health and minimize
downtime.
Built and integrated streaming data pipelines using Kafka to enable real-time data processing and
messaging between microservices, improving system responsiveness.
Created and managed deployment pipelines using AWS Lambda, API Gateway, and DynamoDB,
translating business requirements into effective code and configuration for new applications.
Modernized legacy systems by designing and deploying cloud-native solutions on AWS, leveraging
services such as EC2, S3, RDS, and DynamoDB for high availability and fault tolerance.
Developed and maintained scalable data extraction processes, extracting data from various
databases such as DB2 and MySQL, leveraging Python and SQL to automate and optimize data
retrieval.
Engineered automation solutions in Java and Python to optimize DevOps workflows, enhancing
overall operational efficiency.
Maintained version control systems (Git, Bitbucket, GitHub, SVN), creating build definitions and
scripts to support CI/CD integration throughout the development lifecycle.
Developed alerting systems using Splunk and Slack to monitor job statuses, enabling proactive
detection of potential failures.
Environment: Terraform, Jenkins, Python, Java, AWS, Agile, MySQL, DB2, Kubernetes, Docker, Ansible,
Jira, Confluence, Splunk, Slack, GitLab, Grafana, Kafka

Client: Standard Chartered Bank (Malaysia) Oct 2017 – Apr 2018
Role: DevOps Automation Engineer
Managed SCB's credit card portfolio for APAC on IBM Mainframe z/OS using CardLink, ensuring
high availability and performance.
Automated credit card functions using IntelliJ IDEA, Genie, and Java Cucumber, enhancing
efficiency and reducing manual intervention.
Developed and maintained automation scripts in Java and Python, as well as CI/CD pipelines,
leveraging Jenkins and Ansible for continuous integration and deployment, including provisioning
and maintaining environments on AWS.
Implemented real-time data streaming solutions by integrating Kafka into existing microservices,
improving data flow and enabling efficient communication between distributed components.
Collaborated with cross-functional teams to design and deploy cloud solutions that align with
business objectives, ensuring seamless integration with on-premises systems.
Automated infrastructure activities such as continuous deployment, application server setup, and
stack monitoring using Jenkins.
Developed scalable data extraction pipelines using Python, automating the extraction process
from multiple databases such as DB2 and MySQL, ensuring efficient data handling for credit
card operations.
Deployed and managed Docker containers and Kubernetes clusters to support scalable
infrastructure, ensuring high availability and fault tolerance for critical applications.
Enhanced the DevOps toolchain by integrating Git and JIRA for streamlined code version control,
issue tracking, and security analysis, while maintaining version control systems and build
definitions/scripts.
Created automation scripts in Python that streamline processes and enhance overall operational
efficiency.
Deployed and monitored scalable infrastructure on AWS, utilizing configuration management
tools, and set up data environments by creating customer profiles based on requirements.
Supported SIT and UAT environments, providing configurations and troubleshooting to ensure
successful testing phases.
Environment: Mainframe, CardLink, COBOL, JCL, CICS, DB2, MySQL, Cucumber, Jenkins, AWS, Agile,
Ansible, Jira, Git, IntelliJ IDEA, Java, Python
Client: HSBC (India, Malaysia & UK) Jun 2008 – Aug 2017
Role: Senior Software Engineer, DevOps practitioner
Supported real-time authorizations of credit card transactions with eCafe and eChamps, ensuring
high availability and performance.
Worked on OnDemand, Entitlements, Electronic Document Management, Archive Retrieval, and
Financial Authorization Systems, leveraging Jenkins and Ansible for automation.
Conducted requirement analysis, ensured business requirements were covered in test scripts,
and created functional specs and project estimates using JIRA for tracking.
Integrated automation tools within the Cards system using Vision Plus and CI/CD pipelines,
enhancing overall efficiency.
Executed test cases on RTC using xConsole and automated scripts developed in Java, while
collaborating with stakeholders for progress and quality assurance, documented in Confluence.
Supported SIT and UAT environments, providing configurations and troubleshooting, utilizing
AWS for environment provisioning.
Leveraged Java and Python skills to develop automated testing frameworks that improve
software reliability.

Coordinated a multi-channel insurance project to enhance customer experience and income
growth, implementing DevOps practices for streamlined operations.
Acted as module test lead and project coordinator, managing team tasks in RTC to ensure smooth
project execution and comprehensive test coverage.
Participated in a transition project for Data Quality, CutOver, Migration, and Decommissioning
for HSBC Life (UK), using Terraform for Infrastructure as Code.
Utilized Java and Python expertise to create and execute automated test scripts that enhance
testing efficiency.
Executed functional and regression test cases. Ran daily and monthly batches from JCL Libraries
to ensure premium payments were processed, using Grafana for monitoring.
Performed live data transfers from the production environment to test regions, ensuring all data
was desensitized, using AWS S3 for secure storage.
Environment: Mainframe, VisionPlus, CardLink, COBOL, JCL, CICS, Cucumber, Jenkins, AWS, Agile,
Ansible, Jira, Git, Confluence, Grafana, Terraform, Python, Java
