Nazma H
Guidewire QA
+15178935038
[email protected]
Austin, TX
Remote / Local
H4


I bring over a decade of experience as a certified IT Quality Assurance (QA) Lead and Property & Casualty professional specializing in Commercial Lines of Business. My management skills are proven in leading cross-functional QA teams to meet critical project timelines, and my deep understanding of insurance operations has been pivotal in optimizing policy processing, forms, rating, and scoring procedures.
Technically proficient in Automation, I've employed tools like Selenium, Tricentis TOSCA, and CA DevTest to streamline regression and functional scenarios, enhancing efficiency and precision. Comfortable with both Agile and Waterfall methodologies, I adapt to project demands to ensure success.
I am well-versed in the Commercial Insurance domain and offer excellent project delivery and customer management skills. My ability to train, lead, and motivate teams, combined with my dedication to quality, sets me apart as a valuable asset to any client seeking robust IT and industry expertise.

EDUCATION & CERTIFICATIONS
Bachelor's Degree in Electronics and Communication Engineering, JNTU, India
Certified Specialist in PolicyCenter 10.0 Configuration
ISTQB Certified Tester
Automation Specialist Level 1 Certificate, 2021 (Tricentis TOSCA, skilljar.com)

SUMMARY
Possess over a decade of experience in executing both Manual and Automated testing for a diverse range of Client/Server and Web-Based applications, demonstrating a strong understanding of the testing landscape.
Bring hands-on experience with open-source tools such as Selenium (Selenium IDE, Selenium WebDriver, Selenium Grid), coupled with Java programming, TestNG, Eclipse, Maven, and Jenkins, showcasing technical proficiency in crafting automated test frameworks (an illustrative sketch follows this summary).
Offer a rich track record of 5+ years of onsite experience in Harleysville, Pennsylvania, US, where I've excelled as a QA Lead and onsite coordinator, demonstrating leadership abilities and effective collaboration skills.
Deeply experienced in Property & Casualty (P&C) Insurance, specifically Commercial Lines of Business (Auto, Property, Workers Compensation, General Liability, Commercial Umbrella, and Businessowners), providing a solid industry background.
Command excellent knowledge of various policy administration tools, such as Guidewire PolicyCenter, along with a comprehensive understanding of insurance laws, regulations, and company policies, ensuring full regulatory compliance.
Highly proficient in Guidewire Rating, with extensive experience validating rate books, rate algorithms, and rate capping to ensure accurate insurance premium calculations.
Bring considerable experience in verification, validation (cross-functional testing), and the SDLC process, with expertise in Integration, Smoke, Web Services, Database, and Regression testing, showcasing a broad testing skillset.
Designed and implemented automated Quality Assurance activities, including the development of procedures/standards, process design, checklist creation, and metrics collection, showcasing strategic planning and organization skills.
Display deep knowledge of QA Processes and Methodologies, Quality Metrics, and all aspects of QA Process management and Project delivery, including Planning, Estimation, Control, and Execution, highlighting strong project management abilities.
Proficient in all stages of Non-Functional Testing categories like Data Conversion, Database, Recovery, Usability Testing, Load Testing, Performance, Exploratory, and UAT Testing, ensuring comprehensive product testing.
Skilled in Data Migration testing, Functional and Performance Automation using Tricentis TOSCA, demonstrating a range of testing abilities.
Proficient in Backend Testing and Data Integrity Testing using complex SQL Queries, ensuring data accuracy and consistency.
Experience in REST API/web services testing using Postman and ReadyAPI, confirming seamless integration and functionality of web services.
Led the design and development of the Automation testing Framework, customizing it to suit the customer's landscape using CA DevTest, showcasing adaptability and customer-centricity.
Involved in integrations of Guidewire PolicyCenter with external systems such as Guidewire ClaimCenter and BillingCenter.
Developed and deployed Virtual Services (viz., service stubs) using CA DevTest, providing flexible testing environments.
Proficient in Bug Reporting and Bug Tracking using various Bug Tracking Systems like Bugzilla, Jira, demonstrating an organized and systematic approach to defect management.
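For context, the sketch below illustrates the Selenium WebDriver + TestNG style of automation referenced in this summary. It is a minimal, hypothetical example, not project code: it assumes the selenium-java and testng dependencies are on the classpath, and the URL, locators, and expected title are placeholders.

// Minimal Selenium WebDriver + TestNG sketch; URL, locators, and expected title are placeholders.
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.AfterClass;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Test;

public class LoginSmokeTest {

    private WebDriver driver;

    @BeforeClass
    public void setUp() {
        driver = new ChromeDriver();                    // browser under test
        driver.get("https://example.com/login");        // placeholder application URL
    }

    @Test
    public void userLandsOnDashboardAfterLogin() {
        driver.findElement(By.id("username")).sendKeys("qa_user");   // placeholder locators and data
        driver.findElement(By.id("password")).sendKeys("secret");
        driver.findElement(By.id("signIn")).click();
        Assert.assertTrue(driver.getTitle().contains("Dashboard"));  // expected landing page title
    }

    @AfterClass
    public void tearDown() {
        driver.quit();                                  // release the browser session
    }
}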

TECHNICAL SKILLS
Hardware/Operating Systems: Windows
Programming Languages: Java, JavaScript
Frameworks: TestNG, POM, BDD, Data-Driven Testing Framework
Automation Testing: Tricentis TOSCA, CA DevTest, ReadyAPI, Postman, Selenium IDE, Selenium RC, Selenium WebDriver, Selenium Grid, Maven, Jenkins
Reporting Tools: JIRA, Bugzilla, HP ALM
IDEs/Dev Tools/Editors: Eclipse IDE
Version Control Tools: Git, SVN
Databases: SQL Server 2008, Oracle, Snowflake, Databricks
Browsers: Internet Explorer, Mozilla Firefox, Chrome
SDLC Models: Agile (Scrum), Waterfall





PROFESSIONAL EXPERIENCE
Client: Nationwide Mutual Insurance Company, Columbus, OH
October 2015 - February 2022
Role: QA Lead for Rating, Interfaces and Configuration teams
Environment: Java, Selenium WebDriver, Tricentis TOSCA, CA DevTest, SQL Server 2012, Eclipse, TestNG, Snowflake, Databricks, ReadyAPI, Postman, Jira, Maven, Splunk, GitHub, Jenkins, Agile.

Description: Guidewire PolicyCenter Implementation (Commercial Lines) and State rollouts, Agent Portal Implementation.

I was part of the core implementation team for the following LOBs:
Businessowners package, Workers Compensation, Commercial Auto, Commercial Property, General Liability and Commercial Umbrella

Phases: I was involved in the following activities:
Project Planning: Define the scope and objectives of the QA processes for the Guidewire implementation. Establish clear timelines and deliverables in line with the project's overall schedule.
Test Strategy Development: Develop a comprehensive test strategy that covers all aspects of the Guidewire system, including the PolicyCenter, Rating System, APIs, Downstream systems and Guidewire portal. The strategy should outline the types of testing to be conducted (functional, integration, performance, etc.) and the tools and methodologies to be used.
Test Case Design: Design detailed test cases that cover all the necessary commercial lines insurance workflows and scenarios in the Guidewire system. This could include Policy Submissions, Changes, Renewals, Cancellations, Reinstatements, and Rewrites.
Test Execution: Oversee the execution of the test cases, ensuring they are run in a systematic and controlled manner. This involves managing the testing team and ensuring they have the resources and support they need.
Defect Management: Manage the defect lifecycle from discovery to resolution. This involves tracking defects, prioritizing them based on severity and impact, and liaising with the development team to ensure they are addressed in a timely manner.
Risk Management: Identify potential risks that could impact the quality of the Guidewire implementation. Develop mitigation strategies to address these risks and communicate them to the relevant stakeholders.
Stakeholder Communication: Maintain open and regular communication with all project stakeholders, including the client, project manager, development team, and others. This involves providing regular updates on the progress of the QA activities and addressing any issues or concerns that arise.
Quality Assurance: Ensure that the Guidewire system meets the required quality standards and aligns with the needs of the business. This involves verifying that the system functions correctly, is user-friendly, and is reliable under different conditions.
Performance Testing: Develop a comprehensive test plan that outlines the scope, strategy, and schedule of the Guidewire Rating performance testing activities. The plan should align with the overall project timeline and take into consideration any potential risk areas.
Process Improvement: Continually review and improve the QA processes, incorporating lessons learned from each test cycle. This can help increase the efficiency and effectiveness of the testing activities and contribute to the overall success of the project.
Team Leadership and Development: Lead and mentor the QA team, fostering a positive and collaborative working environment. This includes providing ongoing feedback and training to team members to enhance their testing skills and knowledge.

Responsibilities:
Devised a strategic approach for functional automation validation of Guidewire PolicyCenter workflows, including Policy Submissions, Changes, Renewals, Cancellations, Reinstatements, Audits and Rewrites.
Collaborated closely with Guidewire PolicyCenter 8/9/10.x system, Integration & BSA teams on detailed requirement specifications and process flowcharts, contributing to key design decisions.
Ensured the alignment and signoff of Functional Specification requirements from BSA & Integration team stakeholders, facilitating efficient project progression.
Streamlined rating algorithms and rate tables using TCS's in-house tool, Rate Assure, enhancing the efficiency of rating processes.
Defined and finalized the project automation scope following a comprehensive functional and technical analysis of the Guidewire product.
Engineered an automation testing framework using Selenium WebDriver with Java, bolstering the validation process of new and existing component releases.
Developed sophisticated automation scenarios focused on data-driven testing, constructing a hybrid framework from scratch using the Page Object Model design pattern, Page Factory, and TestNG (a framework sketch follows this responsibilities list).
Played a vital role in setting up a test environment for automated script execution using Selenium WebDriver, Maven, and TestNG.
Verified the interaction between Guidewire PolicyCenter and Integration layer services (IIB), ensuring seamless integration.
Leveraged the SPLUNK tool for application log search, monitoring, and analysis, supporting effective application troubleshooting.
Virtualized integration services using the CA DevTest workstation application to streamline testing processes.
Developed and executed integration testing scripts in Tricentis TOSCA, reinforcing the quality of integration processes.
Conducted Manual Testing of RESTful APIs using the Postman tool to ensure the correctness of API responses.
Automated RESTful APIs using Java and the REST Assured framework, validating JSON responses for methods like GET, POST, PUT, and DELETE (an API-test sketch follows this list).
Designed a comprehensive validation strategy for data extraction, transformations, and loading processing in an enterprise-wide ETL Implementation.
Established procedures for validating transformation and cleansing logic when loading data from Databricks source tables to Snowflake target tables, enhancing data integrity (a reconciliation sketch follows this list).
Actively participated in Sprint cycles, Sprint Planning, and Daily Stand Ups (Agile/Scrum) with PM, Dev, and QA team members, facilitating effective cross-team collaboration.
Led various reviews, root cause analyses, and automation estimation, planning, tracking, and coordination activities with various stakeholders, providing timely status reporting for ongoing releases.
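As a rough illustration of the Page Object Model / Page Factory structure mentioned above, the page class below is a hypothetical sketch rather than the actual framework; the page name, locators, and method are assumptions.

// Hypothetical Page Object Model class using PageFactory; page name and locators are placeholders.
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.support.FindBy;
import org.openqa.selenium.support.PageFactory;

public class PolicySubmissionPage {

    @FindBy(id = "effectiveDate")           // placeholder locator
    private WebElement effectiveDateField;

    @FindBy(id = "quoteButton")             // placeholder locator
    private WebElement quoteButton;

    public PolicySubmissionPage(WebDriver driver) {
        PageFactory.initElements(driver, this);   // binds the @FindBy fields to live elements
    }

    public void quoteWithEffectiveDate(String date) {
        effectiveDateField.clear();
        effectiveDateField.sendKeys(date);
        quoteButton.click();
    }
}

Data-driven scenarios then feed page objects like this one through TestNG @DataProvider methods, keeping locators out of the test logic.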
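The REST Assured style of API validation mentioned above typically looks like the sketch below; the base URI, resource path, and JSON field are hypothetical, and the rest-assured, hamcrest, and testng libraries are assumed to be on the classpath.

// Sketch of a REST Assured check on a GET endpoint; URI, path, and JSON field are placeholders.
import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.equalTo;

import org.testng.annotations.Test;

public class PolicyApiTest {

    @Test
    public void getPolicyReturnsExpectedPayload() {
        given()
            .baseUri("https://api.example.com")          // placeholder base URI
            .header("Accept", "application/json")
        .when()
            .get("/policies/12345")                      // placeholder resource path
        .then()
            .statusCode(200)                             // HTTP status assertion
            .body("policyNumber", equalTo("12345"));     // JSON field assertion
    }
}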
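One common building block of the Databricks-to-Snowflake validation described above is a simple row-count reconciliation. The sketch below is illustrative only: it assumes the vendor JDBC drivers are on the classpath, and the connection URLs, credentials, and table names are placeholders.

// Source-vs-target row-count reconciliation over JDBC; URLs, credentials, and tables are placeholders.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class RowCountReconciliation {

    private static long countRows(String jdbcUrl, String user, String password, String table) throws Exception {
        try (Connection conn = DriverManager.getConnection(jdbcUrl, user, password);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM " + table)) {
            rs.next();
            return rs.getLong(1);
        }
    }

    public static void main(String[] args) throws Exception {
        long source = countRows("jdbc:databricks://<host>", "user", "pwd", "src_schema.policy");   // placeholder source
        long target = countRows("jdbc:snowflake://<account>", "user", "pwd", "tgt_schema.policy"); // placeholder target
        System.out.println(source == target
                ? "PASS: row counts match (" + source + ")"
                : "FAIL: source=" + source + ", target=" + target);
    }
}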

Client: Moody's Analytics, USA
October 2014-October 2015
Role: Offshore QA Analyst, India (TCS - Tata Consultancy Services)
Environment: LoadRunner, Java, SQL Server 2012, Eclipse, SOAP UI, SVN, Agile, Rally.
Description: The Risk Origins software incorporates Moody's Analytics' industry-leading risk assessment capabilities and helps lenders demonstrate compliance to regulators.
Responsibilities:
Conducted rigorous Load Testing on a Staging Environment, mirroring the Production Environment, to ensure optimal system performance under high traffic conditions.
Employed Reliability Testing strategies using Average Load over extended durations with the Controller tool, providing a thorough understanding of system robustness.
Utilized the LR-Controller for Load Testing using Peak Load over shorter durations, ensuring the system's ability to handle sudden spikes in user activity.
Took the lead in analyzing application and component behavior under heavy loads, optimizing server configurations to maintain performance standards.
Collaborated closely with developers during testing, identifying and rectifying memory leaks, bugs, and optimizing server settings at web, app, and database levels.
Crafted comprehensive System and Functional Test Plans and Test Case Scenario Documents, ensuring coverage of all critical functionalities.
Analyzed test cases and contributed to the preparation of test data based on Test bed criteria.
Developed detailed test cases specifically designed for performance testing.
Conducted SOA/Web Service testing using SOAP UI, ensuring seamless integration and functionality of web services.
Oversaw Peer Reviews of essential documents like Test Plans and Test Cases written by Team Lead & Team members, guaranteeing the highest quality standards.
Managed data FTP across various platforms and applications, ensuring data availability and integrity.
Authored efficient SQL queries for Data Reporting, facilitating accurate data retrieval.
Analyzed the results of the performance tests to identify bottlenecks or issues that could impact the system's performance under different loads, working closely with the development team to understand the underlying architecture and identify areas for optimization.
Led the bug reporting process, actively participating in every phase of the Bug Life Cycle.
Engaged in proactive discussions with Developers and Analysts regarding Issues and New Functionality, promoting a collaborative approach to problem-solving.
Prepared detailed Final Test reports, including test steps and execution snapshots, providing a clear record of testing activities.
Promptly escalated Critical problems/issues identified during Testing and reported them in the Defect Tracking tool.
Interacted regularly with developers to report and track bugs, ensuring timely resolution and minimal impact on project timelines.
Compiled and prepared final consolidated test reports for management review, presenting a comprehensive picture of testing outcomes and areas for improvement.

Client: RAPDRP, India
March 2012-August 2014
Role: IT Analyst (TCS - Tata Consultancy Services)
Environment: Oracle CCB, Selenium WebDriver, SVN, Rally, JMeter, LoadRunner 11.5, SOAP UI
Description: The project is aimed at customization of the Customer Care & Billing application using Oracle CCB. The project consists of various modules, such as Web Self Service (WSS), Geographical Information System (GIS), Energy Audit (EA), CCB (Customer Care & Billing), CCC (Centralized Customer Care), and Meter Data Acquisition System (MDAS), involving various technologies.
Responsibilities:
Transformed manual test cases into automated test scripts utilizing Selenium WebDriver in Eclipse with Java, significantly enhancing testing efficiency.
Entrusted with the task of updating and maintaining existing Selenium scripts, ensuring continued relevance and accuracy.
Set up a robust test environment for the execution of automated scripts using Selenium WebDriver, facilitating seamless and accurate testing.
Created a comprehensive Requirement Traceability Matrix, test scenarios, and test cases for functional cases, providing clear testing guidelines and ensuring complete coverage.
Leveraged Selenium Grid to execute test cases across multiple operating systems, ensuring software compatibility and robustness (a Grid execution sketch follows this list).
Engaged in Rest web services testing using the SOAP UI tool, validating the integration and functionality of web services.
Participated in maintaining developed code/software in Tortoise SVN, ensuring code security and version control.
Contributed to the development of a Proof of Concept (POC) using LoadRunner 11.5 and Apache JMeter, demonstrating the feasibility of performance testing solutions.
Led performance testing of the Web Self Service module using Apache JMeter, identifying and addressing performance bottlenecks.
Took charge of Performance Tuning for identified performance application issues, optimizing system performance.
Configured UI Pages using ORACLE CCB product for Billing and Collection functionalities, enhancing user experience and operational efficiency.
Conducted security testing for applications using SOA, ensuring the integrity and confidentiality of application data.
Collaborated with business decision makers to define and code business and system validation rules, aligning system performance with business needs.
Actively involved in the Change Request (CR) process and enhancements outside the design document, promoting continuous improvement.
Participated in Defect Analysis and prepared detailed DPCA reports, identifying key areas of software quality improvement.
Regularly participated in Daily status calls, Weekly status calls, Reviews, Walk-through meetings, and Regression and TPR defect and status calls, promoting effective communication and teamwork.
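Cross-OS execution on Selenium Grid, as noted above, is typically driven through RemoteWebDriver along these lines; the hub and application URLs are placeholders and the snippet is an illustrative sketch rather than project code.

// Running a browser session against a Selenium Grid hub; hub and application URLs are placeholders.
import java.net.URL;

import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeOptions;
import org.openqa.selenium.remote.RemoteWebDriver;

public class GridSmokeTest {

    public static void main(String[] args) throws Exception {
        ChromeOptions options = new ChromeOptions();                 // requested browser for the node
        WebDriver driver = new RemoteWebDriver(
                new URL("http://grid-hub.example.com:4444/wd/hub"),  // placeholder hub address
                options);
        try {
            driver.get("https://example.com");                       // placeholder application URL
            System.out.println("Title: " + driver.getTitle());
        } finally {
            driver.quit();                                           // release the remote session
        }
    }
}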