
SUDHEER GANGAVARAPU
Lead Performance Tester
Phone: 313-638-0513
Email: [email protected]
Location: Columbus, Ohio, USA
Relocation: Yes
Visa: H1B
LinkedIn: https://www.linkedin.com/in/sudheer-gangavarapu-99484a240


Summary

13+ years of experience in software engineering, delivering engineered solutions to businesses in multiple domains: Banking, Media, Retail, and Travel & Loyalty. Expertise in program and project management, driving the core team to deliver the best-engineered solution using Agile methodology and best practices, in alignment with the business vision and helping customers meet their business goals.
Strong leadership skills in software testing, including involvement in QA activities throughout the SDLC: requirement understanding and analysis, test planning, test case design and execution, defect tracking and review, and causal analysis of the test reports in every cycle.
Involved in the end-to-end Performance Testing life cycle: requirements gathering, planning, test strategy, proof of concept, scripting, test execution, analysis, and documentation, through sign-off decisions and performance tuning recommendations as applicable. Able to summarize and communicate actionable performance test results to both technical and business stakeholders.
Very good at building strong relationships between IT Solutions teams and related business teams; maintains professional relationships with peers in other corporations and outside organizations.
Proficient in using the ELK stack for centralized logging, real-time analysis, and performance monitoring
Knowledge of Groovy scripting within JMeter for customizing test scripts, enhancing automation, and handling complex test scenarios
Ability to provide technical and strategic solutions, collaborate with Project Manager, Business Users, Developers, and Testers to deliver solutions
Owned responsibilities for every phase of Application Performance Testing from Requirement Gathering, Analysis, Feasibility Study, Estimations, Scripting, Execution, Bottleneck Analysis, and Reporting.
Having good working experience in large-scale Enterprises and complex applications on different platforms like Mobile, Web & Cloud.
Extensive experience in gathering requirements from Server Logs, Test Plan preparation, Reporting & Closure preparation.
Proficient in using BlazeMeter for cloud-based performance testing, enabling large-scale load testing and reporting
Develop and maintain automated test scripts using Selenium for browser-based testing, ensuring consistent and reliable test execution across different environments
Involved in testing web services, such as SOAP/JSON and microservices.
Expertise in LoadRunner Web (UI), Mobile & API scripting
Expert in Workload modeling & Test Strategy preparation
Expert in Correlation, Parameterization & Custom request creation for different technologies
Hands-on experience in Legacy applications scripting (Ajax) in LoadRunner
Well experienced in Load, Endurance, Stress, and Baseline testing
Expertise in identifying Bottlenecks in different layers and working with respective teams to fix the issues
Utilize JMeter 4.0 (and above) for performance testing
Extensive knowledge in Log analysis using Splunk
Hands-on experience with Selenium for automating browser-based testing, ensuring efficient and repeatable test execution
Automate test execution and result collection using Python-based automation frameworks
Good experience in APM tools Dynatrace, DATADOG, App Dynamics
Expertise in JMeter, Grafana, Neoload, and Silk Performer
Expertise in writing BeanShell and Java scripts
Manage Docker environments, including creating and maintaining containers, images, and orchestration, to support continuous deployment and microservices architectures
Experience in using SolarWinds DPA for monitoring and optimizing database performance, identifying slow queries, and troubleshooting database bottlenecks
Good knowledge of Kafka, and Jenkins with Performance tools integration and Test execution
Experienced and certified in resilience and chaos engineering.
Expertise in creating Test data using scripting languages, SQL queries, and Excel.
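As an illustration of the test-data creation described above, below is a minimal Python sketch that generates repeatable credential records and serializes them to CSV, the format tools like JMeter's CSV Data Set Config consume. The field names (userId, password) are hypothetical placeholders, not taken from any specific project.

```python
import csv
import io
import random
import string

def generate_test_users(count, seed=42):
    """Generate synthetic user records for data-driven load test scripts.

    Field names are illustrative; a real run would match the parameters
    the JMeter/LoadRunner script expects.
    """
    rng = random.Random(seed)  # fixed seed so test data is repeatable
    rows = []
    for i in range(count):
        user_id = f"user{i:05d}"
        password = "".join(rng.choices(string.ascii_letters + string.digits, k=10))
        rows.append({"userId": user_id, "password": password})
    return rows

def write_csv(rows):
    """Serialize the records to CSV text for a data file."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["userId", "password"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```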


Technical Skills:

Performance Testing: LoadRunner, JMeter, BlazeMeter, Neoload, Grafana, Silk Performer
Languages: C, Java
Network Protocols: HTTP, HTTPS, FTP, TCP, Web (UI & API)
Test Management: Jira, QC
Scripting Languages: JavaScript, BeanShell
Monitoring Tools: Dynatrace, Datadog, AppDynamics, Perfmon
SOA: SoapUI, Postman, Swagger
Sniffing Tools: Fiddler, Developer Toolbar
Log Analysis: Splunk
Version Control: Git
Continuous Integration: Jenkins
Operating Systems: Windows, Unix
Servers: IIS, Tomcat, WebLogic, JBoss, WebSphere
Frameworks: J2EE, .NET
Databases: SQL Server, Oracle, MySQL
Cloud: AWS, Azure



EDUCATION

Master of Computer Applications (Software System) in 2009 from Periyar University.


Work Experience

Client: Evernorth (Cigna), CT    June 2023 - Till Date
Performance Test Lead

Responsibilities:
Gathering and analyzing the NFRs and preparing the test plan based on NFRs
Setup the performance test environment in the cloud
Create scripts for identified business scenarios and run smoke tests to validate the scripts and environment.
Execute scripts on a daily, weekly, and monthly basis and compare the results with benchmark results.
Executed resilience testing by limiting resources to observe performance under constrained resources and measure recovery time.
Identify and share the performance issues with respective teams to fix the issue and run the test again to check the fix.
Experience in integrating BlazeMeter with JMeter scripts for scalable and distributed performance testing
Designed and executed test scripts for APIs using Apache JMeter
Designing new scripts using Neoload and validating the scripts for the smoke test
Enhance the test scripts with parameterization using variables in Neoload
Correlate dynamic values using variable extractors in Neoload advanced parameters
Writing JavaScript when a Neoload script is required to save variables to external files or perform other customizations.
Create and maintain JMeter test scripts, including the integration of Groovy scripts for complex test scenarios and dynamic data handling
Design the test scenario by creating populations and load variations policy in Neoload
Executing test scripts in Neoload by assigning cloud load generators
Handling Ad hoc requests raised by other teams related to prod issues.
Used Dynatrace and Datadog for monitoring and capturing OS-level metrics.
Verify the error logs using Splunk after every load test execution
Identifying, analyzing, and documenting all the bugs observed during testing, and analyzing the issues with the Dev POC
Use JMeter and BlazeMeter for scripting and test execution
Build reports for trending analysis.
Use Python to develop scripts that automate performance testing tasks and processes
Performed the analysis and published the Performance Test Results that included System Metrics, Response Time Metrics, and any Tuning recommended.
Analyzed databases/stored procedures and provided tuning recommendations to the DB team, which brought down response times and CPU utilization and improved overall system performance
Analyze, understand, and communicate technical data, specifications, designs, etc.
Train offshore teams to work on performance problems, identify code-level performance concerns, and communicate with the team
Proficient in performance testing and tuning Kafka messaging systems to ensure high throughput and low latency
Follow up through the entire process to ensure the defect workflow is followed and the performance recommendations are implemented
Utilize Python for automating test cases, data processing, and integrating with various testing tools and frameworks
Analyze and understand Production support and Production concerns in the test environment.
Identify and escalate issues and risks as appropriate.
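The result-collection automation mentioned above can be sketched in Python: a small parser that summarizes a JMeter JTL (CSV) results file into per-label average and 90th-percentile response times. The column names match JMeter's default JTL output; the nearest-rank percentile method is one common choice, not a claim about any specific project's reporting.

```python
import csv
import io
import math

def summarize_jtl(jtl_text):
    """Summarize JMeter JTL (CSV) results: per-label sample count,
    average response time, and nearest-rank 90th percentile."""
    by_label = {}
    for row in csv.DictReader(io.StringIO(jtl_text)):
        by_label.setdefault(row["label"], []).append(int(row["elapsed"]))
    summary = {}
    for label, times in sorted(by_label.items()):
        times.sort()
        # nearest-rank percentile: the smallest value covering 90% of samples
        p90 = times[max(0, math.ceil(0.9 * len(times)) - 1)]
        summary[label] = {
            "count": len(times),
            "avg_ms": sum(times) / len(times),
            "p90_ms": p90,
        }
    return summary
```

A script like this can run after each daily/weekly execution and feed trend reports or benchmark comparisons.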

Test Environment: Neoload, JMeter, Groovy, Java, J2EE, Dynatrace, Kafka, Datadog, Splunk, BlazeMeter, HTTP/HTML, Kubernetes, AWS, and Jira.


Client: Elevance Health, IN    Dec 2022 - June 2023
Performance Test Lead

Responsibilities:
Gathering the requirements and preparing the test strategy based on NFR
Create scripts for identified business scenarios and run smoke tests to validate the scripts and environment.
Execute scripts on a daily, weekly, and monthly basis and compare the results with benchmark results.
Identify and share the performance issues with respective teams to fix the issue and run the test again to check the fix.
Designing new scripts using LoadRunner and validating them via smoke tests
Ability to analyze BlazeMeter test results, including performance metrics and bottlenecks, to provide actionable insights
Enhance the test scripts with parameterization using the LoadRunner TruClient protocol.
Used Apache Jmeter for creating test scripts for UI and APIs
Handling Ad hoc requests raised by other teams related to prod issues.
Used Dynatrace and Datadog for monitoring and capturing OS-level metrics.
Verify the error logs using Splunk after every load test execution
Identifying, analyzing, and documenting all the bugs observed during testing, and analyzing the issues with the Dev POC
Experience in designing and executing test scenarios for Kafka-based architectures, simulating various producer and consumer loads
Build reports for trending analysis.
Performed the analysis and published the Performance Test Results that included System Metrics, Response Time Metrics, and any Tuning recommended.
Analyzed databases/stored procedures and provided tuning recommendations to the DB team, which brought down response times and CPU utilization and improved overall system performance
Analyze, understand, and communicate technical data, specifications, designs, etc.
Train offshore teams to work on performance problems, identify code-level performance concerns, and communicate with the team
Integrate Python scripts with performance testing tools and frameworks to streamline workflows
Follow up through the entire process to ensure the defect workflow is followed and the performance recommendations are implemented
Analyze and understand Production support and Production concerns in the test environment.
Identify and escalate issues and risks as appropriate.
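Kafka load scenarios like those above are usually judged on producer throughput and end-to-end latency. Below is a minimal, library-free Python sketch of that post-test computation over recorded (produced_ms, consumed_ms) timestamp pairs; the data shape is illustrative, not from any specific engagement.

```python
def kafka_load_metrics(events):
    """Compute producer throughput (msgs/sec) and end-to-end latency
    stats from (produced_ms, consumed_ms) timestamp pairs, as one might
    record during a Kafka producer/consumer load test."""
    if not events:
        return {"throughput_msgs_per_sec": 0.0, "p50_latency_ms": None}
    latencies = sorted(consumed - produced for produced, consumed in events)
    first = min(produced for produced, _ in events)
    last = max(produced for produced, _ in events)
    # guard against a zero-length window when all messages share a timestamp
    duration_s = max((last - first) / 1000.0, 0.001)
    return {
        "throughput_msgs_per_sec": len(events) / duration_s,
        "p50_latency_ms": latencies[len(latencies) // 2],
        "max_latency_ms": latencies[-1],
    }
```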

Test Environment: LoadRunner, Apache JMeter, CA Wily, Grafana, BlazeMeter, Kubernetes, Splunk, Azure, Kafka, Jira.


Client: Selective Insurance, Glastonbury, CT    Sep 2021 - Nov 2022
Performance Test Lead

Responsibilities:
Gathering the requirements and preparing the test strategy based on NFR
Create scripts for identified business scenarios and run smoke tests to validate the scripts and environment.
Execute scripts on a daily, weekly, and monthly basis and compare the results with benchmark results.
Identify and share the performance issues with respective teams to fix the issue and run the test again to check the fix.
Designing new scripts using Neoload and adding them to daily health check runs.
Designing WLM based on peak hour prod requests count and response timings.
Handling Ad hoc requests raised by other teams related to prod issues.
Used the Myxalytics tool for monitoring and capturing OS-level metrics.
Configure the JIRA workflow for the project for improvement processes for screens, workflow procedures, and reports of applications as per business requirements.
Identifying, analyzing, and documenting all the bugs observed during testing and creating a ticket in Jira.
Responsible for developing the scripts to support Jenkins (Continuous Integration) of the scripts with the build server.
Build reports for trending analysis.
Performed the analysis and published the Performance Test Results that included System Metrics, Response Time Metrics, and any Tuning recommended.
Analyzed databases/stored procedures and provided tuning recommendations to the DB team, which brought down response times and CPU utilization and improved overall system performance
Analyze, understand, and communicate technical data, specifications, designs, etc.
Train offshore teams to work on performance problems, identify code-level performance concerns, and communicate with the team
Follow up through the entire process to ensure the defect workflow is followed and the performance recommendations are implemented
Analyze and understand Production support and Production concerns in the test environment.
Identify and escalate issues and risks as appropriate.
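Workload modeling (WLM) from peak-hour production counts, as mentioned above, commonly uses Little's Law: concurrent users = throughput x (response time + think time). A hedged Python sketch with illustrative transaction names and timings:

```python
import math

def workload_model(peak_hour_counts, avg_response_s, think_time_s):
    """Derive a target TPS and virtual-user count per transaction from
    peak-hour production request counts, via Little's Law:
    users = throughput * (response time + think time).
    Inputs are illustrative assumptions, not production figures."""
    model = {}
    for txn, hourly_count in peak_hour_counts.items():
        tps = hourly_count / 3600.0
        users = math.ceil(tps * (avg_response_s + think_time_s))
        model[txn] = {"target_tps": round(tps, 2), "vusers": users}
    return model
```

For example, 36,000 Login requests in the peak hour with a 2 s response time and 10 s think time implies 10 TPS and roughly 120 virtual users.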

Environment: Pega, Angular, SOAPUI, Postman, Jenkins, Neoload, JMeter, Myxalytics, HTTP/HTML and Windows.


Client: State Auto Insurance, Columbus, OH    Feb 2020 - Aug 2021
Performance Test Lead

Responsibilities:
Preparing Performance Test strategy and workload model for Client UI vendor portal and Policy center.
Worked on UI and API performance testing of vendor UI portal and integration services.
Creating the scripts for identified business scenarios and running the smoke tests to validate the scripts and env
Identify and eliminate performance bottlenecks early in the development lifecycle
Monitor the performance metrics using Appdynamics and Splunk
Creating and sharing across daily, weekly, and version release performance test reports
Designed and executed the web UI scripts using the Silk Performer tool.
Designed and executed the API scripts using the API tool.
Implemented distributed load testing using JMeter's master/slave setup.
Implemented mobile performance testing using SeeTest Cloud with JMeter.
Attend status meetings with the project teams to ensure smooth execution of the project and send out status reports to the stakeholders
Worked with HP for licensing requests and issues
Used AppDynamics for monitoring the application servers under test
Suggest multiple enhancement areas for the application under test to improve performance.
Contributed to meeting the performance goals of the application
Present the Performance Test results with any recommendations or enhancements
Manage the submittal of the performance defects through the proper Defect Life Cycle for proper escalation
Oversee and escalate any issues or risks that may arise during the Performance Test Life Cycle
Involved in testing web services, such as SOAP/JSON and microservices.

Environment: Guidewire, Angular, Silk Performer, JMeter, Web services, AppDynamics, and Splunk


Client: State Auto Insurance, Columbus, OH (offshore)    Jan 2019 - Jan 2020
Performance Test Engineer

Responsibilities:
Creating performance test scripts using the Silk Performer tool.
Identify and review performance test conditions.
Preparation of Performance Test strategy and Test plan
Test Design and Test data preparation for Performance scripts
Executing Test scripts to validate the performance of Transactions
Identifying Bottlenecks and providing recommendations
Identify the Performance Issues and Log defects in TFS and JIRA based on projects.
Performed multiple executions to help identify issues and provide analysis for the AppDev and DevOps teams.
Re-Execution to close the defect.
Generate the Performance Test Summary report with Client-side and Server-side metrics.
Report status and risks/issues to leads.
Manage and mentor the team
Developed the Test Scenarios by translating the business models into scenarios
Analyze the test data requirements and create test data that is based on the test data requirements
Responsible for creating and maintaining test scripts
Conduct Load/Stress and Endurance tests for multiple iterations on each release and verify that test results meet the success criteria
Monitor the execution using the monitoring tool called FogLight.
Gather the monitoring data and analyze the test results to identify performance bottlenecks
Coordinate with the respective teams to resolve the issues on time.
Provide daily and weekly status reports to the stakeholders
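Checking test results against success criteria, as above, can be automated as a simple threshold comparison. This Python sketch assumes a simplified metrics dictionary; the metric and threshold names (p90_ms, max_error_rate) are hypothetical, not from any specific NFR document.

```python
def check_success_criteria(results, nfrs):
    """Compare measured per-transaction metrics against NFR thresholds
    and return a list of human-readable failures (empty list = pass).
    Metric names here are illustrative."""
    failures = []
    for txn, nfr in nfrs.items():
        measured = results.get(txn)
        if measured is None:
            failures.append(f"{txn}: no results collected")
            continue
        if measured["p90_ms"] > nfr["max_p90_ms"]:
            failures.append(f"{txn}: p90 {measured['p90_ms']}ms exceeds {nfr['max_p90_ms']}ms")
        if measured["error_rate"] > nfr["max_error_rate"]:
            failures.append(f"{txn}: error rate {measured['error_rate']:.2%} over limit")
    return failures
```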


Environment: Guidewire, Angular, Silk Performer, JMeter, AppDynamics, and Splunk

Client: CNA Corp, Chicago, IL (offshore)    Oct 2015 - Dec 2018
Performance Test Engineer

Responsibilities:
Involved in preparing the test plan with reference to NFRs
Design the test scripts with mandatory enhancements for all in-scope scenarios in LoadRunner
Designing load test scenarios according to the tests mentioned in the test plan in ALM.
Involved in monitoring the metrics using Dynatrace.
Involved in analyzing the results and pinpointing the issues by digging across various servers and providing the proper recommendation to the concerned developers.
Involved in preparing detailed test reports with recommendations.
Provide daily and weekly status reports to the stakeholders.
Revising test scripts as needed to meet new and changing specifications
Defined the test criteria and project schedules, and baselined the Test Plan with the help of project meetings and walkthroughs.
Ran SQL queries for database testing for the verification of results retrieved.
Documented the errors using Quality Center and tracked them to closure by communicating and coordinating with the development team.
Identify performance bottlenecks and work with the respective teams to resolve the issues on time.


Environment: Java, J2EE, SQL Server, Ajax, LoadRunner, SoapUI, REST API, Perfmon, and Linux.

Client: Transcontinental foundation - TC Media, Canada (offshore)    Nov 2013 - Nov 2015
Performance Test Engineer

Responsibilities:
Create JMeter Scripts Based on Test conditions.
Execution of the performance test
Monitor the performance of servers
Gather instrumentation data to Analyze the bottlenecks
Test report creation and sign-off decisions; implementing recommendations for the release
Worked on SOAPUI for web services testing which involved testing web services in both SOAP and REST.
Performed complex queries using SQL involving various joins for database testing and documented the obtained results.
Involved in XML data validation for the inputs and output for data transmission purposes and testing of web applications.
Generated and automated the generation of daily, weekly, and quarterly status reports.
Generated defect status reports, QA analysis reports, risk analysis documents, requirements traceability reports, and test result summary reports.
Used GitHub as a source repository to share the code with both the on-site and the offshore teams.
Tools such as Rally have been used for task tracking.
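The automated daily/weekly/quarterly status reports mentioned above can be rendered from result summaries with a small script. This Python sketch uses illustrative field names (p90_ms, sla_ms) and a plain-text layout; it is a sketch of the approach, not the actual reporting tool used.

```python
def build_status_report(period, summaries):
    """Render a plain-text status report from per-transaction result
    summaries, comparing each p90 against its SLA. Field names are
    illustrative assumptions."""
    lines = [f"Performance Status Report ({period})", "-" * 40]
    for txn, s in sorted(summaries.items()):
        verdict = "PASS" if s["p90_ms"] <= s["sla_ms"] else "FAIL"
        lines.append(f"{txn:<20} p90={s['p90_ms']}ms sla={s['sla_ms']}ms {verdict}")
    return "\n".join(lines)
```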

Environment: Java, SQL Server, JMeter, J2EE, SoapUI, REST API, Ant report generation.

Client: AXA Asia, Hong Kong (offshore)    Nov 2010 - Oct 2013
Role: Performance Test Engineer

Responsibilities:
Involved in preparing the test plan with reference to NFRs
Design the test scripts with mandatory enhancements for all the scenarios that are in scope in Jmeter
Designing load test scenarios according to the tests mentioned in the test plan in Jmeter.
Involved in monitoring the metrics.
Involved in analyzing the results and pinpointing the issues by digging across various servers and providing the proper recommendation to the concerned developers.
Involved in preparing detailed test reports with recommendations.
Manage and mentor the team
Analyze the business and user requirements and create test plans, test specifications, and test procedures for performance testing.
Developed the Test Scenarios by translating the business models into scenarios
Analyze the test data requirements and create test data that is based on the test data requirements.
Responsible for creating and maintaining test scripts


Environment: Pega, Oracle, Jmeter, Connect Direct MQ, CA Wily and Quality Center