Shirisha - Software Quality Analyst
[email protected]
Location: Ellicott City, Maryland, USA
Relocation:
Visa:
Shirisha Donthineni | Cell: (469) 747-7644 | Morrisville, NC 27560 | [email protected]
Summary:
Over seven years of professional experience in Information Technology, with extensive skills in software analysis and testing across all phases of the development life cycle, including requirements gathering, project planning, scheduling, testing, defect tracking, management, and reporting. Knowledge of the Software Development Life Cycle (SDLC), including the Requirements, Design, Development, System Testing, Configuration, Installation, Release Authorization, and Maintenance phases. Experienced in writing test strategies, test scenarios, test plans, test cases, and bug reports, and in developing test scripts using test automation tools.

Skills: JIRA, Bugzilla, Confluence, LogRocket, Datadog, Okta, Docker, AWS Cloud, Test Director, BrowserStack, LaunchDarkly, Selenium, and QTP

Experience:

Organization: JobNimbus, Lehi, 05/2021 - 07/2023
Role: Software Quality Analyst
Environment: JIRA, Confluence, Manual/Automation, Postman, and Figma
Responsibilities:
- Actively participated in detailed design reviews, test plan reviews, defect reviews, and ideation and refinement meetings
- Logged defects into the bug tracking system and provided the necessary defect reports
- Coordinated the acceptance test process, documenting and addressing all issues raised during the process
- Interacted with the development and operations teams to ensure the software met user expectations
- Worked with the Helpdesk team to resolve issues raised by users and logged bugs accordingly
- Conducted end-user trainings
- Prepared the test traceability matrix to confirm test coverage and mapped it to test cases

Organization: WCF Insurance, Sandy, 03/2020 - 02/2021
Role: Quality Assurance Engineer
Environment: AQT, JIRA, Confluence, Squish, Visual Paradigm, Manual and Automation
Responsibilities:
- Wrote test cases/scripts in JIRA based on the business requirements captured in Visual Paradigm
- Obtained a sign-off sheet successfully signed by all stakeholders
- Monitored error and scheduled-maintenance logs to track job failures
- Created pages in Confluence to store all project-related information
- Performed re-testing, regression, sanity, smoke, functionality, PKI, and security vulnerability testing

Organization: OODA Health, Salt Lake City, UT, 07/2018 - 09/2019
Role: Quality Control Engineer
Environment: JIRA, BrowserStack, MySQL, Confluence, Docker containers, GitLab, AWS Cloud, Okta, SSO, VB, Java, Manual, Mobile, and Automation
Responsibilities:
- Worked closely with developers, architects, product owners, and service center teams to establish effective test cases based on business requirements
- Wrote test cases and ensured they were complete and traceable to requirements and design
- Participated in the deployment process of the build
- Completed tasks within the allotted time and ensured that the technical and functional objectives were met
- Performed manual testing and mobile testing, and automated test frameworks using Selenium
- Tested the application using BrowserStack for mobile testing across on-demand browsers, operating systems, and real mobile devices, without requiring users to install or maintain an internal lab of virtual machines

Organization: Micro Focus, 02/2013 - 07/2018
Role: Technical Support Engineer
Environment: RHEL, SLES, Windows, Bugzilla, Knova, Siebel
Responsibilities:
- Extensive product knowledge and issue-resolution abilities
- Working knowledge of LDAP, Active Directory, SAML, OAuth, AWS, and PKI certificates
- Advocated for customer requests and offered innovative ideas to improve product quality
- Documented solutions to known issues using Knova (knowledge base system)
- Involved in testing the product whenever newer versions were released
- Provided escalation-level technical support to customers, partners, and internal staff for released products
- Strong knowledge of TCP/IP protocols, Agile, Scrum, VLAN, TCP, SSL, HTTP, and DNS, with network troubleshooting and problem-solving skills

Organization: App Labs (Lindon, UT), 09/2012 - 11/2012
Role: Test Engineer (Contractor)
Responsibilities:
- Successfully mapped business requirements to test requirements and uploaded them to QC
- Effectively communicated with team members to facilitate test execution and achieve 100% quality assurance
- Responsible for test case overviews, weekly status reporting, the test case pass/fail matrix, defect reporting, and the test case matrix
- Worked with users to develop agile methodologies, data analysis, and user acceptance testing procedures and strategies
- Generated test scenarios/cases/sets and executed test cases on the main modules

Organization: Metaminds Software Solutions (Hyderabad, India), 05/2005 - 10/2009
Role: QA Engineer
Client: Edgenet, NC

Project: Globalization m2o
Description: Software products designed for release in the global market need to support multiple languages and regional settings. The design of global software differs from that of products intended to support one language and a limited number of regional settings; this situation defines the concept of globalization. In products/applications that support multiple languages, the user interfaces are kept separate from the core instruction code. Globalization refers to the process of designing, developing, and engineering products that can be launched worldwide.
Responsibilities:
- Created manual test cases from use cases and prepared regression test suites
- Identified the test requirements based on the business requirements of the application and built the test cases accordingly
- Extensively used SQL Query Analyzer and PL/SQL Developer for backend testing by writing SQL queries (SELECT statements and JOINs)
- Actively participated in design reviews and weekly status meetings

Project: Appliances/Integration
Description: The new functionality is designed to make the shopping experience more streamlined, efficient, and user-friendly while maintaining a high degree of order accuracy and minimal returns. The appliance selling features outlined in this document will be built upon (and integrated with) other projects currently in design/implementation: Enhanced Customer Lookup (Lowe's project), Integrated Genesis Selling, Installed Sales, and UI Customization. The specific business requirements used as the basis for this document are outlined in the Business Requirements document.
Responsibilities:
- Actively participated in detailed design reviews and test plan reviews, and came up with good test scenarios and test data needs
- Analyzed the business requirements, functional specifications, and use case documents, and created the Test Plan document and test cases for system testing
- Extensively involved in conducting system integration testing, regression testing, backend testing, GUI testing, and usability testing
- Performed data-driven testing using QTP to test the application

Project: Integrated Genesis Selling Tool (IGS)
Description: The TCL (Total Closed Loop) Installed Sales solution is divided into six projects. TCL Installed Sales Project 2 will automate labor pricing and online contracts and introduce a new web-based installed sales selling tool (detailed in this HLD) to dramatically improve the installed sales selling process, making it easier for the stores to create and complete installed sales projects. Corporate productivity will also be improved by eliminating the tedious process of printing and distributing hard-copy contracts to the stores, and by streamlining the management of labor pricing.
Responsibilities:
- Developed and implemented test cases, test scripts, and test approach documents based on business and functional requirement documents
- Wrote test cases and performed manual testing, including positive testing, negative testing, and black-box testing
- Performed functional and regression testing using QTP
- Performed GUI testing, functional testing, smoke testing, system testing, and white-box testing

Project: Made to Order (m2o) Core Functionality
Description: "m2o (made to order)" is a modular, flexible, and powerful knowledge-based configuration, pricing, and quoting system. It is used to guide users through the process of specifying complex custom products and/or complex processes. While using the system, users create what is referred to as a Unit, Mark, or Item. The system is designed to allow users to begin building their Unit at any point in the question-and-answer session. Users are not confined to following a pre-specified path of questions; rather, they can begin specifying their Unit with the attributes that are most important to their individual needs.
Responsibilities:
- Involved in designing integration test cases and writing the scenarios
- Identified the test requirements based on the business requirements of the application and built the test cases accordingly
- Extensively used SQL Query Analyzer and PL/SQL Developer for backend testing
- Actively participated in design reviews and weekly status meetings
- Performed manual ad hoc testing and raised issues, logging and reporting them through the Request Tracking Tool

Keywords: quality analyst, user interface, information technology, procedural language, North Carolina, Utah