Appendix D Sample Master Test Plan
"Planning is a process that should build upon itself – each step should create a new understanding of the situation which becomes the point of departure for new plans."
— Planning, MCDP 5
U.S. Marine Corps
Sample Master Test Plan
Below is a sample master test plan that was created to test the STQE.net Web site, which later became known as StickyMinds.com.
STQE.net Master Test Plan, Release 1
Version 1.5
Test Plan Identifier
STQE.net MTP 1.5
References
The following documents have been used in the preparation of this document:
- SQE.NET Requirements Definition, Version 3.2
- SQE.NET Web Site Control Structure, Version 1.00
- SQE.NET Test Objectives, Version 1.5
- SQE Systematic Software Testing Course Notes
- STQE.NET Issue Form
Introduction
Software Quality Engineering has contracted with an outside software development vendor to create a World Wide Web (WWW) site to function as a knowledge- and information-sharing site for software testing and quality engineering professionals. The target audience will be the same as that of Software Testing and Quality Engineering magazine: software managers (development, testing, and quality), test professionals, and software engineers who are interested in building and delivering better software.
Unlike many WWW sites, this site, to be known as SQE.net, is a software-driven database application built with Microsoft Site Builder, ASP code, and an MS-SQL database. This Master Test Plan (MTP) covers the testing activities for the software; it does not cover the initial or ongoing tasks of adding, editing, publishing, and verifying the content.
The SQE.net site will be introduced in releases, with each release having increasing functionality:
- Release 1.0, also known as "Live Beta," will be an opportunity for the interest area moderators and product providers (vendors) to begin to enter data into the databases. Formal testing of SQE.net's initial capabilities will begin with this release. After formal testing is complete and content has been loaded, the site will be accepted into production and the public "Grand Opening" of the site will be announced. After all functionality for Release 1.0 has been delivered, internal bug-fix releases will be denoted with a letter suffix, e.g., 1.0a, 1.0b, etc.
Future enhancements, such as job postings, banner ad management, a "What's New" feature, and a comprehensive site search engine, will take place in subsequent releases. This master test plan, covering the testing for Release 1, includes the following testing levels:
- Unit and Integration Testing: The vendor will perform these levels as part of its development of the site software. This plan will not discuss these levels of testing.
- Smoke Test: The smoke test will be conducted by the SQE test team. The tests are written in a manner that allows them to be easily automated. The purpose of the smoke test is to verify that the software is stable enough to conduct further functional testing.
- Functional Test: The functional test is designed to verify that the functions at each user level work as designed. Many of the tests designed for the smoke test may be reused, along with additional tests, to create the functional test plan. These tests will also be written in a format that can be easily automated. It is at this level of test that the data will be verified to be in the correct state, updated in the appropriate database, and so on. Because of the lack of formal testing in the unit test phase, this level of test will be conducted in greater detail than usual.
- System Testing: This level tests the functions of the system as a whole. Again, many tests from previous test phases may be reused in the system test phase, along with new tests. The approach will be to execute the functional test plan using more than one browser, operating system, and monitor size.
- Performance Test: This level of test will verify that the system can perform adequately with a high volume of users. This test will be performed manually, using the tests from previous test phases. The tests are designed so that they can be reused and automated. Performance tests will be performed on the live production site.
- Acceptance Testing: This level of test examines the Web site from an end-user perspective. The scenarios are constructed around the different types of users entering the site, what they would likely do, and the questions they would likely ask. This test can be constructed from the tests of previous test phases, with additional tests added as needed.
- Beta Testing: This level occurs on the live SQE.NET site and is performed concurrently with acceptance testing. The content providers (moderators and product providers), who will be adding data to the site, will provide feedback on the performance and functionality of the site.
The testing philosophy is risk-based: all test objectives and tests will be prioritized for each level of testing as critical, high, medium, or low priority.
Test Items
The software items to be tested include the following:
- SQE.NET site software: Testing will be performed on the latest version available from the development vendor. Each version will be identified with an internal version code. Testing will be done only against the software using the SQL Server database, not the Access database used for development.
- Netscape Navigator Version 3.02 and Internet Explorer (IE) Version 4.02: We have specified that SQE.NET is to work with Netscape 3.0 and IE 4.0 and above with frames. Testing may be done on any version, but formal system and acceptance testing will be performed using these versions. Additional browser testing will be done on the latest MS-Windows browsers, including Netscape 4.0 and the upcoming IE 5.0.
- Microsoft Windows Platform: Most testing will be performed on PCs running a minimum of Microsoft Windows 95, OSR 1. However, the reference platform for testing is Microsoft Windows 98 with the latest service pack.
- Macintosh Platform: Minimal testing will be performed on the Macintosh platform, using Netscape under the latest version of Mac OS 8. SQE will recruit and rely on Beta testing for the Mac platform.
- UNIX: No formal testing will be done on UNIX workstations. SQE will recruit and rely on Beta testing for UNIX.
Software Risk Issues
As this is the initial release of STQE.net, testing will be required to verify all requirements of the site. Software risk issues are identified and prioritized in the STQE.net Test Objectives spreadsheet that is included in Appendix A of this plan.
Features and Functions to Test
Test objectives are listed as requirements-based, design-based, or code-based and further separated into groups:
Requirements Based

| Prefix | Test Objective Group |
|---|---|
| RB-FN | Features - Navigation Bar |
| RB-FH | Features - Home |
| RB-UM | Features - User Member Management (Join, Sign-In, Update Profile) |
| RB-FI | Features - Interest Areas |
| RB-FB | Features - Books |
| RB-FT | Features - Tools and Services |
| RB-FC | Features - Calendar of Events |
| RB-FD | Features - Disclosures and Terms |
| RB-FS | Features - Sponsors and Advertisers |
| RB-FA | Features - Administrators |
| RB-SG | Scenarios - Guests |
| RB-SU | Scenarios - User Members (Logged-In) |
| RB-SM | Scenarios - Moderators |
| RB-SP | Scenarios - Providers (Vendors) |
| RB-SA | Scenarios - Administrator |

Design Based

| Prefix | Test Objective Group |
|---|---|
| DB-US | Usage |
| DB-SC | Security |
| DB-ML | Multi-Language |
| DB-PF | Performance (Volume and Stress) |
| DB-BC | Browser Configurations |
| DB-SR | Site Failure/Restart |
| DB-BK | Backup/Recovery |

Code Based

| Prefix | Test Objective Group |
|---|---|
| CB-LK | Links |
| CB-HS | Syntax (HTML and ASP Code) |
| CB-TG | Metatags and Graphics Tags |
Features Not to Test
We expect to test all of the objectives in the Test Objectives Inventory (Appendix A). However, if time does not permit, some of the low-priority items may be dropped.
Approach
The testing will be done manually until the site is sufficiently stable to begin developing automated tests. The testing will cover the requirements for all of the different roles participating in the site: guests, members, vendors, moderators, and administrators.
Automated Testing Tools
We will implement automated testing using commercially available, off-the-shelf tools. One tool will be used for feedback and defect tracking. Another tool will be implemented for test scripting, using a mix of manual and automated tests. Capture/playback tools will be used on a limited basis to automate parts of the smoke test. Other utilities, such as link testers and HTML syntax checkers, will be used as needed. We do not plan any automated performance testing for Release 1.
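To illustrate the kind of check a link tester performs (this sketch is illustrative only and is not one of the planned off-the-shelf tools), the following minimal Python example fetches one page, extracts its anchors, and reports any that fail to load; the starting URL is hypothetical:

```python
# Minimal link-checker sketch (illustrative only; the plan calls for
# an off-the-shelf utility). Uses only the Python standard library.
from html.parser import HTMLParser
from urllib.error import HTTPError, URLError
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collect the href attribute of every anchor tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def check_links(page_url):
    """Fetch one page and report each of its links that fails to load."""
    html = urlopen(page_url, timeout=10).read().decode("utf-8", "replace")
    parser = LinkExtractor()
    parser.feed(html)
    for link in parser.links:
        target = urljoin(page_url, link)   # resolve relative links
        try:
            urlopen(target, timeout=10)    # a 4xx/5xx raises HTTPError
        except (HTTPError, URLError) as err:
            print(f"BROKEN  {target}  ({err})")


if __name__ == "__main__":
    check_links("http://www.sqe.net/")     # hypothetical address
```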
The smoke tests will be the first series of tests to be automated. Work will begin when the GUI and database are stable.
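As a sketch of what an automated smoke test might look like (the page list, expected strings, and staging URL below are assumptions, not taken from the Smoke Test Design), each key page is fetched and checked for a string that should appear when the page renders correctly:

```python
# Hypothetical automated smoke-test sketch. The paths, expected
# strings, and staging URL are illustrative assumptions; the real
# checks would come from the Smoke Test Design document.
from urllib.request import urlopen

# (path, text that must appear in the response body)
SMOKE_CHECKS = [
    ("/", "SQE.net"),          # home page
    ("/signin", "Sign-In"),    # hypothetical sign-in page
    ("/books", "Books"),       # hypothetical books page
]


def run_smoke(base_url):
    """Return True only if every key page loads and looks sane."""
    all_passed = True
    for path, expected in SMOKE_CHECKS:
        url = base_url.rstrip("/") + path
        try:
            body = urlopen(url, timeout=10).read().decode("utf-8", "replace")
            ok = expected in body
        except OSError:   # URLError/HTTPError both derive from OSError
            ok = False
        print(f"{'PASS' if ok else 'FAIL'}  {url}")
        all_passed = all_passed and ok
    return all_passed


if __name__ == "__main__":
    # Run against the Development Staging site (hypothetical URL).
    raise SystemExit(0 if run_smoke("http://staging.sqe.net") else 1)
```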
Defect Tracking
Testing issues and feedback from beta users will be reported on the STQE.net Issue Form and entered into a tracking tool. Within one business day, we will analyze and classify each new issue as a software defect, an enhancement, could not reproduce, not a problem, or a failure. A severity level and fix priority will be set for each software defect. Issue classes, severity categories, and fix priorities are listed in Appendix B.
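For illustration, the classifications, severities, and fix priorities defined in Appendix B map naturally onto a simple issue record; the sketch below (the field names are ours, not taken from the Issue Form) shows one way to represent it:

```python
# Hypothetical sketch of an issue record built from the
# classifications, severities, and fix priorities in Appendix B.
# Field names are illustrative, not taken from the Issue Form.
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class Classification(Enum):
    SOFTWARE_DEFECT = "Software Defect"
    ENHANCEMENT = "Enhancement"
    COULD_NOT_REPRODUCE = "Could Not Reproduce"
    NOT_A_PROBLEM = "Not a Problem"
    FAILURE_ENVIRONMENT = "Failure - Environment"
    FAILURE_TESTWARE = "Failure - Testware Defect"
    FAILURE_EXECUTION = "Failure - Test Execution"
    FAILURE_OTHER = "Failure - Other"


class Severity(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3
    CRITICAL = 4


class FixPriority(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3
    CRITICAL = 4


@dataclass
class Issue:
    """One entry captured from an STQE.net Issue Form."""
    identifier: str
    description: str
    classification: Classification
    # Severity and fix priority are set only for software defects.
    severity: Optional[Severity] = None
    fix_priority: Optional[FixPriority] = None
```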
Change Management
When the vendor turns the software over to SQE for testing, all changes to the site software will come under change control. The project manager will approve all changes moved into the test environment. A change notice must define the modules being changed and the reason for the change, including all repaired defects. Except for critical fixes that are blocking current testing efforts, changes will be scheduled not to impact testing.
Except for emergency repairs, changes will not be moved into the live environment until the test manager approves a new version for production. After the software is moved to the live environment, we will confirm that the software matches the configuration in test and perform a smoke test.
Test Cycles
Each time a new version is released to the test environment, the following process will be undertaken:
- Configuration will be checked and recorded.
- Smoke test will be run.
- If it is successful, the system and/or acceptance test suite will be updated to account for changes uncovered by the smoke test, and then run. Incidents will be analyzed and software defects reported.
- Testers will perform ad hoc testing of new or changed functionality and of functionality that has been error prone.
- New tests will be developed.
While Release 1 is in the "live beta" status, updates that "pass" a test cycle will be moved to the production host and made "live."
Metrics
Metrics will be kept for test effort, incidents, defects, and test cases executed for each test cycle.
Item Pass/Fail Criteria
The entrance criteria for each level of testing are defined in Appendix C. The exit criteria for each level are the entrance criteria for the following level. The Web site will not be opened to content providers while any critical defects exist in the functions involved in adding content.
Release 1 of the site will not be opened to the general public until all critical and high-severity defects have been resolved. At the project manager's discretion, some critical and high-severity defects may be deferred if their failures do not affect guests' and members' use of the site.
Suspension Criteria and Resumption Requirements
With each update from the vendor, a smoke test will be performed. If the smoke test does not pass, further testing will be halted; testing will resume when a version that passes the smoke test has been delivered.
Test Deliverables
The following documents will be prepared:
- Master Test Plan (this document)
- Test Design
- Test Procedures
- Test Logs
- Test Summary Report
- Test Data
- Automated Test Scripts
- Incident Reports
- Incident Log
Remaining Test Tasks
The vendor will perform the unit and integration testing. The browsers and operating systems are accepted as is.
Test Environment
Testers will identify the browser used during all tests. Four Web sites will be used in this development process:
- Development: This site, located on the developer network, is the vendor's development environment.
Note: The Development site uses an Access database for the SQL tables to speed development. All other sites use MS-SQL databases.
- Development Staging: Updates to the software will be moved to the Development Staging site for the Smoke Test. This site, located on the developer network, uses the MS-SQL database (same as production).
- SQE Test: This site, located at SQE, will be used for Functional, System, and Acceptance testing.
- Live Production: This site will be located at an ISP supporting 24x7 operations. Performance testing will be done on this site.
A separate test site may be needed for automated testing.
Staffing and Training Needs
The following roles are identified:
- Project Manager: Responsible for managing the total implementation of the SQE.NET Web site. This includes creating requirements, managing the vendor relationship, overseeing the testing process, and reporting to senior management.
- Test Manager: Responsible for developing the master test plan, reviewing the test deliverables, managing the test cycles, collecting metrics and reporting status to the Project Manager, and recommending when testing is complete.
- Test Engineer: Responsible for designing the tests, creating the test procedures, creating the test data, executing tests, preparing incident reports, analyzing incidents, writing automated test procedures, and reporting metrics to the test manager.
- PC/Network Support: Responsible for maintaining the PCs and network at the SQE office to support the testing.
The test manager and test engineers should be familiar with the STEP methodology from having taken the SST course.
Responsibilities
| Role | Candidate | Timing |
|---|---|---|
| Project Manager | Jennifer Brock | All, Part-Time |
| Test Manager | John Lisle | All, Part-Time |
| Test Engineers | Jennifer Brock, John Lisle, Paul Danon | All, Part-Time |
| PC / Network Support | Jim Sowder | All, Part-Time |
Schedule
See Appendix D for the schedule to develop the test planning and design documents. The following table represents the plan for the expected test cycles.
| Testing Cycle | Event | Who | Milestone |
|---|---|---|---|
| Test Cycle 1 | Start | | 3/8/1999 |
| | Run Smoke Test | JB, JD, WM | 3/8/1999 |
| | Complete System Test (except performance) | JB, JD, WM | 3/12/1999 |
| | Complete Acceptance | JB, JD, WM | 3/12/1999 |
| | Turnover - Content Providers | WM | 3/15/1999 |
| Test Cycle 2 | Start | | 3/22/1999 |
| | Run Smoke Test | JB, JD, WM | 3/22/1999 |
| | Complete Acceptance | JB, JD, WM | 3/26/1999 |
| Test Cycle 3 | Start | | 4/5/1999 |
| | Run Smoke Test | JB, JD, WM | 4/5/1999 |
| | Complete Acceptance | JB, JD, WM | 4/9/1999 |
| Test Cycle 4 | Start | | 4/12/1999 |
| | Run Smoke Test | JB, JD, WM | 4/12/1999 |
| | Complete Acceptance | JB, JD, WM | 4/16/1999 |
| Test Cycle 5 | Start | | 4/19/1999 |
| | Run Smoke Test | JB, JD, WM | 4/19/1999 |
| | Complete System Test | JB, JD, WM | 4/23/1999 |
| | Complete Acceptance | JB, JD, WM | 4/23/1999 |
| | Turnover - General Public | WM | 5/3/1999 |
Planning Risks and Contingencies
- Web Site Not Ready for Content Providers – This will cause a delay to the live beta. We need to give the content providers at least four weeks to enter data before opening the site to the public.
- Web Site Not Ready for Content Addition and General Public – This could be because the software is not ready or because insufficient content is available. This will cause a delay to the opening of the site.
- Web Testing Software Not Available – This will delay the introduction of automated testing, and more manual testing will be required. We may need to recruit more staff to do the testing.
- Test Staff Shortages – All of the test staff are part-time and have other priorities. No slack time is allocated for illness or vacation.
- Host Web Site for Live Production – The search for a host for the live production site is a separate project and is not yet complete.
- Configuration Management Problems – The project team has experienced problems with the development vendor's configuration/change management.
Approvals
This plan needs to be approved by the project manager for the Web site and the SQE project sponsor.
Appendix A for STQE.net MTP
Refer to the electronic spreadsheet for the Test Objectives Inventory.
Appendix B for STQE.net MTP
| Incident Classification | Definition |
|---|---|
| Software Defect | Clearly a defect in the software; may be requirements based, code based, or design based. |
| Enhancement | An enhancement to the existing application. It could be related to code, data, or process. |
| Could Not Reproduce | Could not recreate the situation; several attempts were made before categorizing it as such. |
| Not a Problem | Could reproduce, and determined that the application, process, and data were intentionally designed to behave as they do. |
| Failure – Environment | Failure occurred and has been determined to be due to a problem with the environment. The same failure does not occur once the environment has been corrected. |
| Failure – Testware Defect | Failure occurred and the testware was determined to be incorrect. The testware needs to be corrected. |
| Failure – Test Execution | Failure occurred and was determined to be related to improper execution of the test. |
| Failure – Other | Failure occurred and does not fit into the above categories. |
| Severity | Definition |
|---|---|
| Low | Minor flaw not affecting operation or understanding of the feature. |
| Medium | Feature is usable, but some functionality is lost or the user may misinterpret it and use it improperly. |
| High | Important functionality is lost or the feature is not usable, but there is a work-around or the feature is not critical to operations. |
| Critical | Important feature is not usable. An emergency fix is authorized. |
| Fix Priority | Response |
|---|---|
| Low | Correct in the next scheduled enhancement release, or update the documentation and do not fix. |
| Medium | Fix after high-priority defects and enhancements. Document the work-around or the effect on users. |
| High | Fix within 72 working hours; stop work on enhancements if necessary. |
| Critical | Fix ASAP, within 12 hours; overtime authorized; skip full acceptance testing if necessary. Don't go home until it's fixed. |
Appendix C for STQE.net MTP
| Test Level | Entrance Criteria |
|---|---|
| Unit Test | Component/Module for unit test is 100% complete: |
| Functional Test | Components/Modules for integration test are 100% complete: |
| System Test | Components/Modules for System test are 100% complete: |
| Performance Test | Components/Modules for Performance test are 100% complete: |
| Acceptance Test | Components/Modules for Acceptance test are 100% complete: |
| Live Beta | Components/Modules for Live Beta are 100% complete: |
Appendix D for STQE.net MTP
| Deliverable | Event | Who | Milestone |
|---|---|---|---|
| Master Test Plan | First Draft | JBL | 2/8/1999 |
| | Review | JBL, JB, WM | 2/9/1999 |
| | Version 1.0 | JBL | 2/12/1999 |
| | Review | JBL, JB, WM, DG | 2/24/1999 |
| | Version 1.1 | JBL | 3/1/1999 |
| Test Objectives | First Draft (Partial) | JBL | 2/8/1999 |
| | Version 0.9 (Partial) | JBL | 2/12/1999 |
| | Review | JBL, JB, WM, DG | 2/24/1999 |
| | Version 1.0 | JBL | 3/1/1999 |
| | Review | JBL, JB, WM | 3/3/1999 |
| | Version 1.1 | JBL | 3/8/1999 |
| Web Control Structure | First Draft (Partial) | JBL | 2/11/1999 |
| | Review | JBL, JB, WM, DG | 2/15/1999 |
| | Version 1.0 | JBL | 3/1/1999 |
| | Review | JBL, JB, WM | 3/3/1999 |
| | Version 1.1 | JBL | 3/9/1999 |
| Smoke Test Design | First Draft | JBL | 3/8/1999 |
| | Review | JBL, JB, WM | 3/12/1999 |
| | Version 1.0 | JBL | 3/15/1999 |
| | Review | JBL, JB, WM | 3/18/1999 |
| | Version 1.1 | JBL | 3/24/1999 |
| System/Acceptance Test Design | First Draft | JBL | 3/12/1999 |
| | Review | JBL, JB, WM | 3/15/1999 |
| | Version 1.0 | JBL | 3/24/1999 |
| | Review | JBL, JB, WM | 3/27/1999 |
| | Version 1.1 | JBL | 3/31/1999 |