Appendix D Sample Master Test Plan

"Planning is a process that should build upon itself – each step should create a new understanding of the situation which becomes the point of departure for new plans."

— Planning, MCDP 5

U.S. Marine Corps

Sample Master Test Plan

Below is a sample master test plan that was created to test the STQE.net Web site, which later became known as StickyMinds.com.

STQE.net Master Test Plan, Release 1

Version 1.5

Test Plan Identifier

STQE.net MTP 1.5

References

The following documents have been used in the preparation of this document:

  1. SQE.NET Requirements Definition, Version 3.2
  2. SQE.NET Web Site Control Structure, Version 1.00
  3. SQE.NET Test Objectives, Version 1.5
  4. SQE Systematic Software Testing Course Notes
  5. STQE.NET Issue form

Introduction

Software Quality Engineering has contracted with an outside software development vendor to create a World Wide Web (WWW) site to serve as a knowledge- and information-sharing site for software testing and quality engineering professionals. The target audience is the same as that of Software Testing and Quality Engineering magazine: software managers (development, testing, and quality), test professionals, and software engineers who are interested in building and delivering better software.

Unlike many WWW sites, this site, to be known as SQE.net, is a software-driven database application built with Microsoft Site Builder, ASP code, and an MS-SQL database. This Master Test Plan (MTP) covers the testing of the software; it does not cover the initial or ongoing tasks of adding, editing, publishing, and verifying the content.

The SQE.net site will be introduced in a series of releases, each adding functionality.

Future enhancements, such as job postings, banner ad management, a "What's new" feature, and a comprehensive site search engine, will take place in subsequent releases. This master test plan, covering the testing for Release 1, includes the following testing levels: unit, functional (integration), system, performance, acceptance, and live beta (entrance criteria for each level are defined in Appendix C).

The testing philosophy is risk-based: all test objectives and tests will be prioritized for each level of testing as critical, high, medium, or low priority.
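To make the prioritization concrete, the sketch below is illustrative only: the objective IDs mirror the groups listed under "Features and Functions to Test," but the data structure and helper are hypothetical, not part of the original plan. It shows one way a prioritized inventory can be ordered so that, per "Features Not to Test," only the low-priority tail is dropped if time runs short.

```python
# Illustrative sketch of a prioritized test-objective inventory.
# Objective IDs mirror the "Features and Functions to Test" groups;
# everything else here is hypothetical, not from the actual plan.
from dataclasses import dataclass

PRIORITIES = ("critical", "high", "medium", "low")  # riskiest first

@dataclass
class TestObjective:
    objective_id: str  # e.g., "RB-FN" (Features - Navigation Bar)
    description: str
    priority: str      # one of PRIORITIES

def execution_order(objectives):
    """Order objectives riskiest-first, so that if time runs out,
    only the low-priority tail is dropped."""
    return sorted(objectives, key=lambda o: PRIORITIES.index(o.priority))

inventory = [
    TestObjective("CB-TG", "Metatags and graphics tags", "low"),
    TestObjective("RB-UM", "User member management", "critical"),
    TestObjective("CB-LK", "All site links resolve", "high"),
]
for obj in execution_order(inventory):
    print(f"{obj.priority:8} {obj.objective_id}  {obj.description}")
```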

Test Items

The software items to be tested include the following:

  1. SQE.NET site software: Testing will be on the latest version available from the development vendor. Each version will be identified with an internal version code. The testing will be done with the software using the SQL Server database only and not with the Access database used for development.
  2. Netscape Navigator Version 3.02 and Internet Explorer (IE) Version 4.02: We have specified that SQE.NET is to work with Netscape 3.0 and IE 4.0 and above with frames. Testing may be done on any version, but formal system and acceptance testing will be performed using these versions. Additional browser testing will be done on the latest MS-Windows browsers, including Netscape 4.0 and the upcoming IE 5.0.
  3. Microsoft Windows Platform: Most testing will be performed on PCs running a minimum of Microsoft Windows 95, OSR 1. However, the reference platform for testing is Microsoft Windows 98 with the latest service pack.
  4. Macintosh Platform: Minimal testing will be performed on the Macintosh platform, using Netscape on the latest version of Mac OS 8. SQE will recruit and rely on Beta testing on the Mac platform.
  5. UNIX: No formal testing will be done on UNIX workstations. SQE will recruit and rely on Beta testing for UNIX.

Software Risk Issues

As this is the initial release of STQE.net, testing will be required to verify all requirements of the site. Software risk issues are identified and prioritized in the STQE.net Test Objectives spreadsheet, which is included as Appendix A of this plan.

Features and Functions to Test

Test objectives are listed as requirements-based, design-based, or code-based and further separated into groups:

Requirements Based

| ID | Group |
|-------|-------------------------------------------------------------------|
| RB-FN | Features - Navigation Bar |
| RB-FH | Features - Home |
| RB-UM | Features - User Member Management (Join, Sign-In, Update Profile) |
| RB-FI | Features - Interest Areas |
| RB-FB | Features - Books |
| RB-FT | Features - Tools and Services |
| RB-FC | Features - Calendar of Events |
| RB-FD | Features - Disclosures and Terms |
| RB-FS | Features - Sponsors and Advertisers |
| RB-FA | Features - Administrators |
| RB-SG | Scenarios - Guests |
| RB-SU | Scenarios - User Members (Logged-In) |
| RB-SM | Scenarios - Moderators |
| RB-SP | Scenarios - Providers (Vendors) |
| RB-SA | Scenarios - Administrator |

Design Based

| ID | Group |
|-------|---------------------------------|
| DB-US | Usage |
| DB-SC | Security |
| DB-ML | Multi-Language |
| DB-PF | Performance (Volume and Stress) |
| DB-BC | Browser Configurations |
| DB-SR | Site Failure/Restart |
| DB-BK | Backup/Recovery |

Code Based

| ID | Group |
|-------|----------------------------|
| CB-LK | Links |
| CB-HS | Syntax (HTML and ASP Code) |
| CB-TG | Metatags and Graphics Tags |
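The code-based objectives lend themselves to lightweight tooling. As an illustration of the CB-LK (links) objective, here is a minimal link-check sketch; the plan itself relies on off-the-shelf link testers, and the base URL below is a placeholder, not the real site.

```python
# Minimal link-check sketch for the CB-LK objective. Illustrative only:
# the plan uses off-the-shelf link testers; BASE_URL is a placeholder.
from html.parser import HTMLParser
from urllib.error import HTTPError, URLError
from urllib.parse import urljoin
from urllib.request import urlopen

BASE_URL = "http://www.example.com/"  # placeholder, not the real site

class LinkExtractor(HTMLParser):
    """Collect the href targets of every <a> tag on a page."""
    def __init__(self, page_url):
        super().__init__()
        self.page_url = page_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.page_url, value))

def check_links(page_url):
    """Fetch a page, then report the HTTP status of every link on it."""
    extractor = LinkExtractor(page_url)
    with urlopen(page_url) as page:
        extractor.feed(page.read().decode("utf-8", errors="replace"))
    for link in extractor.links:
        try:
            with urlopen(link) as response:
                print(response.getcode(), link)
        except (HTTPError, URLError) as err:
            print(getattr(err, "code", "ERR"), link)

if __name__ == "__main__":
    check_links(BASE_URL)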

Features Not to Test

We expect to test all of the objectives in the Test Objectives Inventory (Appendix A). However, if time does not permit, some of the low-priority items may be dropped.

Approach

The testing will be done manually until the site is sufficiently stable to begin developing automated tests. The testing will cover the requirements for all of the roles participating in the site: guests, members, vendors, moderators, and administrators.

Automated Testing Tools

We will implement automated testing using commercially available, off-the-shelf tools: one tool for feedback/defect tracking, and another for test scripting with a mix of manual and automated tests. Capture/playback tools will be used on a limited basis to automate parts of the smoke test. Other utilities, such as link testers and HTML syntax checkers, will be used as needed. We do not plan any automated performance testing for Release 1.

The smoke tests will be the first series of tests to automate. Work will begin when the GUI and database are stable.
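As a sketch of what such an automated smoke test might look like (the page paths and marker strings are hypothetical, and the plan leaves the actual tool choice open), each key page is fetched and checked for an expected marker:

```python
# Hypothetical automated smoke test: fetch key pages and verify each
# returns HTTP 200 and contains an expected marker string. The paths
# and markers below are illustrative, not from the actual site.
import sys
from urllib.parse import urljoin
from urllib.request import urlopen

BASE_URL = "http://www.example.com/"  # placeholder

SMOKE_CHECKS = [
    ("home.asp", "Welcome"),
    ("signin.asp", "Sign In"),
    ("books.asp", "Books"),
]

def run_smoke_test():
    failures = []
    for path, marker in SMOKE_CHECKS:
        url = urljoin(BASE_URL, path)
        try:
            with urlopen(url) as response:
                body = response.read().decode("utf-8", errors="replace")
                if response.getcode() != 200 or marker not in body:
                    failures.append(url)
        except OSError:  # covers URLError/HTTPError
            failures.append(url)
    return failures

if __name__ == "__main__":
    failed = run_smoke_test()
    if failed:
        # Per the suspension criteria: a failed smoke test halts testing.
        sys.exit("Smoke test FAILED: " + ", ".join(failed))
    print("Smoke test passed.")
```

The nonzero exit on failure mirrors the suspension criteria later in this plan: a version that fails the smoke test blocks further testing.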

Defect Tracking

Testing issues and feedback from beta users will be reported on the STQE.net Issue Form and entered into the tracking tool. Within one business day, we will analyze and classify each new issue as a software defect, an enhancement, could not reproduce, not a problem, or a failure, and will set the severity level and fix priority of software defects. Issue classes, severity categories, and fix priorities are listed in Appendix B.
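The classification scheme, detailed in Appendix B, maps naturally onto a small record structure. The sketch below is illustrative; it is not the schema of the actual tracking tool.

```python
# Illustrative issue record using the Appendix B classifications.
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class IssueClass(Enum):
    SOFTWARE_DEFECT = "software defect"
    ENHANCEMENT = "enhancement"
    COULD_NOT_REPRODUCE = "could not reproduce"
    NOT_A_PROBLEM = "not a problem"
    FAILURE = "failure"  # environment, testware, test execution, or other

class Severity(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3
    CRITICAL = 4

class FixPriority(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3
    CRITICAL = 4

@dataclass
class Issue:
    issue_id: int
    summary: str
    issue_class: IssueClass
    severity: Optional[Severity] = None       # set for software defects only
    fix_priority: Optional[FixPriority] = None

# A hypothetical critical defect, classified within one business day:
issue = Issue(101, "Sign-in rejects valid passwords",
              IssueClass.SOFTWARE_DEFECT, Severity.CRITICAL,
              FixPriority.CRITICAL)
```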

Change Management

When the vendor turns the software over to SQE for testing, all changes to the site software will come under change control. The project manager will approve all changes moved into the test environment. A change notice must define the modules being changed and the reason for the change, including all repaired defects. Except for critical fixes that are blocking current testing, changes will be scheduled so that they do not impact testing.

Except for emergency repairs, changes will not be moved into the live environment until the test manager approves a new version for production. After the software is moved to the live environment, testing will confirm that the software matches the configuration in test, and a smoke test will be performed.

Test Cycles

Each time a new version is released to the test environment, a test cycle will be run: the smoke test first, followed by the scheduled system and acceptance tests (see the Schedule section).

While Release 1 is in the "live beta" status, updates that "pass" a test cycle will be moved to the production host and made "live."

Metrics

Metrics will be kept for test effort, incidents, defects, and test cases executed for each test cycle.
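A per-cycle tally could be as simple as the following sketch (field names and values are illustrative, not actual project data):

```python
# Illustrative per-cycle metrics record covering the four measures
# named above; the values shown are made up for the example.
from dataclasses import dataclass

@dataclass
class CycleMetrics:
    cycle: int
    effort_hours: float       # test effort spent in the cycle
    incidents_reported: int   # all issues logged, any classification
    defects_confirmed: int    # issues classified as software defects
    test_cases_executed: int

cycle1 = CycleMetrics(cycle=1, effort_hours=60.0, incidents_reported=48,
                      defects_confirmed=31, test_cases_executed=214)
print(f"Cycle {cycle1.cycle}: {cycle1.defects_confirmed} defects "
      f"found in {cycle1.test_cases_executed} test cases")
```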

Item Pass/Fail Criteria

The entrance criteria for each level of testing are defined in Appendix C. The exit criteria are the entrance criteria for the following test level. The Web site will not be opened for content providers when any critical defects exist in those functions involved with the addition of content.

Release 1 of the site will not be opened to the general public until all critical and high-severity defects have been resolved. The project manager may, at his or her discretion, defer some critical and high defects whose failures do not affect guests' and members' use of the site.
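The release gate described here reduces to a simple predicate over the open defect list. A minimal sketch, assuming defects are tracked with the Appendix B severity levels and a project-manager deferral flag (both representations are hypothetical):

```python
# Sketch of the Release 1 go/no-go gate: no unresolved critical or
# high-severity defects, unless the project manager has deferred a
# defect whose failure does not affect guests or members.
def ready_for_public(open_defects):
    """open_defects: iterable of (severity, deferred_by_pm) pairs."""
    return all(severity not in ("critical", "high") or deferred
               for severity, deferred in open_defects)

print(ready_for_public([("medium", False), ("critical", True)]))  # True
print(ready_for_public([("high", False)]))                        # False
```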

Suspension Criteria and Resumption Requirements

A smoke test will be performed on each update from the vendor. If the smoke test does not pass, further testing is halted; testing will resume when an update that passes the smoke test has been delivered.

Test Deliverables

The following documents will be prepared (see Appendix D for the delivery schedule): the Master Test Plan, the Test Objectives Inventory, the Web Control Structure, the Smoke Test Design, and the System/Acceptance Test Design.

Remaining Test Tasks

The vendor will perform the unit and integration testing. The browsers and operating systems are accepted as is.

Test Environment

Testers will identify the browser used during all tests. Four Web sites will be used in this development process:

A separate test site may be needed for automated testing.

Staffing and Training Needs

The following roles are identified: project manager, test manager, test engineers, and PC/network support (see Responsibilities).

The test manager and test engineers should be familiar with the STEP methodology from having taken the SST course.

Responsibilities

| Role | Candidate | Timing |
|----------------------|----------------------------------------|----------------|
| Project Manager | Jennifer Brock | All, Part-Time |
| Test Manager | John Lisle | All, Part-Time |
| Test Engineers | Jennifer Brock, John Lisle, Paul Danon | All, Part-Time |
| PC / Network Support | Jim Sowder | All, Part-Time |

Schedule

See Appendix D for the schedule for developing the test planning and design documents. The following table shows the plan for the expected test cycles.

| Testing Cycle | Event | Who | Milestone |
|---------------|-------------------------------------------|------------|-----------|
| Test Cycle 1 | Start | | 3/8/1999 |
| | Run Smoke Test | JB, JD, WM | 3/8/1999 |
| | Complete System Test (except performance) | JB, JD, WM | 3/12/1999 |
| | Complete Acceptance | JB, JD, WM | 3/12/1999 |
| Turnover | Content Providers | WM | 3/15/1999 |
| Test Cycle 2 | Start | | 3/22/1999 |
| | Run Smoke Test | JB, JD, WM | 3/22/1999 |
| | Complete Acceptance | JB, JD, WM | 3/26/1999 |
| Test Cycle 3 | Start | | 4/5/1999 |
| | Run Smoke Test | JB, JD, WM | 4/5/1999 |
| | Complete Acceptance | JB, JD, WM | 4/9/1999 |
| Test Cycle 4 | Start | | 4/12/1999 |
| | Run Smoke Test | JB, JD, WM | 4/12/1999 |
| | Complete Acceptance | JB, JD, WM | 4/16/1999 |
| Test Cycle 5 | Start | | 4/19/1999 |
| | Run Smoke Test | JB, JD, WM | 4/19/1999 |
| | Complete System Test | JB, JD, WM | 4/23/1999 |
| | Complete Acceptance | JB, JD, WM | 4/23/1999 |
| Turnover | General Public | WM | 5/3/1999 |

Planning Risks and Contingencies

  1. Web Site Not Ready for Content Providers – This will delay the live beta. We need to give the content providers at least four weeks to enter data before opening the site to the public.
  2. Web Site Not Ready for Content Addition and General Public – This could be because the software is not ready or because insufficient content is available. Either will delay the opening of the site.
  3. Web Testing Software Not Available – This will delay the introduction of automated testing, and more manual testing will be required. We may need to recruit more staff to do the testing.
  4. Test Staff Shortages – All of the test staff are part-time and have other priorities. No slack time is allocated for illness or vacation.
  5. Host Web Site for Live Production – The search for the host of the live site is a separate project and is not yet complete.
  6. Configuration Management Problems – The project team has experienced problems with the development vendor's configuration/change management.

Approvals

This plan needs to be approved by the project manager for the Web site and the SQE project sponsor.

Appendix A for STQE.net MTP

Refer to the electronic spreadsheet for the Test Objectives Inventory.

Appendix B for STQE.net MTP

| Incident Classification | Definition |
|---------------------------|------------|
| Software Defect | Clearly a defect in the software; it may be requirements based, design based, or code based. |
| Enhancement | An enhancement to the existing application; it could be related to code, data, or process. |
| Could Not Reproduce | Could not recreate the situation; several attempts were made before categorizing it as such. |
| Not a Problem | Could reproduce the situation and determined that the application, process, and data were intentionally designed to behave as they do. |
| Failure – Environment | A failure occurred and was determined to be due to a problem with the environment; the same failure does not occur once the environment has been corrected. |
| Failure – Testware Defect | A failure occurred and was determined to be caused by incorrect testware; the testware needs to be corrected. |
| Failure – Test Execution | A failure occurred and was determined to be caused by improper execution of the test. |
| Failure – Other | A failure occurred that does not fit into the above categories. |

| Severity | Definition |
|----------|------------|
| Low | Minor flaw that does not affect the operation or understanding of the feature. |
| Medium | Feature is usable, but some functionality is lost or the user may misinterpret it and use it improperly. |
| High | Important functionality is lost or the feature is not usable, but there is a work-around or the feature is not critical to operations. |
| Critical | An important feature is not usable. An emergency fix is authorized. |

| Fix Priority | Response |
|--------------|----------|
| Low | Correct in the next scheduled enhancement release, or update the documentation and do not fix. |
| Medium | Fix after high-priority defects and enhancements; document the work-around or the effect on users. |
| High | Fix within 72 working hours; stop work on enhancements if necessary. |
| Critical | Fix ASAP, within 12 hours; overtime authorized; skip full acceptance testing if necessary. Don't go home until it's fixed. |

Appendix C for STQE.net MTP

Entrance criteria for each test level are listed below.

Unit Test

Component/Module for unit test is 100% complete:

  • Items to test are outlined; unit tester should know expected results.
  • All programs in unit test compile cleanly.
  • A listing of all unit-tested programs exists.

Functional Test

Components/Modules for integration test are 100% complete:

  • Unit tests are executed, and all high- or critical-severity defects are closed.
  • Unit tests are executed, and all defects with a high or emergency fix priority are closed.
  • All programs scheduled for integration test compile cleanly.
  • A listing of all integration programs is complete.
  • High-level integration test plan complete and peer reviewed.

System Test

Components/Modules for System test are 100% complete:

  • Integration and unit tests are complete, and all high- or critical-severity defects are closed.
  • Integration and unit tests are complete, and all defects with a high or emergency fix priority are closed.
  • All programs scheduled to run as part of the system test execute with no failures. This will be verified by running the smoke test.
  • System test plan is complete and reviewed.
  • Change management process is in place and adhered to.

Performance Test

Components/Modules for Performance test are 100% complete:

  • System tests are complete, and all high- or critical-severity defects are closed.
  • All programs scheduled to run as part of the system test execute with no failures.
  • Performance test plan is complete and reviewed.
  • Change management process is in place and adhered to.

Acceptance Test

Components/Modules for Acceptance test are 100% complete:

  • System test is complete, and all high- or critical-severity defects are closed.
  • System test is complete, and all defects with a high or emergency fix priority are closed.
  • All programs scheduled to run as part of the acceptance test execute with no failures. This will be verified by running the smoke test.
  • Acceptance test plan is complete and reviewed.
  • Change management process is in place and adhered to.

Live Beta

Components/Modules for Live Beta are 100% complete:

  • Acceptance test is complete, and all high- or critical-severity defects are closed.
  • Acceptance test is complete, and all defects with a high or emergency fix priority are closed.
  • All programs scheduled to run as part of the live beta execute with no failures. This will be verified by running the smoke test.
  • Data is refreshed.
  • Change management process is in place and adhered to.

Appendix D for STQE.net MTP

| Deliverable | Event | Who | Milestone |
|-------------------------------|-----------------------|-----------------|-----------|
| Master Test Plan | First Draft | JBL | 2/8/1999 |
| | Review | JBL, JB, WM | 2/9/1999 |
| | Version 1.0 | JBL | 2/12/1999 |
| | Review | JBL, JB, WM, DG | 2/24/1999 |
| | Version 1.1 | JBL | 3/1/1999 |
| Test Objectives | First Draft (Partial) | JBL | 2/8/1999 |
| | Version 0.9 (Partial) | JBL | 2/12/1999 |
| | Review | JBL, JB, WM, DG | 2/24/1999 |
| | Version 1.0 | JBL | 3/1/1999 |
| | Review | JBL, JB, WM | 3/3/1999 |
| | Version 1.1 | JBL | 3/8/1999 |
| Web Control Structure | First Draft (Partial) | JBL | 2/11/1999 |
| | Review | JBL, JB, WM, DG | 2/15/1999 |
| | Version 1.0 | JBL | 3/1/1999 |
| | Review | JBL, JB, WM | 3/3/1999 |
| | Version 1.1 | JBL | 3/9/1999 |
| Smoke Test Design | First Draft | JBL | 3/8/1999 |
| | Review | JBL, JB, WM | 3/12/1999 |
| | Version 1.0 | JBL | 3/15/1999 |
| | Review | JBL, JB, WM | 3/18/1999 |
| | Version 1.1 | JBL | 3/24/1999 |
| System/Acceptance Test Design | First Draft | JBL | 3/12/1999 |
| | Review | JBL, JB, WM | 3/15/1999 |
| | Version 1.0 | JBL | 3/24/1999 |
| | Review | JBL, JB, WM | 3/27/1999 |
| | Version 1.1 | JBL | 3/31/1999 |


