...

Risks and Contingencies

Risk: QA environment not available
Contingency: Use the development or production environment.

Risk: Insufficient time to fully test the Touchstone application with all major web browsers
Contingency: Execute ~80% of application functionality with MIT's standard browser (Mozilla Firefox 2.0) and ~20% of the functionality with other browsers. Prioritize the functionality to be tested by risk: identify high-risk functionality and ensure that high-risk items are tested thoroughly. All high-priority items will be tested in both browsers. As time permits, additional functionality will be tested based on priority and risk, with the most attention focused on the Firefox browser.

Risk: Test time increases due to changes in scope requiring additional test analysis and/or test case creation
Contingency: If test time cannot be increased, reduce or cut overlap in multi-browser testing and execute the highest-priority test cases first, followed by lower-priority tests, until test time runs out. Alternatively, reevaluate priorities and risk and test according to the new priorities.

Risk: Excessive defect rate or defect correction time
Contingency: Extend the testing window if possible. If that is not possible, execute test cases in unrelated/unblocked functional areas of the application based on designated priority. Identify problem areas and ensure that additional attention is focused on those areas during regression testing.

Risk: Use cases not complete or available
Contingency: Parts of the application will not be tested, or Questcon will require additional time to assist MIT in authoring the necessary use cases.

Risk: Server or client hardware or software specifications not available
Contingency: Assumptions will be made which may be incorrect. This may lead to inaccurate testing and possible missed defects in the production system.

...

The following table outlines the various types of testing considered for this test effort, any additional comments about the testing, and the individual or group responsible for completing the testing.

Type of Testing: Automation
Included (Y/N): N
Comments: MIT personnel will utilize JMeter to automate portions of the backend testing (an illustrative command is shown after this table); no test automation tools will be used for the UI testing, because the use of Flash prevents this.
Team Responsible: N/A

Type of Testing: Conversion
Included (Y/N): N
Comments: There is no pre-existing system; therefore no data conversion is necessary.
Team Responsible: N/A

Type of Testing: Exploratory
Included (Y/N): Y
Comments: Some level of exploratory testing will be conducted based on heuristics related to typical rich-content internet applications.
Team Responsible: Questcon

Type of Testing: Functional
Included (Y/N): Y
Comments: Functional testing will be performed based on test cases derived from the documented use cases and the front-end functional design.
Team Responsible: Questcon

Type of Testing: Installation / Upgrade
Included (Y/N): N
Comments: Because this is a web application, no installation testing is necessary.
Team Responsible: N/A

Type of Testing: Integration
Included (Y/N): Y
Comments: Some integration testing will naturally occur as the front-end of the Touchstone application interfaces with and utilizes the back-end APIs.
Team Responsible: Questcon

Type of Testing: Parallel
Included (Y/N): N
Comments: There is no existing system that Touchstone is replacing.
Team Responsible: N/A

Type of Testing: Performance
Included (Y/N): N
Comments: Performance testing is covered by a separate Test Plan.
Team Responsible: N/A

Type of Testing: Regression
Included (Y/N): Y
Comments: Questcon expects to run at least a minimum regression test set prior to release to production.
Team Responsible: Questcon

Type of Testing: Security
Included (Y/N): Y
Comments: Backend security testing will be done by MIT. Questcon will execute basic security/login testing on the front-end.
Team Responsible: MIT - Tester (backend); Questcon (front-end)

Type of Testing: UAT
Included (Y/N): Y
Comments: The user community will be tasked with performing ad-hoc user acceptance testing, domain-specific metadata testing (metadata titles, tag lists, etc.), as well as previously designated documented functional test cases for multiple browser/OS configurations (primarily Safari/Mac or Firefox/Mac configurations).
Team Responsible: MIT - User Community

Type of Testing: Unit
Included (Y/N): Y
Comments: Questcon expects the MIT developers to perform unit testing prior to releasing code to the test environment.
Team Responsible: MIT - Developers
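
As a rough illustration of the backend automation noted in the Automation row above (a sketch only, not a planned deliverable; the test plan and log file names are hypothetical), a JMeter test plan can be run in non-GUI mode from the command line, where -n selects non-GUI mode, -t names the test plan file, and -l names the results log:

  jmeter -n -t touchstone_backend.jmx -l touchstone_backend_results.jtl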

...

Key Deliverables

Deliverable: Test Plan
Description: This document.
Expected Delivery Date: After all needed data is delivered
Resource: Questcon

Deliverable: Test Case Designs
Description: List of objectives and priorities of the tests.
Expected Delivery Date: 4 business days after the test plan is approved and the necessary information and environmental needs are met. This is based on an estimated 250 test cases.
Resource: Questcon or MIT designated resource

Deliverable: Test Cases
Description: Steps and expected results.
Expected Delivery Date: 6 business days after test case design completion. This is based on an estimated 250 test cases.
Resource: Questcon or MIT designated resource

Deliverable: Status Reports
Description: Accomplishments, issues, and plans.
Expected Delivery Date: Weekly
Resource: Questcon or MIT designated resource

Deliverable: Test Logs
Description: Run logs: test execution data that indicates test run status, number of test runs, and test metric summaries.
Expected Delivery Date: Ongoing during test execution
Resource: Questcon or MIT designated resource

Deliverable: Defect Reports
Description: Entered in Jira as they are discovered.
Expected Delivery Date: Ongoing during test execution
Resource: Questcon or MIT designated resource

Deliverable: Test Summary Report
Description: Details the results of the testing effort.
Expected Delivery Date: 2 business days after test execution is completed.
Resource: Questcon or MIT designated resource

...

Milestone: Develop test strategy / plan
Target Timeframe: 01/15/2008 - 02/05/2008 (~ 15 business days)
Summation of Activities:
  • Analyze existing design documents, notes, and other available materials
  • Develop the test plan document

Milestone: Review test plan
Target Timeframe: 02/05/2008 - 02/11/2008 (~ 4 business days)
Summation of Activities:
  • Review, clarify, correct, and update the test plan
  • Client approval of the test plan

Milestone: Perform test analysis
Target Timeframe: //2008 - //2008 (~ 4 business days)
Summation of Activities:
  • Develop the test case design document

Milestone: Review test case design
Target Timeframe: //2008 - //2008 (~ 4 business days)
Summation of Activities:
  • Review, clarify, correct, and update the test case design

Milestone: Build functional test cases / scenarios
Target Timeframe: //2008 - //2008 (~ 6 business days)
Summation of Activities:
  • Combine test objectives into test cases
  • Document data, procedures, and results
  • Prioritize test cases
  • Determine which test cases will be executed in different browser/OS configurations

Milestone: Setup test environment
Target Timeframe: //2008 - //2008 (~ 5 business days)
Summation of Activities:
  • Set up the web server and database server
  • Load the application under test
  • Set up logins and authorizations

Milestone: Setup test data
Target Timeframe: //2008 - //2008 (~ 2 business days)
Summation of Activities:
  • Review and analyze test cases to target the data to load in the test environment
  • Load the initial test data set

Milestone: Execute functional & exploratory tests
Target Timeframe: //2008 - //2008 (~ 6 business days)
Summation of Activities:
  • Execute documented test cases, as well as exploratory tests
  • Communicate with the development team when issues are found
  • Maintain a test run log
  • Track test metrics

Milestone: Investigate / correct defects
Target Timeframe: //2008 - //2008 (~ 1 business day)
Summation of Activities:
  • Investigate and validate that a defect has been found
  • Log defects in Jira
  • Work with the development team, as necessary, to identify the cause of the defect
  • Accept and retest defect corrections from the development team

Milestone: Execute regression tests
Target Timeframe: //2008 - //2008 (~ 6 business days)
Summation of Activities:
  • Execute a prioritized subset of test cases as a regression of the system once all functional and exploratory testing is complete
  • Validate that no new errors have been introduced as a result of correcting known defects or configuration management / version control issues
  • Investigate and validate that a defect has been found
  • Log defects in Jira
  • Work with the development team, as necessary, to identify the cause of the defect
  • Accept and retest defect corrections from the development team

Milestone: Execute UAT
Target Timeframe: //2008 - //2008 (~ 10 business days)
Summation of Activities:
  • Work with the user community to identify and manage the execution of user acceptance tests
  • Communicate with the development team when issues are found
  • Investigate and validate that a defect has been found
  • Log defects in Jira
  • Work with the development team, as necessary, to identify the cause of the defect
  • Accept and retest defect corrections from the development team

Milestone: Create test summary
Target Timeframe: //2008 - //2008 (2 business days)
Summation of Activities:
  • Create and deliver a test summary report to include:
    • Summation of planned/actual test activities
    • Deviation from planned activities
    • Summary of defects (open defects)
    • Summary of test metrics

Total: 65 business days
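
For reference, the 65-day total matches the sum of the per-milestone estimates listed above: 15 + 4 + 4 + 4 + 6 + 5 + 2 + 6 + 1 + 6 + 10 + 2 = 65 business days.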

Note: This estimate could change depending on the actual number of test cases, the complexity of test cases, or items we are not aware of at this time.