
Touchstone Functional Testing - Phase 1 - Test Plan

1.0 Document Identifiers

1.1 Document Author

The document author is:

| Author | Title | Telephone | Email Address |
| --- | --- | --- | --- |
| Will Smithee | Senior Practice Manager | 336-232-5208 | will_smithee@questcon.com |

1.2 Document Revisions

| Issue | Date | Author | Reason for Change |
| --- | --- | --- | --- |
| 0.1 | 01/27/2008 | Will Smithee | Initial draft |

1.3 References

The following documents were used as sources of information for this test plan:

2.0 Introduction

2.1 Purpose

The objective of this test plan is to outline the functional testing effort to be undertaken for the Touchstone project.

2.1.1 Project Description

MIT Touchstone is a new suite of technologies, introduced by IS&T, for authenticating a variety of web applications. MIT Touchstone provides a single sign-on solution for applications that have been coded and configured to use the system. Within the context of Touchstone-enabled applications, users will be able to transition seamlessly between systems without being prompted for additional authentication information.
The intended audience of this document includes all IT personnel involved in the development, testing, and support of Touchstone.

2.1.2 Project Technologies

MIT Touchstone utilizes/integrates with the following technologies:

  • Stanford's WebAuth
  • Internet 2's Shibboleth
  • SAML (the Security Assertion Markup Language)
  • A new account management system for some users outside of the traditional MIT community
  • HTTP/S (extensive redirects; see the redirect-logging sketch after this list)
  • SSL
  • MIT X.509 certificates
  • Kerberos (via the HTTP/SPNEGO protocol)
  • TLS
  • OpenID
  • Web Services
  • MySQL (including replication)
  • Apache
  • Tomcat
  • IDP High Availability Package
  • LDAP
  • KDC
  • DNS load balancing
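Because Shibboleth-style single sign-on drives the browser through long chains of HTTP/S redirects (see the HTTP/S item above), making each hop visible is useful when diagnosing login flows during testing. Below is a minimal Python sketch, offered only as an illustration; the protected URL is hypothetical.

```python
import urllib.request

class LoggingRedirectHandler(urllib.request.HTTPRedirectHandler):
    """Print each redirect hop so the SSO handshake can be inspected."""

    def redirect_request(self, req, fp, code, msg, headers, newurl):
        print(f"{code} redirect -> {newurl}")
        return super().redirect_request(req, fp, code, msg, headers, newurl)

# Hypothetical Touchstone-protected resource; substitute a real QA URL.
opener = urllib.request.build_opener(LoggingRedirectHandler())
response = opener.open("https://webapp.example.edu/protected/")
print("Landed on:", response.geturl(), "status:", response.status)
```

Comparing the logged hop sequence against the expected IdP flow is a quick way to spot broken or looping redirects before deeper functional testing begins.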

2.2 Scope

Currently there are 17 total use cases identified for CAMS and none for the IdPi. This document assumes an estimated 50 total use cases for planning purposes.

2.2.1 Items To Be Tested

The following UI components and front-end functionality developed as part of the Touchstone project will be tested:

  • CAMS Use Cases (IdPe)
    • Self-initiated account creation
      • sending activation code via e-mail
      • password strength validation
    • Self service password reset
    • User may set or modify full name information
    • Users that specify an MIT email address will have their MIT ID number automatically associated with the CAMS account
    • User may register an OpenID account
    • User may associate MIT ID number with account
    • Accounts manager password reset
    • Accounts manager may associate MIT ID number with account
    • User may associate alternate security IDs with the account
    • User may associate other email addresses with their account
    • Administrators may set the status of alt_security_id types
    • Overview of Account Admin
    • Account Deactivation
  • IdPi Use Cases (see the combination sketch after this list)
    • Username/password
      • Cookies enabled/disabled
      • Javascript enabled/disabled
    • Kerberos
      • Cookies enabled/disabled
      • Javascript enabled/disabled
      • With and without valid tickets
    • X.509 User Certificates
      • Cookies enabled/disabled
      • Javascript enabled/disabled
      • With and without valid certificate installed
    • Preference for authentication method via cookie
    • Open issue - variety of browsers supported (question mostly about mobile support)
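The IdPi cases above multiply quickly, since each authentication method must be exercised against every browser-state combination. The following is a small sketch (the method and state names are illustrative, not taken verbatim from the use cases) that enumerates the matrix so no combination is silently dropped:

```python
from itertools import product

# Illustrative values based on the use-case list above.
auth_methods = ["username/password", "kerberos", "x509-certificate"]
cookies = ["cookies-on", "cookies-off"]
javascript = ["js-on", "js-off"]

# One test case per (method, cookie state, JavaScript state) combination.
matrix = list(product(auth_methods, cookies, javascript))
for case_id, combo in enumerate(matrix, start=1):
    print(f"TC-{case_id:03d}: " + ", ".join(combo))
print(f"{len(matrix)} combinations in total")
```

The Kerberos ticket and X.509 certificate variations (with/without valid tickets or certificates) can be folded in as additional axes in the same product() call.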


2.2.2 Items Not To Be Tested

The following modules and types of tests are considered outside the scope of the immediate test effort. As functionality is added, test cases for it will need to be developed. Immediate test planning should focus on CAMS versions 0.5 and 0.6, continuing to later versions as time progresses.

  • CAMS Use Cases (IdPe)
    • Sponsored account creation
    • Sponsor may review a list of accounts that they have sponsored
  • IdPi Use Cases
    • TBD


2.3 Risks & Contingencies

The following risks have been identified, which may impact the testing effort.

| Risk | Contingency |
| --- | --- |
| QA environment not available | Utilize the development or production environment. |
| Insufficient time to fully test the Touchstone application with all major web browsers | Execute ~80% of application functionality with MIT's standard browser (Mozilla Firefox 2.0) and ~20% with other browsers. Prioritize the functionality to be tested by risk; ensure high-risk items are tested thoroughly, and test all high-priority items in both browsers. As time permits, test additional functionality by priority and risk, with the most attention on the Mozilla browser. |
| Test time increases due to changes in scope requiring additional test analysis and/or test case creation | If test time cannot be increased, reduce or cut overlap in multi-browser testing, reevaluate priorities and risk, and execute the highest-priority test cases first, followed by lower-priority tests until test time runs out. |
| Excessive defect rate or defect correction time | Extend the testing window if possible. If not, execute test cases in unrelated/unblocked functional areas of the application based on designated priority, identify problem areas, and ensure additional attention is focused on them during regression testing. |
| Use cases not complete or available | Parts of the application will not be tested, or Questcon will require additional time to assist MIT in authoring the necessary use cases. |
| Server or client hardware or software specifications not available | Assumptions will be made which may be incorrect; this may lead to inaccurate testing and possibly missed defects in the production system. |

3.0 Approach

3.1 Testing Strategy

The overall approach to this test effort will be to validate that the Touchstone UI meets MIT's need for a single sign-on mechanism using a federated ID. Validation will be performed using test cases derived from the documented use cases and front-end functional designs, as well as exploratory testing heuristics.

MIT has indicated that the user community is largely standardized on Mozilla's Firefox v2.0 web browser in a PC or Mac environment. Rather than re-executing all tests with each browser, Questcon will execute approximately 80% of the test cases using Firefox and approximately 20% using IE 7 (in a PC/Windows architecture). There will be some overlap in testing and touch points, but not enough to significantly impact the schedule. Questcon will analyze the test cases to identify the best candidates for execution using IE 7, choosing them based on the amount of functionality traversed in the application. In other words, Questcon will attempt to "touch" as much of the application as possible using IE 7.
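One way to choose that IE 7 subset objectively is a greedy coverage pass: repeatedly pick the test case that touches the most functionality not yet covered, until the roughly 20% budget is spent. The sketch below is a Python illustration with hypothetical test-case and functional-area names, not a prescribed procedure:

```python
def pick_ie7_subset(test_cases, budget):
    """Greedy selection: maximize newly covered functional areas per pick.

    test_cases maps a test-case id to the set of functional areas it touches.
    """
    covered, chosen = set(), []
    remaining = dict(test_cases)
    while remaining and len(chosen) < budget:
        # Pick the case that adds the most uncovered areas.
        best = max(remaining, key=lambda tc: len(remaining[tc] - covered))
        chosen.append(best)
        covered |= remaining.pop(best)
    return chosen, covered

# Hypothetical cases: each touches some Touchstone functional areas.
cases = {
    "TC-001": {"login", "cookies"},
    "TC-002": {"login", "password-reset"},
    "TC-003": {"openid", "account-create"},
    "TC-004": {"x509", "login"},
}
subset, areas = pick_ie7_subset(cases, budget=2)
print("IE 7 subset:", subset, "covers:", sorted(areas))
```

The same selection can be rerun against the real test-case inventory once the functionality traversed by each case has been catalogued.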

Furthermore, a significant portion of the user community utilizes the Mac OS X operating system with the Safari browser. Some duplicate testing will be performed by Questcon utilizing Safari and Firefox for the Mac (10% or less). MIT should designate a group of users to execute more tests using the Safari/Mac and Firefox/Mac combination of browser and operating system. Questcon will assist the users in identifying the best tests to execute.

The following table outlines the various types of testing considered for this test effort, any additional comments about the testing, and the individual or group responsible for completing the testing.

| Type of Testing | Included (Y/N) | Comments | Team Responsible |
| --- | --- | --- | --- |
| Automation | N | MIT personnel will utilize jMeter to automate portions of the backend testing; no test automation tools will be used for the UI testing (the use of Flash may prevent this; to be confirmed). See the command-line sketch after this table. | N/A |
| Conversion | N | There is no pre-existing system; therefore no data conversion is necessary. | N/A |
| Exploratory | Y | Some level of exploratory testing will be conducted based on heuristics related to typical rich-content internet applications. | Questcon |
| Functional | Y | Functional testing will be performed based on test cases derived from the documented use cases and front-end functional design. | Questcon |
| Installation / Upgrade | N | Because this is a web application, no installation testing is necessary. | N/A |
| Integration | N | Some integration testing will naturally occur as the front end of the Touchstone application interfaces with and utilizes the back-end APIs. | N/A |
| Parallel | N | There is no existing system that Touchstone is replacing. | N/A |
| Performance | N | Performance testing is covered by a separate test plan. | N/A |
| Regression | Y | Questcon expects to run at least a minimum regression test set prior to release to production. | Questcon |
| Security | Y | Backend security testing will be done by MIT. Questcon will execute basic security/login testing on the front end. | MIT - Tester (backend); Questcon (front end) |
| UAT | Y | The user community will be tasked with performing ad-hoc user acceptance testing, domain-specific metadata testing (metadata titles, tag lists, etc.), as well as previously designated documented functional test cases for multiple browser/OS configurations (primarily Safari/Mac or Firefox/Mac configurations). | MIT - User Community |
| Unit | Y | Questcon expects the MIT developers to perform unit testing prior to releasing code to the test environment. | MIT - Developers |
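As noted in the Automation row above, backend automation is expected to run through jMeter rather than a UI tool. jMeter is typically driven headless from the command line (jmeter -n -t <plan> -l <results>); the Python wrapper below is a minimal sketch, and the plan and result file names are hypothetical:

```python
import subprocess

def run_jmeter(plan="backend_plan.jmx", results="results.jtl"):
    """Run a JMeter test plan headless (-n) and capture results for review."""
    # Assumes the jmeter launcher is on the PATH; file names are placeholders.
    cmd = ["jmeter", "-n", "-t", plan, "-l", results]
    completed = subprocess.run(cmd, capture_output=True, text=True)
    if completed.returncode != 0:
        raise RuntimeError(f"jMeter run failed:\n{completed.stderr}")
    return results

# results = run_jmeter()  # uncomment once a .jmx test plan exists
```

Wrapping the invocation this way makes it easy to schedule backend runs alongside the functional test cycle and to archive the .jtl output with the test logs.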

3.2 Tools

The following tools will be used as part of the overall Touchstone testing effort:

| Tool | Purpose | Used By |
| --- | --- | --- |
| Atlassian Jira | Web-based defect tracking system, accessed at http://mv.ezproxy.com.ezproxy.canberra.edu.au/jira | Touchstone Project Team (MIT & Questcon) |

3.3 Environmental Needs

Questcon anticipates the following server and client configurations for the QA environment:

All To Be Determined (TBD) values will be updated once MIT provides the necessary information.

3.3.1 IdPe Server Configuration

Hardware: Virtual Machine
O/S: RHEL3
Other software:

  • Apache 2.0
  • Tomcat 5.5.25
  • OpenSSL 0.9.7+
  • mod_jk (version TBD)
  • JDK 1.5.x
  • reCAPTCHA
  • MySQL 5.x
  • Shibboleth IdP 1.3.3
  • Spring/SASH stack 2.0.7

This information is an assumption and will be updated when MIT specifies what their production server configuration will be.

3.3.2 IdPi Server Configuration

Hardware: Dell
O/S: RHEL3
Other software:

  • Apache 2.0
  • Tomcat 5.5.25
  • OpenSSL 0.9.7+
  • mod_jk (version TBD)
  • JDK 1.5.x
  • Shibboleth IdP 1.3.2
  • Stanford WebAuth 3.5.4 (with local mods)
  • Perl 5.8.x
  • mod_auth_kerb 5.3 (version to be confirmed; with local mod)

This information is an assumption and will be updated when MIT specifies what their production server configuration will be.

3.3.3 Client Configuration

PC (Windows XP Professional SP 2):

  • Mozilla Firefox v2+
  • Microsoft IE v7

Macintosh PowerPC (Mac OS X):

  • Firefox v2+
  • IE v7
  • Safari v2+

Mobile Platforms: hardware, O/S, and browsers TBD

This information is an assumption and will be updated when MIT specifies what their desired client configuration will be for the purposes of this testing phase.

4.0 Schedule of Deliverables and Resources

All of the time estimates specified here are based on an assumed 50 use cases, with an average of ~5 resulting test cases each. Should the number of use cases or resulting test cases change, or should the complexity of the test cases be greater than anticipated, actual times may be lengthened or shortened.

4.1 Deliverables

This section identifies the deliverables, delivery date and resource responsible for each deliverable.

| Key Deliverables | Description | Expected Delivery Date | Resource |
| --- | --- | --- | --- |
| Test Plan | This document. | After all needed data is delivered (first half of March 2008) | Questcon |
| Test Case Designs | List of objectives and priorities of the tests. | 4 business days after the test plan is approved and the necessary information and environmental needs are met. Based on an estimated 250 test cases. | Questcon or MIT-designated resource |
| Test Cases | Steps and expected results. | 6 business days after test case design completion. Based on an estimated 250 test cases. | Questcon or MIT-designated resource |
| Status Reports | Accomplishments, issues, and plans. | Weekly | Questcon or MIT-designated resource |
| Test Logs | Test execution data that indicates test run status, number of test runs, and test metrics summaries (see the metrics sketch after this table). | Ongoing during test execution | Questcon or MIT-designated resource |
| Defect Reports | Entered in Jira as they are discovered. | Ongoing during test execution | Questcon or MIT-designated resource |
| Test Summary Report | Details the results of the testing effort. | 2 business days after test execution is completed. | Questcon or MIT-designated resource |
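The Test Logs deliverable calls for run status, run counts, and metric summaries. The following is a minimal sketch of how such a summary could be derived from raw log entries; the entries and status names are hypothetical:

```python
from collections import Counter

# Hypothetical test-run log entries: (test case id, run number, status).
test_log = [
    ("TC-001", 1, "fail"),
    ("TC-001", 2, "pass"),
    ("TC-002", 1, "pass"),
    ("TC-003", 1, "blocked"),
]

runs = len(test_log)
by_status = Counter(status for _, _, status in test_log)
# The latest result per test case drives the pass/fail summary.
latest = {tc: status for tc, _, status in sorted(test_log, key=lambda r: r[1])}

print(f"{runs} runs total; by status: {dict(by_status)}")
print("Current results:", latest)
```

The same aggregation feeds both the weekly status reports and the final test summary report.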

4.2 Test Schedule

The planned test schedule for the Touchstone project does not yet have anticipated start or completion dates. All dates are subject to several assumptions, some of which are identified in Section 2.3, Risks & Contingencies.

The information below is out of date; the schedule will be revised and updated with the new Questcon contractor during the first week of March 2008.

Milestone

Target Timeframe

Summation of Activities

Develop test strategy / plan

01/15/2008 - 02/05/2008
~ 15 business days

  • Analyze existing design documents, notes, and other available materials
  • Develop test plan document

Review test plan

02/05/2008 - 02/11/2008
~ 4 business days (to be confirmed)

  • Review, clarify, correct, and update the test plan
  • Client approval of test plan

Perform test analysis

//2008 - //2008
~ 4 business days

  • Develop test case design document

Review test case design

//2008 - //2008
~ 4 business days

  • Review, clarify, correct, and update the test case design

Build functional test cases / scenarios

//2008 - //2008
~ 6 business days

  • Combine test objectives into test cases
  • Document data, procedures, and results
  • Prioritize test cases
  • Determine which test cases will be executed in different browser/OS configurations

Setup test environment

//2008 - //2008
~ 5 business days

  • Setup web server and database server
  • Load application under test
  • Setup logins and authorizations (see the smoke-check sketch after this list)
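Before test execution begins, it is worth confirming that the servers described in Section 3.3 actually accept connections. Below is a minimal Python smoke check; the host names are hypothetical and the ports are standard defaults, all to be adjusted once MIT confirms the environment:

```python
import socket

# Hypothetical QA hosts based on the Section 3.3 configurations;
# update once MIT confirms the environment.
SERVICES = {
    "apache-http": ("qa-idp.example.edu", 80),
    "apache-https": ("qa-idp.example.edu", 443),
    "tomcat-ajp": ("qa-idp.example.edu", 8009),
    "mysql": ("qa-db.example.edu", 3306),
}

def smoke_check(services, timeout=3):
    """Verify each service port accepts TCP connections before testing starts."""
    for name, (host, port) in services.items():
        try:
            with socket.create_connection((host, port), timeout=timeout):
                print(f"OK   {name} ({host}:{port})")
        except OSError as err:
            print(f"FAIL {name} ({host}:{port}): {err}")

smoke_check(SERVICES)
```

Running this at the start of each test day catches environment outages before they show up as spurious test failures.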

Setup test data

//2008 - //2008
~ 2 business days

  • Review & analyze test cases to target data to load in test environment
  • Load initial test data set

Execute functional & exploratory tests

//2008 - //2008
~ 6 business days

  • Execute documented test cases, as well as exploratory tests
  • Communicate with the development team when issues are found
  • Maintain a test run log
  • Track test metrics

Investigate / correct defects

//2008 - //2008
~ 1 business day

  • Investigate and validate that a defect has been found
  • Log defects in Jira
  • Work with the development team, as necessary, to identify the cause of the defect
  • Accept and retest defect corrections from the development team

Execute regression tests

//2008 - //2008
~ 6 business days

  • Execute a prioritized subset of test cases as regression of the system once all functional and exploratory testing is complete
  • Validate that no new errors have been introduced as a result of correcting known defects or configuration management / version control issues
  • Investigate and validate that a defect has been found
  • Log defects in Jira
  • Work with the development team, as necessary, to identify the cause of the defect
  • Accept and retest defect corrections from the development team

Execute UAT

//2008 - //2008
~ 10 business days

  • Work with the user community to identify and manage the execution of user acceptance tests
  • Communicate with the development team when issues are found
  • Investigate and validate that a defect has been found
  • Log defects in Jira
  • Work with the development team, as necessary, to identify the cause of the defect
  • Accept and retest defect corrections from the development team

Create test summary

//2008 - //2008
2 business days

  • Create and deliver a test summary report to include:
    • Summation of planned/actual test activities
    • Deviation from planned activities
    • Summary of defects (open defects)
    • Summary of test metrics

Total:

65 business days

** This estimate could change depending on the actual number of test cases, the complexity of the test cases, or items we are not aware of at this time.