...

  • Stanford's WebAuth
  • Internet2's Shibboleth
  • SAML (the Security Assertion Markup Language)
  • A new account management system for some users outside of the traditional MIT community
  • HTTP/S (extensive redirects)
  • SSL
  • MIT X.509 certificates
  • Kerberos (via the HTTP/SPNEGO protocol)
  • TLS
  • OpenID
  • Web Services
  • MySQL (including replication)
  • Apache
  • Tomcat
  • IDP High Availability Package
  • LDAP
  • KDC
  • DNS load balancing

2.2 Scope

Currently there are 17 total use cases identified for CAMS and none for the IdPi.  This document assumes an estimated 50 total use cases for planning purposes.

2.2.1 Items To Be Tested

The following UI components and front-end functionality, developed as part of the Touchstone project, will be tested:

  • CAMS Use Cases (IdPe)
    • Self-initiated account creation
    • Users that specify an MIT email address will have their MIT ID number automatically associated with the CAMS account
    • Self service password reset
    • User may register an OpenID account
    • User may set or modify full name information
    • User may associate MIT ID number with account
    • Accounts manager password reset
    • Accounts manager may associate MIT ID number with account
    • User may associate alternate security IDs with the account
    • User may associate other email addresses with their account
    • Administrators may set the status of alt_security_id types
    • Overview of Account Admin
    • Account Deactivation
  • IdPi Use Cases
    • TBD

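Use cases like those listed above can be mapped one-for-one onto executable test skeletons once development settles. The sketch below models a minimal, hypothetical CAMS account (the real CAMS data model and APIs are not specified in this plan) and exercises the self-service password reset and automatic MIT ID association cases:

```python
import unittest

class CamsAccount:
    """Hypothetical, simplified CAMS account model for illustration only;
    the real CAMS data model and APIs may differ."""

    def __init__(self, email, password):
        self.email = email
        self.password = password
        self.mit_id = None
        # Use case: an MIT email address triggers automatic MIT ID association.
        if email.endswith("@mit.edu"):
            self.mit_id = self._lookup_mit_id(email)

    def _lookup_mit_id(self, email):
        # Stand-in for a directory lookup; returns a fabricated ID.
        return "9" + str(abs(hash(email)) % 10**8).zfill(8)

    def reset_password(self, new_password):
        # Use case: self-service password reset.
        self.password = new_password

class TestCamsUseCases(unittest.TestCase):
    # Run with: python -m unittest <module>
    def test_mit_email_associates_mit_id(self):
        self.assertIsNotNone(CamsAccount("user@mit.edu", "x").mit_id)

    def test_non_mit_email_has_no_mit_id(self):
        self.assertIsNone(CamsAccount("user@example.com", "x").mit_id)

    def test_self_service_password_reset(self):
        acct = CamsAccount("user@example.com", "old")
        acct.reset_password("new")
        self.assertEqual(acct.password, "new")
```

Each CAMS bullet above would become one or more such test methods; the IdPi cases can follow the same pattern once they are defined.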

2.2.2 Items Not To Be Tested

...

  • CAMS Use Cases (IdPe)
    • Sponsored account creation
    • Sponsor may review a list of accounts that they have sponsored
  • IdPi Use Cases
    • TBD


2.3 Risks & Contingencies

...

Risk: QA environment not available
Contingency: Utilize the development or production environment.

Risk: Insufficient time to fully test the Touchstone application with all major web browsers
Contingency: Execute ~80% of application functionality with MIT's standard browser (Mozilla Firefox 2.0) and ~20% of the functionality with other browsers. Prioritize the functionality to be tested by risk: identify high-risk functionality and ensure that high-risk items are tested thoroughly. All high-priority items will be tested in both browsers. As time permits, additional functionality will be tested based on priority and risk, with the most attention focused on the Mozilla browser.

Risk: Test time increases due to changes in scope requiring additional test analysis and/or test case creation
Contingency: If test time cannot be increased, reevaluate priorities and risk, reduce or cut overlap in multi-browser testing, and execute the highest-priority test cases first, followed by lower-priority tests until test time runs out.

Risk: Excessive defect rate or defect correction time
Contingency: Extend the testing window if possible. If that is not possible, execute test cases in unrelated/unblocked functional areas of the application based on designated priority. Identify problem areas and ensure that additional attention is focused on those areas during regression testing.

Risk: Use cases not complete or available
Contingency: Parts of the application will not be tested, or Questcon will require additional time to assist MIT in authoring the necessary use cases.

Risk: Server or client hardware or software specifications not available
Contingency: Assumptions will be made which may be incorrect. This may lead to inaccurate testing and possibly missed defects in the production system.
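The browser-coverage contingency above amounts to a simple allocation rule: every high-risk case runs in both browsers, and the remaining cases are split roughly 80/20 between Firefox and the other browsers in descending risk order. A sketch of that rule (the case names, risk labels, and the generic "other" browser label are illustrative assumptions, not project data):

```python
def allocate_browsers(cases, primary="Firefox 2.0", secondary="other", ratio=0.8):
    """Assign each (name, risk) test case to one or both browsers.

    High-risk cases run in both browsers; the rest are split
    ~80/20 between the primary and secondary browser, with
    medium-risk cases taking the primary-browser slots first.
    """
    high = [c for c in cases if c[1] == "high"]
    rest = sorted((c for c in cases if c[1] != "high"),
                  key=lambda c: {"medium": 0, "low": 1}[c[1]])
    plan = {name: [primary, secondary] for name, _ in high}
    cutoff = int(len(rest) * ratio)  # share executed in the primary browser
    for i, (name, _) in enumerate(rest):
        plan[name] = [primary] if i < cutoff else [secondary]
    return plan

plan = allocate_browsers([
    ("login", "high"), ("password reset", "high"),
    ("edit full name", "medium"), ("view account", "low"),
    ("logout", "low"),
])
# High-risk cases appear under both browsers; the rest favor Firefox 4:1.
```

When test time shrinks, the same rule degrades gracefully: dropping from the bottom of the plan cuts low-priority, secondary-browser coverage first.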

3.0 Approach

3.1 Testing Strategy

...

Milestone

Target Timeframe

Summation of Activities

Develop test strategy / plan

01/15/2008 - 02/05/2008
~ 15 business days

  • Analyze existing design documents, notes, and other available materials
  • Develop test plan document

Review test plan

02/05/2008 - 02/11/2008
~ 4 business days 

  • Review, clarify, correct, and update the test plan
  • Client approval of test plan

Perform test analysis

//2008 - //2008
~ 4 business days

  • Develop test case design document

Review test case design

//2008 - //2008
~ 4 business days

  • Review, clarify, correct, and update the test case design

Build functional test cases / scenarios

//2008 - //2008
 ~ 6 business days

  • Combine test objectives into test cases
  • Document data, procedures, and results
  • Prioritize test cases
  • Determine which test cases will be executed in different browser/OS configurations

Setup test environment

//2008 - //2008
~ 5 business days

  • Setup web server and database server
  • Load application under test
  • Setup logins and authorizations

Setup test data

//2008 - //2008
~ 2 business days

  • Review & analyze test cases to target data to load in test environment
  • Load initial test data set

Execute functional & exploratory tests

//2008 - //2008
~ 6 business days 

  • Execute documented test cases, as well as exploratory tests
  • Communicate with the development team when issues are found
  • Maintain a test run log
  • Track test metrics

Investigate / correct defects

//2008 - //2008
~ 1 business day

  • Investigate and validate that a defect has been found
  • Log defects in Jira
  • Work with the development team, as necessary, to identify the cause of the defect
  • Accept and retest defect corrections from the development team

Execute regression tests

//2008 - //2008
~ 6 business days

  • Execute a prioritized subset of test cases as regression of the system once all functional and exploratory testing is complete
  • Validate that no new errors have been introduced as a result of correcting known defects or configuration management / version control issues
  • Investigate and validate that a defect has been found
  • Log defects in Jira
  • Work with the development team, as necessary, to identify the cause of the defect
  • Accept and retest defect corrections from the development team

Execute UAT

//2008 - //2008
~ 10 business days

  • Work with the user community to identify and manage the execution of user acceptance tests
  • Communicate with the development team when issues are found
  • Investigate and validate that a defect has been found
  • Log defects in Jira
  • Work with the development team, as necessary, to identify the cause of the defect
  • Accept and retest defect corrections from the development team

Create test summary

//2008 - //2008
~ 2 business days

  • Create and deliver a test summary report to include:
    • Summation of planned/actual test activities
    • Deviation from planned activities
    • Summary of defects (open defects)
    • Summary of test metrics

Total:

65 business days

 ** This estimate could change depending on the actual number of test cases, complexity of test cases, or items we are not aware of at this time.
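The run log, test metrics, and defect summary called for in the execution and summary milestones can all be derived from simple structured records. A sketch under assumed record formats (the field names, issue keys, and sample data are illustrative, not the project's actual Jira schema):

```python
from collections import Counter

# Hypothetical run-log records; the actual fields kept on the project may differ.
run_log = [
    {"case": "CAMS-01", "outcome": "pass",    "tester": "qa1"},
    {"case": "CAMS-02", "outcome": "fail",    "tester": "qa1"},
    {"case": "CAMS-03", "outcome": "blocked", "tester": "qa2"},
    {"case": "CAMS-04", "outcome": "pass",    "tester": "qa2"},
]

def compute_metrics(log):
    """Count outcomes and compute the pass rate over executed (non-blocked) cases."""
    counts = Counter(entry["outcome"] for entry in log)
    executed = counts["pass"] + counts["fail"]
    pass_rate = counts["pass"] / executed if executed else 0.0
    return dict(counts), pass_rate

def summarize_defects(defects):
    """Group defects by status and severity, and list the keys still open."""
    by_status = Counter(d["status"] for d in defects)
    by_severity = Counter(d["severity"] for d in defects)
    open_keys = [d["key"] for d in defects if d["status"] != "Closed"]
    return by_status, by_severity, open_keys

counts, pass_rate = compute_metrics(run_log)

# Hypothetical defect-tracker export; real entries would come from Jira.
defects = [
    {"key": "TS-101", "status": "Closed", "severity": "Major"},
    {"key": "TS-102", "status": "Open",   "severity": "Minor"},
    {"key": "TS-103", "status": "Open",   "severity": "Major"},
]
by_status, by_severity, open_keys = summarize_defects(defects)
```

Keeping the log and tracker export in a structured form like this lets the final test summary report (planned vs. actual activity, open defects, metrics) be regenerated at any point during execution rather than assembled by hand at the end.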