Effective usability and accessibility testing

Written By: Peter Abrahams
Published:
Content Copyright © 2008 Bloor. All Rights Reserved.

Creating a usable and accessible Information and Communication Technology (ICT) solution should start at the design stage and continue through every phase of the life cycle, including the monitoring of the production systems.

Designing the user interface with input from usability and accessibility experts helps ensure a good design, by avoiding a large number of simple pitfalls and incorporating existing best practice.

Well-trained developers, good development tools and a variety of automated testing tools should ensure that sloppy coding does not compromise the original design; poor coding is likely to introduce unexpected usability and accessibility issues.

However, even with the best designers and the best development process, no one can be certain that the end-user will be happy, or, even better, delighted, with the experience. The only way to find out is to test the system with a cross-section of the user population.

Although this is obvious, the problem is how to do it adequately. A full user test requires documentation of:

  • The test scenarios
  • How the user tackled the tasks
  • What was found to be easy
  • What was difficult or impossible
  • What errors occurred
  • How long each task took, in elapsed time, number of keys pressed and number of mouse clicks
  • User comments
  • User ratings of each task and the complete set

Collecting all this information and then presenting it to the procuring organisation for approval or to the development team for amendment can be expensive and time consuming.
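
Purely as an illustration of the kind of record such a test produces (this is not any particular tool's format, and all names here are hypothetical), the items listed above map naturally onto a simple structured record per task and per session:

    # Illustrative sketch only: one possible structure for recording a
    # single task attempt in a user test. All names are hypothetical.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class TaskResult:
        task_name: str            # which test scenario was attempted
        completed: bool           # was the task finished successfully?
        elapsed_seconds: float    # elapsed time from task start to task end
        key_presses: int          # number of keys pressed during the task
        mouse_clicks: int         # number of mouse clicks during the task
        errors: List[str] = field(default_factory=list)    # errors that occurred
        comments: List[str] = field(default_factory=list)  # user comments
        rating: int = 0           # user rating of the task, e.g. 1 to 5

    @dataclass
    class TestSession:
        user_id: str
        tasks: List[TaskResult] = field(default_factory=list)
        overall_rating: int = 0   # user rating of the complete set of tasks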

Usability laboratories have been around for many years to try and meet this need. They typically consist of two rooms, one for the end-users testing the system and one for the observers, separated by a one-way mirror. The laboratory includes video and audio equipment to record the end-user interactions so they can be analysed further off-line. The observers make notes and comments which subsequently have to be related to the video, audio and test scripts; this is time consuming and often difficult. These laboratories:

  • Are expensive to set up and run
  • Require skilled technicians to replay and edit the videos
  • Require the end-users to come to the laboratory rather than testing in their normal environment
  • Do not allow detailed analysis of the user's interactions with the system

There is no doubt that a great deal has been learnt through the use of such laboratories, and user interaction design has improved as a result. However, the cost of running such tests can significantly increase the overall cost of the systems development and may often not be considered cost-justifiable.

The answer is to use computer technology to reduce the costs and to improve the quality and quantity of the data collected. With all the data in digital format, the analysis and reporting can be improved and expedited. The use of PCs also makes the laboratory portable, so the testing can be done in the most suitable environment.

TechSmith developed SnagIt, a screen capture product, and then Camtasia Suite, a screen video recording tool. Some users of Camtasia Suite started to use it to document user testing. Based on this experience, and on the collection of further requirements, TechSmith developed Morae, a tool specifically designed for usability testing.

The heart of Morae is the recorder function, which collects:

  • A continuous stream of the user’s screen
  • All the user interaction via the keyboard and mouse
  • A video stream of the user showing their reactions and verbal comments
  • Start and end of each task
  • Survey results at the end of each task and the session

All of this information is streamed in real time to one or more observers over standard WANs or LANs. The observers can add comments and mark key events. Morae automatically synchronises the comments and markers with the recorder streams, so the observer can concentrate on describing the events without having to collect timestamps.
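
The key to this kind of synchronisation is that a marker only needs to record its offset from the start of the recording. As a minimal sketch of the principle (not Morae's own API or data format), an observer's log might look like this:

    # Illustrative sketch only (not Morae's API): synchronising observer
    # markers with a recording by storing the offset from the session start.
    import time

    class ObserverLog:
        def __init__(self):
            self.session_start = time.monotonic()  # set when recording begins
            self.markers = []                      # (offset_seconds, label, comment)

        def add_marker(self, label, comment=""):
            # The offset ties the comment to the exact point in the screen,
            # video and input streams without the observer noting the time.
            offset = time.monotonic() - self.session_start
            self.markers.append((offset, label, comment))

    # Example: an observer flags an error during the session.
    log = ObserverLog()
    log.add_marker("error", "User could not find the search box")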

After the session is complete all the recordings and observer comments are available for further analysis. The analysis can be for one user session or across multiple users carrying out the same set of tasks. The analysis can include:

  • Going to a particular marker and viewing the user interaction to better understand the event, and then adding further comments to the log
  • Looking at all events of a specific type across multiple sessions to find root causes
  • Graphical representation of the data collected, for example comparing the speed of tasks across users
  • Specific analysis such as the use of the mouse; a user may prefer not to use the mouse and will only use it if there is no other obvious way to complete the task

The analysis will:

  • Bring out areas of concern that need to be reported back to the developers
  • Show a generally delighted user community that should reassure the organisation procuring the system

To produce these reports, the graphs, user comments and example interactions can be extracted and built into a document or presentation.
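
As a toy illustration of the kind of cross-user comparison described above (again, not Morae's own format, and using invented data values), task timings and mouse usage could be summarised like this:

    # Illustrative sketch only: comparing task completion times and mouse
    # usage across users. The data values below are invented for the example.
    from statistics import mean

    # session -> task -> (elapsed_seconds, mouse_clicks)
    sessions = {
        "user_a": {"find product": (45.0, 12), "checkout": (130.0, 30)},
        "user_b": {"find product": (50.0, 14), "checkout": (210.0, 55)},
        "user_c": {"find product": (40.0, 10), "checkout": (190.0, 48)},
    }

    for task in ["find product", "checkout"]:
        times = [s[task][0] for s in sessions.values()]
        clicks = [s[task][1] for s in sessions.values()]
        print(f"{task}: mean time {mean(times):.0f}s, "
              f"mean clicks {mean(clicks):.0f}, slowest {max(times):.0f}s")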

Morae provides a complete package for user testing including set-up, recording, analysis and reporting. It is easier to use and more flexible than the traditional usability laboratory. It should make user testing affordable for a much wider range of developments.