last update: June 8, 2006

0H420: Usability Engineering and Testing


Prof. Dr. M. Rauterberg, Full Professor,
Faculty of Industrial Design, User Centred Engineering, Eindhoven University of Technology


This course awards 2 SPs (80 hrs) for successful participation, plus 1 SP (40 hrs) for extra work.

After completing this course, students should be able to:

  • Select the appropriate approach for usability testing in a specific setting
  • Set up a usability test for a specific purpose, taking into account the constraints of the design project they are involved in (e.g. time constraints and limited access to representative users)
  • Conduct a usability test and translate the results to input that is appropriate for the design project
  • Assess evaluation methods on validity, thoroughness, efficiency and reliability of their outcomes


The course covers the following topics:

  • Overview of analytical and empirical evaluation methods
  • Discussion of costs and benefits associated with evaluation methods
  • Discussion of appropriateness of testing approaches for specific design contexts
  • Planning an analytical evaluation
  • Conducting an analytical evaluation
  • Planning a usability test
  • Conducting a usability test
  • Analysing the results and translating them to input for the design project
  • Choosing and combining evaluation methods


The lecture will start with an introduction to usability engineering, user-centred design, and conducting usability tests in practice, discussing issues such as selecting complementary methods and tailoring them to the constraints of design practice. Then an overview will be given of the process of planning a usability test, covering topics such as the aims of usability, phrasing research questions, selecting subjects and tasks, and running the actual experiment. The students will apply this knowledge to cases of design projects for which usability tests have to be planned.

The main part of the lecture, with assignments, will consist of planning and conducting a combination of evaluation methods for a given system or product. Students have to hand in assignments during this process to keep track of progress and of the decisions that have been made. The set of assignments is intended to teach students to choose and justify every decision taken in implementing the evaluation methods and to discuss the possible limitations of the findings.

Students will learn to select methods that fulfil the aims of the usability test. They will gain hands-on experience in planning and conducting analytical and empirical evaluations, and will experience how to manage the trade-offs inherent in conducting usability tests under real-life constraints. Literature will be handed out during the lecture.

Course schedule





31 March cancelled  
7 April cancelled  

14 April

1. Introduction, user-centred design, usability problems, cost/benefit analysis of evaluation methods


        Cost Justifying Usability / Return on Investment

        Gould & Lewis, 1985

        Hartson et al., 2001

        Saul Greenberg

       ISO standards [overview]

       BECTA design, 2002

21 April

No lecture - Easter holiday


28 April

2. Usability evaluation methods (UEM): heuristic evaluation, guidelines for motivating interfaces, cognitive walkthroughs, the evaluator effect


examples at

Hall of fame/shame

Little things matter

        Nielsen and Mack, 1994, chapter 5 [CHI paper]

        Heuristic Evaluation: Nielsen's website

        Mayhew, 1999, chapters 2 and 4

        Malone and Lepper, 1987

       Metrics-1: Rauterberg, 1995

        Metrics-2: Rauterberg, 1996

        Game Design: NASA 1995 Flight Deck Design Case

        UEM: Sullivan, 1991

        UEM: Karat et al., 1992

        UEM: John & Marks, 1996

        UEM: Doubleday et al., 1997

        UEM: Gray & Salzman, 1998

        UEM: Jacobsen, 1999 [see also appendix]

        Evaluator Effect: Hertzum & Jacobsen, 2001

5 May

No lecture - Liberation day


12 May

No lecture - Examinations

19 May

Student team presentations of assignments 1 and 2

3. Empirical evaluation methods

4. Usability Test in Lab

5. Planning a usability test and thinking aloud techniques


        Dumas and Redish, 1999, chapters 7, 11, 12, 13, 14, 18, 19, 20, 21, 22

        Boren and Ramey, 2000

        Hartson et al., 2001

        NIST (2001): Common Industry Format [all documents, zipped]

       Usability Testing (2003)

       UCD at IBM (2001)

26 May

Student team presentations of assignment 3


       Molich et al. (1998)

        Jamesson (2000)

       Bradbury (2001)


2 June

6. Comparing, choosing and combining methods




        Lewis, 1994

        Jacobsen et al., 1998

        Observer Quick Guide

        Rauterberg, 1992
An iterative-cycle software process model.

        Greger Viken Teigre, 1998
Useful and Useless Support for Knowledge Teamwork:
A Tribute to Ungrateful Users


Reservations for the Usability Lab at IPO can be made via
Martin Boschmann

Video cameras are arranged via
Mr Nico v/d Ven
tel. 040 247 5229

9 June

No lecture - Whitsun holiday


16 June

Student team presentations of assignments 4 and 5

8. Closing discussion



Course work

The grade for this course will be determined by the work done on a set of assignments. The assignments will cover a number of steps relevant for planning and conducting a combination of analytical and empirical evaluation methods. Furthermore, they will include discussions about the trade-offs of the decisions taken and the validity of the findings of the evaluation. Grades will be determined by the rigour with which the work is done, whether relevant concepts discussed in the lectures are embedded in the work and the report, and the extra initiative taken to ensure high-quality work.


Assignments and due dates

1.      Make a plan to conduct a heuristic evaluation, plus argumentation for your choices (due 19 May)

2.      Conduct the heuristic evaluation, write the list of problems, and write the report and discussion (due 19 May)

3.      Make a set-up and all the preparations for conducting the usability test, plus argumentation for your choices (due 26 May)

4.      Run the lab usability test and collect the data, write lists of problems, discuss the evaluator effect, etc. (due 2 June)

5.      Combine the usability problem lists of the two methods, assess the methods, and discuss the outcome (due 16 June)



Student evaluation

Students will work in two-person teams and will be graded on the reports written and the presentations given during the lectures, using a 1-10 grading scale.