Abstract
We describe the impact on analyst performance of an extended problem report format. Previous studies have shown that Heuristic Evaluation can only find a high proportion of actual problems (thoroughness) if multiple analysts are used. However, adding analysts can result in a high proportion of false positives (low validity). We report surprising interim results from a large study that is exploring the DARe model for evaluation method effectiveness. The DARe model relates the effectiveness of an evaluation method to evaluators’ command of discovery and analysis resources. Previous work has shown that Heuristic Evaluation poorly supports problem discovery and analysis: heuristics tend to be inappropriately applied to problem predictions. We developed an extended problem report format to let us study analyst decision making during usability inspection. Our focus was on the quality of insights into analyst behaviour delivered by this extended report format. However, our first use of this format revealed unexpected improvements in validity (false positive reduction) and appropriate heuristic application. We argue that the format has unexpectedly led to more care and caution in problem discovery and elimination, and in heuristic application. Evaluation performance can thus be improved by indirectly ‘fixing the analyst’ via generic fixes to inspection methods. In addition, we provide the first direct evidence of how evaluators use separate discovery and analysis resources during usability inspection.
Original language | English |
---|---|
Title of host publication | People and Computers XVII — Designing for Society |
Editors | Eamon O'Neill, Philippe Palanque, Peter Johnson |
Place of Publication | London |
Publisher | Springer |
Pages | 145-161 |
ISBN (Print) | 9781852337667 |
Publication status | Published - 2004 |