ABSTRACT
Usability specialists were better than non-specialists at performing heuristic evaluation, and “double experts” with specific expertise in the kind of interface being evaluated performed even better. Major usability problems have a higher probability than minor problems of being found in a heuristic evaluation, but more minor problems are found in absolute numbers. Usability heuristics relating to exits and user errors were more difficult to apply than the rest, and additional measures should be taken to find problems relating to these heuristics. Usability problems that relate to missing interface elements that ought to be introduced were more difficult to find by heuristic evaluation in interfaces implemented as paper prototypes but were as easy as other problems to find in running systems.
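The abstract's point about detection probability can be made concrete with Nielsen's standard model, in which the proportion of problems found by a panel of n independent evaluators is 1 − (1 − λ)^n, where λ is the probability that a single evaluator finds a given problem. The sketch below is illustrative only: the λ values are hypothetical, chosen merely to show how a higher per-evaluator probability for major problems compounds across evaluators; they are not figures from the paper.

```python
# Illustrative sketch (hypothetical lambda values, not data from the paper).
# Nielsen's aggregation model: the expected proportion of usability problems
# found by n independent evaluators, each finding a given problem with
# probability lam, is 1 - (1 - lam)**n.

def proportion_found(lam: float, n: int) -> float:
    """Expected proportion of problems found by n independent evaluators."""
    return 1.0 - (1.0 - lam) ** n

# Assumed per-evaluator detection probabilities (made up for illustration):
# major problems are easier for any single evaluator to spot than minor ones.
LAMBDA_MAJOR = 0.40
LAMBDA_MINOR = 0.25

for n in (1, 3, 5):
    major = proportion_found(LAMBDA_MAJOR, n)
    minor = proportion_found(LAMBDA_MINOR, n)
    print(f"{n} evaluator(s): major {major:.0%} found, minor {minor:.0%} found")
```

Under these assumed values the gap between major and minor problems persists at every panel size, matching the abstract's claim that major problems are more likely to be found, even though minor problems, being far more numerous, still dominate in absolute counts.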