ABSTRACT
This paper presents the results of a study comparing the nature and number of participant utterances produced under two concurrent think-aloud styles: the classic approach and a relaxed, interactive approach. Overall, ten categories of utterance were extracted from the verbal data, ranging from categories that directly informed usability problem analysis to those that simply described procedural actions. No category of utterance was unique to either method. The interactive think-aloud led to the production of more utterances that could be used directly in usability problem analysis. Participants provided explanations, opinions and recommendations during the classic think-aloud even though they were not instructed to do so. This finding suggests that the social context of testing may override the classic instruction to think aloud.
Keep talking: an analysis of participant utterances gathered using two concurrent think-aloud methods