Undo and erase events as indicators of usability problems

Research article. DOI: 10.1145/1518701.1518804

Published: 04 April 2009

Abstract

One approach to reducing the costs of usability testing is to facilitate the automatic detection of critical incidents: serious breakdowns in interaction that stand out during software use. This research evaluates the use of undo and erase events as indicators of critical incidents in Google SketchUp (a 3D-modeling application), measuring an indicator's usefulness by the numbers and types of usability problems discovered. We compared problems identified using undo and erase events to problems identified using the user-reported critical incident technique [Hartson and Castillo 1998]. In a within-subjects experiment with 35 participants, undo and erase episodes together revealed over 90% of the problems rated as severe, several of which would not have been discovered by self-report alone. Moreover, problems found by all three methods were rated as significantly more severe than those identified by only a subset of methods. These results suggest that undo and erase events will serve as useful complements to user-reported critical incidents for low cost usability evaluation of creation-oriented applications like SketchUp.
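The abstract's core idea — treating undo and erase events in an interaction log as candidate critical incidents, and grouping temporally close events into "episodes" — can be sketched in a few lines. This is an illustrative reconstruction, not the paper's implementation: the log format, the event names, and the 5-second grouping window are all assumptions made for the example.

```python
# Hypothetical sketch: group undo/erase events from an interaction log into
# episodes, where consecutive indicator events within `gap` seconds belong
# to the same episode. Each log entry is a (timestamp, event_type) pair.

def extract_episodes(events, indicator_types=("undo", "erase"), gap=5.0):
    """Return a list of episodes; each episode is a list of indicator events
    whose timestamps fall within `gap` seconds of the previous one."""
    episodes = []
    current = []
    for t, kind in sorted(events):
        if kind not in indicator_types:
            continue  # ignore ordinary events (draw, move, ...)
        if current and t - current[-1][0] > gap:
            episodes.append(current)  # gap too large: close the episode
            current = []
        current.append((t, kind))
    if current:
        episodes.append(current)
    return episodes

log = [
    (0.0, "draw"), (1.2, "undo"), (2.0, "undo"),      # first episode
    (30.0, "move"), (31.5, "erase"), (33.0, "undo"),  # second episode
    (60.0, "draw"),
]
episodes = extract_episodes(log)
```

Each extracted episode would then point an evaluator at the surrounding stretch of the session recording, which is where the usability-problem analysis in the study actually happens.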

References

[1] Capra, M. Contemporaneous Versus Retrospective User-Reported Critical Incidents in Usability Evaluation. In Proc. Human Factors 2002. HFES (2002), 1973--1977.
[2] Cockton, G. and Lavery, D. A Framework for Usability Problem Extraction. In Proc. INTERACT 1999. IOS Press (1999), 344--352.
[3] Cronbach, L.J. Coefficient alpha and the internal structure of tests. Psychometrika 16, 3 (1951), 297--334.
[4] del Galdo, E.M., Williges, B.H., and Wixon, D.R. An Evaluation of Critical Incidents for Software Documentation Design. In Proc. Human Factors 1986. HFES (1986), 19--23.
[5] Flanagan, J.C. The Critical Incident Technique. Psychological Bulletin 51, 4 (1954), 327--358.
[6] Google SketchUp. http://sketchup.google.com
[7] Google SketchUp New User Tutorial Videos 1-3. http://www.youtube.com/user/SketchUpVideo
[8] Guan, Z., Lee, S., Cuddihy, E., and Ramey, J. The validity of the stimulated retrospective think-aloud method as measured by eye tracking. In Proc. CHI 2006. ACM Press (2006), 1253--1262.
[9] Hartson, H.R., Castillo, J.C., Kelso, J., and Neale, W.C. Remote evaluation: the network as an extension of the usability laboratory. In Proc. CHI 1996. ACM Press (1996), 228--235.
[10] Hartson, R. and Castillo, J.C. Remote evaluation for post-deployment usability improvement. In Proc. AVI 98. ACM Press (1998), 25--27.
[11] Hilbert, D. and Redmiles, D. An approach to large-scale collection of application usage data over the Internet. In Proc. Software Engineering 1998. ACM Press (1998), 136--145.
[12] Howarth, J., Andre, T.S., and Hartson, R. A Structured Process for Transforming Usability Data into Usability Information. Journal of Usability Studies 3, 1 (2007), 7--23.
[13] Jacobsen, N.E., Hertzum, M., and John, B.E. The evaluator effect in usability tests. In Proc. CHI 1998. ACM Press (1998), 255--256.
[14] Kirsh, D. and Maglio, P. On distinguishing epistemic from pragmatic action. Cognitive Science 18 (1994), 513--549.
[15] Koenemann-Belliveau, J., Carroll, J., Rosson, M.B., and Singley, M.K. Comparative usability evaluation: critical incidents and critical threads. In Proc. CHI 1994. ACM Press (1994), 245--251.
[16] Law, E.L. and Hvannberg, E.T. Analysis of Combinatorial User Effect in International Usability Tests. In Proc. CHI 2004. ACM Press (2004), 9--16.
[17] Lindgaard, G. and Chattratichart, J. Usability testing: what have we overlooked? In Proc. CHI 2007. ACM Press (2007), 1415--1424.
[18] Medlock, M.C., Wixon, D., Terrano, M., Romero, R., and Fulton, B. Using the RITE method to improve products: a definition and a case study. In Usability Professionals Association (2002).
[19] Nielsen, J. and Landauer, T. A mathematical model of the finding of usability problems. In Proc. CHI/INTERACT 93. ACM Press (1993), 206--213.
[20] Nisbett, R. and Wilson, T. Telling more than we can know: Verbal reports on mental processes. Psychological Review 84 (1977), 231--259.
[21] Rubin, J. and Hudson, T. Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests. John Wiley & Sons (1994).
[22] Spool, J. and Schroeder, W. Testing web sites: five users is nowhere near enough. In Ext. Abstracts CHI 2001. ACM Press (2001), 285--286.
[23] Swallow, J., Hameluck, D., and Carey, T. User Interface Instrumentation for Usability Analysis: A Case Study. In Proc. CASCON 97 (1997).
[24] Winograd, T. and Flores, F. (Eds.) Understanding Computers and Cognition. Ablex Publishing Corp. (1986).


    Published In

    CHI '09: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
    April 2009, 2426 pages
    ISBN: 9781605582467
    DOI: 10.1145/1518701

    Publisher

    Association for Computing Machinery, New York, NY, United States



    Author Tags

    1. critical incidents
    2. erase
    3. google sketchup
    4. undo
    5. usability testing
    6. user-reported critical incident technique



    Acceptance Rates

    CHI '09 Paper Acceptance Rate 277 of 1,130 submissions, 25%;
    Overall Acceptance Rate 6,199 of 26,314 submissions, 24%



    Cited By

    • Frustration: Still a Common User Experience. ACM Transactions on Computer-Human Interaction 30, 3 (2023), 1--26. DOI: 10.1145/3582432
    • Point of no Undo: Irreversible Interactions as a Design Strategy. In Proc. CHI 2023 (2023), 1--18. DOI: 10.1145/3544548.3581433
    • Quantifying Proactive and Reactive Button Input. In Proc. CHI 2022 (2022), 1--18. DOI: 10.1145/3491102.3501913
    • Critical Incident Technique and Gig-Economy Work (Deliveroo): Working with and Challenging Assumptions around Algorithms. In Ext. Abstracts CHI 2022 (2022), 1--6. DOI: 10.1145/3491101.3519865
    • Assisted pattern mining for discovering interactive behaviours on the web. International Journal of Human-Computer Studies 130 (2019), 196--208. DOI: 10.1016/j.ijhcs.2019.06.012
    • Maestro. In Proc. UIST 2018 (2018), 287--298. DOI: 10.1145/3242587.3242606
    • WevQuery. Proceedings of the ACM on Human-Computer Interaction 1, EICS (2017), 1--17. DOI: 10.1145/3095806
    • The trade-off between usability and security in the context of eGovernment. In Proc. BCS HCI 2016 (2016), 1--13. DOI: 10.14236/ewic/HCI2016.36
    • Just the other side of the coin? From error to insight analysis. Information Visualization 15, 4 (2016), 312--324. DOI: 10.1177/1473871615598641
    • The world is your test suite. In Perspectives on Data Science for Software Engineering (2016), 375--378. DOI: 10.1016/B978-0-12-804206-9.00069-6
