Research article
DOI: 10.1145/1899503.1899504

The complementary role of two evaluation methods in the usability and accessibility evaluation of a non-standard system

Published: 11 October 2010

ABSTRACT

Usability, generally defined in terms of application effectiveness, efficiency and user satisfaction, is one of the focus areas of human-computer interaction (HCI). Accessibility is the design of systems that can be perceived, understood and used by people with varying abilities. Although accessibility efforts are aimed primarily at making systems usable for people with disabilities, support for direct accessibility (the built-in redundancies in an application that enable as many people as possible to use it without system modifications) benefits people both with and without disabilities. A range of usability evaluation methods (UEMs) is available, and the choice between them can be influenced by the type of system being evaluated. The Digital Doorway (DD), a non-standard computer system deployed to promote computer literacy in underprivileged communities in South Africa, was evaluated using both heuristic evaluation and a field usability study. The heuristic evaluation revealed a large number of usability and direct-accessibility problems, some of which could be classified as low severity. The field study exposed additional problems that affected the successful completion of user tasks; because a number of these were a direct consequence of the context of use, they were not recognized as problems by the expert evaluators. The study showed that heuristic evaluation can be optimized by complementing it with a method that involves user participation and is, preferably, carried out in the intended context of use.
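To make the comparison of the two methods' outputs concrete, below is a minimal sketch (not taken from the paper; all problem identifiers, ratings and data are hypothetical) of how findings might be combined: heuristic-evaluation problems are ranked by mean severity across expert evaluators, using Nielsen-style 0-4 severity ratings, and problems observed only in the field study are flagged as the complementary contribution of user testing in context.

    from statistics import mean

    # Hypothetical heuristic-evaluation output:
    # problem id -> severity rating (0-4) from each expert evaluator
    heuristic_findings = {
        "icons-lack-text-labels": [3, 2, 3],
        "no-exit-from-activity":  [4, 4, 3],
        "low-contrast-menu-text": [1, 2, 1],
    }

    # Hypothetical field-study output: problems observed while real users
    # attempted tasks on the deployed system
    field_findings = {
        "no-exit-from-activity",           # also caught by the experts
        "keyboard-too-high-for-children",  # context-of-use problem
        "screen-glare-in-sunlight",        # context-of-use problem
    }

    # Rank heuristic problems by mean severity across evaluators
    ranked = sorted(
        ((pid, mean(scores)) for pid, scores in heuristic_findings.items()),
        key=lambda item: item[1],
        reverse=True,
    )
    for pid, severity in ranked:
        print(f"{pid}: mean severity {severity:.1f}")

    # Problems only the field study revealed: the complementary contribution
    field_only = field_findings - heuristic_findings.keys()
    print("Found only in the field study:", sorted(field_only))

A real evaluation would record richer data per problem (task, heuristic violated, evaluator notes), but the set difference at the end captures the paper's central observation: context-of-use problems surface only when real users are observed in situ.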

Published in

SAICSIT '10: Proceedings of the 2010 Annual Research Conference of the South African Institute of Computer Scientists and Information Technologists
October 2010, 447 pages
ISBN: 9781605589503
DOI: 10.1145/1899503

Copyright © 2010 ACM. Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher: Association for Computing Machinery, New York, NY, United States
