research-article
DOI: 10.1145/1414471.1414494

A comparative test of web accessibility evaluation methods

Published: 13 October 2008

Abstract

Accessibility auditors must choose a method when evaluating accessibility: expert review (also known as conformance testing), user testing, subjective evaluation, and barrier walkthrough are some of the possibilities. However, little is known to date about their relative strengths and weaknesses. Furthermore, what happened with usability evaluation methods is likely to repeat for accessibility: there is uncertainty not only about the pros and cons of the methods, but also about the criteria to be used to compare them and the metrics to measure those criteria. After a brief review and description of methods, the paper presents a comparative test of two web accessibility evaluation methods: conformance testing and barrier walkthrough. The comparison aims at determining the merits of barrier walkthrough, using conformance testing as a control condition. A comparison framework is outlined, followed by the description of a laboratory experiment with 12 subjects (novice accessibility evaluators) and its results. Significant differences were found in terms of correctness, one of the several metrics used to compare the methods. Reliability also appears to differ between the two methods.




    Published In

    Assets '08: Proceedings of the 10th international ACM SIGACCESS conference on Computers and accessibility
    October 2008
    332 pages
    ISBN:9781595939760
    DOI:10.1145/1414471


    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. accessibility evaluation method
    2. quality assessment
    3. web accessibility


    Acceptance Rates

    Overall Acceptance Rate 436 of 1,556 submissions, 28%

    Cited By

    • (2024) Assessing Accessibility Levels in Mobile Applications Developed from Figma Templates. Proceedings of the 17th International Conference on PErvasive Technologies Related to Assistive Environments, 316-321. DOI: 10.1145/3652037.3652075
    • (2024) Developing a blind user mental model (BlUMM) for web browsing. Universal Access in the Information Society, 23(3), 1343-1367. DOI: 10.1007/s10209-023-01035-5
    • (2023) Accessibility of university websites in Türkiye: An evaluation using website content accessibility guidelines. Sakarya Üniversitesi İşletme Enstitüsü Dergisi, 5(2), 65-75. DOI: 10.47542/sauied.1247427
    • (2023) Accessibility of Mobile Apps for Visually Impaired Users: Problems Encountered by User Evaluation, Inspections and Automated Tools. Proceedings of the XXII Brazilian Symposium on Human Factors in Computing Systems, 1-11. DOI: 10.1145/3638067.3638101
    • (2023) Comparison of Free and Open Source WCAG Accessibility Evaluation Tools. Proceedings of the 6th International Conference on Networking, Intelligent Systems & Security, 1-6. DOI: 10.1145/3607720.3607722
    • (2023) A Probabilistic Model and Metrics for Estimating Perceived Accessibility of Desktop Applications in Keystroke-Based Non-Visual Interactions. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-20. DOI: 10.1145/3544548.3581400
    • (2023) Web Structure Derived Clustering for Optimised Web Accessibility Evaluation. Proceedings of the ACM Web Conference 2023, 1345-1354. DOI: 10.1145/3543507.3583508
    • (2023) Screen Recognition: Creating Accessibility Metadata for Mobile Applications using View Type Detection. 2023 9th International Conference on Computer and Communications (ICCC), 1787-1793. DOI: 10.1109/ICCC59590.2023.10507590
    • (2023) Web accessibility automatic evaluation tools: to what extent can they be automated? CCF Transactions on Pervasive Computing and Interaction, 5(3), 288-320. DOI: 10.1007/s42486-023-00127-8
    • (2023) Accessibility Inspections of Mobile Applications by Professionals with Different Expertise Levels: An Empirical Study Comparing with User Evaluations. Human-Computer Interaction – INTERACT 2023, 135-154. DOI: 10.1007/978-3-031-42280-5_9
