DOI: 10.1145/1414471.1414487
Slide rule: making mobile touch screens accessible to blind people using multi-touch interaction techniques

Published: 13 October 2008

Abstract

Recent advances in touch screen technology have increased the prevalence of touch screens and have prompted a wave of new touch screen-based devices. However, touch screens are still largely inaccessible to blind users, who must adopt error-prone compensatory strategies to use them or find accessible alternatives. This inaccessibility is due to interaction techniques that require the user to visually locate objects on the screen. To address this problem, we introduce Slide Rule, a set of audio-based multi-touch interaction techniques that enable blind users to access touch screen applications. We describe the design of Slide Rule, our interaction techniques, and a user study in which 10 blind people used Slide Rule and a button-based Pocket PC screen reader. Results show that Slide Rule was significantly faster than the button-based system, and was preferred by 7 of 10 users. However, users made more errors when using Slide Rule than when using the more familiar button-based system.
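To make the interaction model concrete, the short Python sketch below illustrates the general pattern the abstract describes: one finger scans the screen while item names are spoken aloud, and a separate multi-touch gesture confirms a selection. The event model, the speak() stand-in for speech output, and the specific selection gesture (a second finger touching while the first is held down) are illustrative assumptions for this sketch, not Slide Rule's actual implementation.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TouchEvent:
    """Simplified touch event; real platforms deliver richer data."""
    finger_id: int
    x: float  # normalized 0..1 across the screen width
    y: float  # normalized 0..1 down the screen height

def speak(text: str) -> None:
    """Stand-in for speech output; a real screen reader would call a TTS engine."""
    print(f"[speech] {text}")

class AudioTouchList:
    """Illustrative audio-based list browser (not Slide Rule itself):
    one finger scans rows and each item is spoken; a second finger
    touching while the first stays down selects the focused item."""

    def __init__(self, items: List[str]):
        self.items = items
        self.focused: Optional[int] = None
        self.primary_finger: Optional[int] = None

    def on_touch(self, event: TouchEvent) -> None:
        if self.primary_finger is None or event.finger_id == self.primary_finger:
            # First finger: map vertical position to a row and announce it.
            self.primary_finger = event.finger_id
            index = min(int(event.y * len(self.items)), len(self.items) - 1)
            if index != self.focused:
                self.focused = index
                speak(self.items[index])
        elif self.focused is not None:
            # Second finger while the first is down: treat as a selection.
            speak(f"Selected {self.items[self.focused]}")

    def on_lift(self, finger_id: int) -> None:
        if finger_id == self.primary_finger:
            self.primary_finger = None

# Example: scan a short list by touch, then select with a second finger.
ui = AudioTouchList(["Alice", "Bob", "Carol", "Dave"])
ui.on_touch(TouchEvent(finger_id=0, x=0.5, y=0.10))  # announces "Alice"
ui.on_touch(TouchEvent(finger_id=0, x=0.5, y=0.60))  # announces "Carol"
ui.on_touch(TouchEvent(finger_id=1, x=0.8, y=0.60))  # announces "Selected Carol"
ui.on_lift(0)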



Information

Published In

cover image ACM Conferences
Assets '08: Proceedings of the 10th international ACM SIGACCESS conference on Computers and accessibility
October 2008
332 pages
ISBN:9781595939760
DOI:10.1145/1414471
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 13 October 2008


Author Tags

  1. accessibility
  2. blindness
  3. mobile devices
  4. multi-touch interaction techniques
  5. speech output
  6. touch screens

Qualifiers

  • Research-article

Conference

ASSETS '08

Acceptance Rates

Overall Acceptance Rate 436 of 1,556 submissions, 28%


Article Metrics

  • Downloads (Last 12 months)173
  • Downloads (Last 6 weeks)21
Reflects downloads up to 16 Feb 2025


Cited By

  • (2013) Intelligent Interaction in Accessible Applications. A Multimodal End-2-End Approach to Accessible Computing, 10.1007/978-1-4471-5082-4_5, 93-117. Online publication date: 26-Mar-2013
  • (2024) Improving Usability of Data Charts in Multimodal Documents for Low Vision Users. Proceedings of the 26th International Conference on Multimodal Interaction, 10.1145/3678957.3685714, 498-507. Online publication date: 4-Nov-2024
  • (2024) Touchpad Mapper: Exploring Non-Visual Touchpad Interactions for Screen-Reader Users. Proceedings of the 21st International Web for All Conference, 10.1145/3677846.3677867, 42-44. Online publication date: 13-May-2024
  • (2024) Touchpad Mapper: Examining Information Consumption From 2D Digital Content Using Touchpads by Screen-Reader Users. Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility, 10.1145/3663548.3688505, 1-4. Online publication date: 27-Oct-2024
  • (2024) A Recipe for Success? Exploring Strategies for Improving Non-Visual Access to Cooking Instructions. Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility, 10.1145/3663548.3675662, 1-15. Online publication date: 27-Oct-2024
  • (2024) ChartA11y: Designing Accessible Touch Experiences of Visualizations with Blind Smartphone Users. Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility, 10.1145/3663548.3675611, 1-15. Online publication date: 27-Oct-2024
  • (2024) Improving FlexType: Ambiguous Text Input for Users with Visual Impairments. Proceedings of the 17th International Conference on PErvasive Technologies Related to Assistive Environments, 10.1145/3652037.3652059, 130-139. Online publication date: 26-Jun-2024
  • (2024) Empowering Autonomous Digital Learning for Older Adults. Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, 10.1145/3613905.3651133, 1-6. Online publication date: 11-May-2024
  • (2024) SPICA: Interactive Video Content Exploration through Augmented Audio Descriptions for Blind or Low-Vision Viewers. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 10.1145/3613904.3642632, 1-18. Online publication date: 11-May-2024
  • (2024) TADA: Making Node-link Diagrams Accessible to Blind and Low-Vision People. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 10.1145/3613904.3642222, 1-20. Online publication date: 11-May-2024
  • Show More Cited By
