Universal Design of Auditory Graphs: A Comparison of Sonification Mappings for Visually Impaired and Sighted Listeners

Published: 01 March 2010

Abstract

Determining patterns in data is an important and often difficult task for scientists and students. Unfortunately, graphing and analysis software is typically inaccessible to users with vision impairment. Using sound to represent data (i.e., sonification, or auditory graphs) can make data analysis more accessible; however, there are few guidelines for designing such displays for maximum effectiveness. One crucial yet understudied design issue is exactly how changes in data (e.g., temperature) are mapped onto changes in sound (e.g., pitch), and how this mapping may depend on the specific user. In this study, magnitude estimation was used to determine preferred data-to-display mappings, polarities, and psychophysical scaling functions relating data values to underlying acoustic parameters (frequency, tempo, or modulation index) for blind and visually impaired listeners. The resulting polarities and scaling functions are compared to previous results with sighted participants. The two listener populations generally agreed on polarities, with some notable exceptions, and the magnitudes of the slopes of the scaling functions were also strongly similar, again with some notable differences. For maximum effectiveness, sonification software designers will need to consider carefully their intended users' vision abilities. Practical implications and limitations are discussed.
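The mappings the abstract describes can be thought of as two parts: a polarity (which direction the sound changes as the data increase) and a scaling slope (how steeply it changes). The following Python sketch shows one way such a data-to-frequency mapping might be implemented; the function name, frequency range, and default exponent are illustrative assumptions, not parameters reported by the study.

```python
import math

def map_value_to_frequency(value, vmin, vmax, f_lo=200.0, f_hi=800.0,
                           slope=1.0, positive_polarity=True):
    """Map a data value in [vmin, vmax] onto a frequency in [f_lo, f_hi].

    Interpolation is done in log-frequency space, since pitch is
    perceived roughly logarithmically, and the normalized value is
    raised to `slope` to mimic a psychophysical scaling exponent.
    All parameter names and defaults are illustrative only.
    """
    # Normalize the data value to [0, 1].
    t = (value - vmin) / (vmax - vmin)
    if not positive_polarity:
        t = 1.0 - t          # negative polarity: larger data -> lower pitch
    t = t ** slope           # scaling exponent (from magnitude estimation)
    # Interpolate in log-frequency space.
    log_f = math.log(f_lo) + t * (math.log(f_hi) - math.log(f_lo))
    return math.exp(log_f)

# With the defaults, the midpoint of the data range lands at 400 Hz,
# the geometric mean of 200 Hz and 800 Hz:
print(map_value_to_frequency(50, 0, 100))   # -> 400.0 (approximately)
```

Flipping `positive_polarity` reverses the direction of the mapping, which is the kind of polarity difference the study examines between listener groups; the slope parameter is where a fitted scaling function would enter.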




Published In

ACM Transactions on Accessible Computing, Volume 2, Issue 3 (March 2010), 60 pages.
ISSN: 1936-7228; EISSN: 1936-7236. DOI: 10.1145/1714458

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Received: 01 November 2008
Revised: 01 June 2009
Accepted: 01 June 2009
Published: 01 March 2010, in TACCESS Volume 2, Issue 3


Author Tags

  1. magnitude estimation
  2. auditory display
  3. visually impaired

Qualifiers

  • Research-article
  • Research
  • Refereed


Article Metrics

  • Downloads (Last 12 months)49
  • Downloads (Last 6 weeks)7
Reflects downloads up to 08 Mar 2025


Cited By

  • (2024) Towards Designing Digital Learning Tools for Students with Cortical/Cerebral Visual Impairments: Leveraging Insights from Teachers of the Visually Impaired. Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility, 1-18. DOI: 10.1145/3663548.3675636. Published 27 Oct 2024.
  • (2024) SonicVista: Towards Creating Awareness of Distant Scenes through Sonification. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8(2), 1-32. DOI: 10.1145/3659609. Published 15 May 2024.
  • (2024) Erie: A Declarative Grammar for Data Sonification. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-19. DOI: 10.1145/3613904.3642442. Published 11 May 2024.
  • (2024) Open Your Ears and Take a Look: A State-of-the-Art Report on the Integration of Sonification and Visualization. Computer Graphics Forum 43(3). DOI: 10.1111/cgf.15114. Published 10 Jun 2024.
  • (2024) Beyond Vision Impairments: Redefining the Scope of Accessible Data Representations. IEEE Transactions on Visualization and Computer Graphics 30(12), 7619-7636. DOI: 10.1109/TVCG.2024.3356566. Published 1 Dec 2024.
  • (2024) TactualPlot: Spatializing Data as Sound Using Sensory Substitution for Touchscreen Accessibility. IEEE Transactions on Visualization and Computer Graphics 30(1), 836-846. DOI: 10.1109/TVCG.2023.3326937. Published 1 Jan 2024.
  • (2023) Sonificación y periodismo: la representación de datos mediante sonidos [Sonification and journalism: representing data through sound]. Revista de Comunicación 22(1). DOI: 10.26441/RC22.1-2023-3022. Published 15 Jan 2023.
  • (2023) Increasing Web3D Accessibility with Audio Captioning. Proceedings of the 28th International ACM Conference on 3D Web Technology, 1-10. DOI: 10.1145/3611314.3615902. Published 9 Oct 2023.
  • (2023) The Accessibility of Data Visualizations on the Web for Screen Reader Users: Practices and Experiences During COVID-19. ACM Transactions on Accessible Computing 16(1), 1-29. DOI: 10.1145/3557899. Published 29 Mar 2023.
  • (2023) Exploring Chart Question Answering for Blind and Low Vision Users. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-15. DOI: 10.1145/3544548.3581532. Published 19 Apr 2023.
