DOI: 10.1145/1620509.1620528

Enhanced auditory menu cues improve dual task performance and are preferred with in-vehicle technologies

Published: 21 September 2009

Abstract

Auditory display research for driving has mainly focused on collision warning signals, and recent studies on auditory in-vehicle information presentation have examined only a limited range of tasks (e.g., cell phone operation or verbal tasks such as reading digit strings). The present study used a dual-task paradigm to evaluate a plausible scenario in which users navigated a song list. We applied enhanced auditory menu navigation cues, including spearcons (i.e., compressed speech) and a spindex (i.e., a speech index that used brief audio cues to communicate the user's position in a long menu list). Twenty-four undergraduates navigated through an alphabetized list of 150 song titles, rendered as an auditory menu, while they concurrently played a simple perceptual-motor ball-catching game. The menu was presented with text-to-speech (TTS) alone, TTS plus one of three types of enhanced auditory cues, or no sound at all. Performance on both the primary task (success rate in the ball-catching game) and the secondary task (menu search time) was better with the auditory menus than with no sound. Subjective workload scores (NASA-TLX) and user preferences favored the enhanced auditory cue types. Results are discussed in terms of multiple resources theory and practical in-vehicle technology (IVT) design applications.
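To make the cue types concrete, the sketch below (plain Python, standard library only) plans spindex and spearcon cues for an alphabetized song list. It is an illustrative, assumption-laden sketch rather than the authors' implementation: the 150 ms spindex length, the 40% spearcon compression ratio, and the estimate_tts_ms() speaking-rate heuristic are hypothetical stand-ins for values a real TTS engine and design process would supply.

# Illustrative sketch: plan spindex and spearcon cues for an auditory song menu.
# Assumed values (not from the paper): 150 ms spindex cues, spearcons compressed
# to 40% of the estimated TTS duration, and a 150 words-per-minute speaking rate.
from dataclasses import dataclass

SPINDEX_MS = 150
SPEARCON_RATIO = 0.40
WORDS_PER_MINUTE = 150.0

@dataclass
class MenuCue:
    title: str
    spindex_letter: str   # the item's initial letter, spoken as a very brief cue
    spindex_ms: int
    spearcon_ms: int      # target duration of the time-compressed TTS phrase
    tts_ms: int           # estimated duration of the full TTS phrase

def estimate_tts_ms(text: str) -> int:
    """Hypothetical stand-in for asking a TTS engine how long a phrase takes."""
    return int(max(1, len(text.split())) / WORDS_PER_MINUTE * 60_000)

def build_cue_plan(titles):
    plan = []
    for title in sorted(titles, key=str.casefold):        # alphabetized menu
        tts_ms = estimate_tts_ms(title)
        plan.append(MenuCue(title, title[:1].upper(), SPINDEX_MS,
                            int(tts_ms * SPEARCON_RATIO), tts_ms))
    return plan

if __name__ == "__main__":
    for cue in build_cue_plan(["Yesterday", "Yellow Submarine", "Imagine"]):
        print(cue)

A playback layer (not shown) would then let the brief spindex cue and/or the spearcon precede the full TTS phrase as the user scrolls, which is the kind of TTS-plus-enhanced-cue presentation the study compares against TTS alone and against no sound.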



Information

Published In

AutomotiveUI '09: Proceedings of the 1st International Conference on Automotive User Interfaces and Interactive Vehicular Applications
September 2009
143 pages
ISBN:9781605585710
DOI:10.1145/1620509

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 21 September 2009

Author Tags

  1. IVTs (in-vehicle technologies)
  2. TTS (text-to-speech)
  3. auditory display
  4. auditory menus
  5. dual task
  6. infotainment
  7. multiple resources
  8. spearcon
  9. spindex

Qualifiers

  • Research-article

Conference

AutomotiveUI '09

Acceptance Rates

Overall Acceptance Rate 248 of 566 submissions, 44%

Cited By

  • (2025) Understanding preference. International Journal of Human-Computer Studies, 195:C. DOI: 10.1016/j.ijhcs.2024.103408. Online publication date: 1-Jan-2025.
  • (2024) Sonically-enhanced in-vehicle air gesture interactions: evaluation of different spearcon compression rates. Journal on Multimodal User Interfaces. DOI: 10.1007/s12193-024-00430-3. Online publication date: 19-May-2024.
  • (2023) Novel In-Vehicle Gesture Interactions: Design and Evaluation of Auditory Displays and Menu Generation Interfaces. Proceedings of the 15th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, pp. 224-233. DOI: 10.1145/3580585.3607164. Online publication date: 18-Sep-2023.
  • (2023) Earcons to reduce mode confusions in partially automated vehicles: Development and application of an evaluation method. International Journal of Human-Computer Studies, 176:103044. DOI: 10.1016/j.ijhcs.2023.103044. Online publication date: Aug-2023.
  • (2023) A review of flexible printed sensors for automotive infotainment systems. Archives of Civil and Mechanical Engineering, 23:1. DOI: 10.1007/s43452-023-00604-y. Online publication date: 23-Jan-2023.
  • (2022) Multimodal Displays for Takeover Requests. User Experience Design in the Era of Automated Driving, pp. 397-424. DOI: 10.1007/978-3-030-77726-5_15. Online publication date: 1-Jan-2022.
  • (2021) How Compatible is Alexa with Dual Tasking? — Towards Intelligent Personal Assistants for Dual-Task Situations. Proceedings of the 9th International Conference on Human-Agent Interaction, pp. 103-111. DOI: 10.1145/3472307.3484165. Online publication date: 9-Nov-2021.
  • (2020) The Lateral Line. Proceedings of the Augmented Humans International Conference, pp. 1-10. DOI: 10.1145/3384657.3384775. Online publication date: 16-Mar-2020.
  • (2019) Adaptive Auditory Alerts for Smart In-Vehicle Interfaces. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 63:1, pp. 1545-1549. DOI: 10.1177/1071181319631404. Online publication date: 20-Nov-2019.
  • (2019) Auditory Distraction in HCI. Proceedings of the 14th International Audio Mostly Conference: A Journey in Sound, pp. 61-66. DOI: 10.1145/3356590.3356601. Online publication date: 18-Sep-2019.
