Evaluation of pause intervals between haptic/audio cues and subsequent speech information

Published: 02 September 2008

Abstract

Non-speech sounds and haptics have an important role in enabling access to user assistance material in ubiquitous computing scenarios. Non-speech sounds and haptics can be used to cue assistance material that is to be presented to users via speech. In this paper, we report on a study that examines user perception of the duration of a pause between a cue (which may be non-speech sound, haptic, or combined non-speech sound plus haptic) and the subsequent delivery of assistance material using speech.
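The experimental structure described above (a cue, a pause of varying duration, then spoken assistance) can be sketched as a trial schedule. This is a hypothetical illustration, not the authors' code: the cue types follow the abstract, but the pause durations and cue length are assumed values for the sketch.

```python
import itertools

# Cue conditions from the abstract; pause intervals are illustrative
# assumptions, not values reported in the paper.
CUE_TYPES = ["audio", "haptic", "audio+haptic"]
PAUSE_MS = [0, 250, 500, 1000]

def trial_schedule(cue, pause_ms, cue_ms=200):
    """Return (event, onset_ms) pairs for one cue -> pause -> speech trial."""
    return [
        ("cue:" + cue, 0),                 # cue playback starts
        ("pause", cue_ms),                 # silent interval begins after the cue
        ("speech", cue_ms + pause_ms),     # speech delivery of assistance material
    ]

# Full factorial crossing of cue type and pause duration.
trials = [trial_schedule(c, p) for c, p in itertools.product(CUE_TYPES, PAUSE_MS)]
```

Crossing every cue type with every pause duration yields one schedule per condition, which is the kind of design a perception study of pause intervals would need in order to compare user ratings across conditions.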


Published In

MobileHCI '08: Proceedings of the 10th international conference on Human computer interaction with mobile devices and services
September 2008
568 pages
ISBN: 9781595939524
DOI: 10.1145/1409240

Publisher

Association for Computing Machinery, New York, NY, United States

    Author Tags

    1. cues
    2. haptics
    3. non-speech sounds
    4. user assistance

Conference

MobileHCI08

    Acceptance Rates

    Overall Acceptance Rate 202 of 906 submissions, 22%
