DOI: 10.1145/1514095.1514108

Providing route directions: design of robot's utterance, gesture, and timing

Published: 09 March 2009

Abstract

Providing route directions is a complicated interaction: utterances are combined with gestures and delivered with appropriate timing. This study proposes a model for a robot that generates route directions by integrating three crucial elements: utterances, gestures, and timing. Two research questions must be answered in this modeling process. First, is it useful for the robot to perform a gesture even when the information the gesture conveys is also given in the utterance? Second, is it useful to reproduce the timing with which humans speak? Many previous studies of natural behavior in computers and robots have modeled gestures and speech timing on human speakers. Our approach differs from such studies: we emphasized the listener's perspective. Gestures were designed for their usefulness to the listener, although we were influenced by the basic structure of human gestures. Timing was modeled not on how humans speak but on how they listen. The experimental results demonstrated the effectiveness of our approach, in both task efficiency and perceived naturalness.
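To make the three-element model concrete, the following is a minimal sketch, not taken from the paper: a route-direction behavior that couples each utterance with a gesture and follows it with a listener-oriented pause. The robot interface (point, say_async, wait_until_speech_done) and the DirectionStep structure are hypothetical placeholders, not a real API.

    # Hypothetical sketch of combining the paper's three elements:
    # utterance, gesture, and timing. The robot's methods are assumed,
    # not drawn from any real library.
    import time
    from dataclasses import dataclass

    @dataclass
    class DirectionStep:
        utterance: str       # e.g., "Turn right at the elevator."
        gesture: str | None  # id of a pointing gesture, or None
        pause_s: float       # silent pause after the step, sized for the listener

    def give_route_directions(robot, steps: list[DirectionStep]) -> None:
        for step in steps:
            if step.gesture is not None:
                robot.point(step.gesture)    # start the gesture as speech begins,
                                             # even when the utterance carries
                                             # the same information
            robot.say_async(step.utterance)  # non-blocking speech
            robot.wait_until_speech_done()
            time.sleep(step.pause_s)         # pause sized for the listener's
                                             # comprehension, not copied from
                                             # how speakers happen to pause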





    Published In

    HRI '09: Proceedings of the 4th ACM/IEEE international conference on Human robot interaction
    March 2009
    348 pages
    ISBN:9781605584041
    DOI:10.1145/1514095
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.


    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. gesture
    2. route directions
    3. timing

    Qualifiers

    • Research-article

    Conference

    HRI '09: International Conference on Human Robot Interaction
    March 9-13, 2009
    La Jolla, California, USA

    Acceptance Rates

    Overall Acceptance Rate 268 of 1,124 submissions, 24%



    Article Metrics

    • Downloads (Last 12 months): 38
    • Downloads (Last 6 weeks): 2

    Reflects downloads up to 19 Feb 2025

    Cited By
    • (2024) Is there Really an Effect of Time Delays on Perceived Fluency and Social attributes between Humans and Social Robots? A Pilot Study. Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, 1013-1017. DOI: 10.1145/3610978.3640667. Online publication date: 11-Mar-2024.
    • (2024) Combining Emotional Gestures, Sound Effects, and Background Music for Robotic Storytelling - Effects on Storytelling Experience, Emotion Induction, and Robot Perception. Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, 687-696. DOI: 10.1145/3610977.3634956. Online publication date: 11-Mar-2024.
    • (2024) (Gestures Vaguely): The Effects of Robots' Use of Abstract Pointing Gestures in Large-Scale Environments. Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, 293-302. DOI: 10.1145/3610977.3634924. Online publication date: 11-Mar-2024.
    • (2024) Enhancing the Mobile Humanoid Robot's Emotional Expression with Affective Vertical-Oscillations. International Journal of Social Robotics, 16(7), 1523-1540. DOI: 10.1007/s12369-024-01142-0. Online publication date: 14-Jun-2024.
    • (2024) How to Make a Robot Guide? Experimental Robotics, 483-494. DOI: 10.1007/978-3-031-63596-0_43. Online publication date: 6-Aug-2024.
    • (2023) Expanding the Senses: Considering the Use of Active Smell Delivery for Human-Robot Interactions. Proceedings of the 11th International Conference on Human-Agent Interaction, 479-481. DOI: 10.1145/3623809.3623971. Online publication date: 4-Dec-2023.
    • (2023) Crossing Reality: Comparing Physical and Virtual Robot Deixis. Proceedings of the 2023 ACM/IEEE International Conference on Human-Robot Interaction, 152-161. DOI: 10.1145/3568162.3576972. Online publication date: 13-Mar-2023.
    • (2023) Best of Both Worlds? Combining Different Forms of Mixed Reality Deictic Gestures. ACM Transactions on Human-Robot Interaction, 12(1), 1-23. DOI: 10.1145/3563387. Online publication date: 15-Feb-2023.
    • (2023) Read the Room: Adapting a Robot's Voice to Ambient and Social Contexts. 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 3998-4005. DOI: 10.1109/IROS55552.2023.10341925. Online publication date: 1-Oct-2023.
    • (2022) The Design and Observed Effects of Robot-performed Manual Gestures: A Systematic Review. ACM Transactions on Human-Robot Interaction, 12(1), 1-62. DOI: 10.1145/3549530. Online publication date: 19-Jul-2022.
