DOI: 10.1145/1228716.1228750

Exploring adaptive dialogue based on a robot's awareness of human gaze and task progress

Published: 10 March 2007

Abstract

When a robot provides direction--as a guide, an assistant, or an instructor--the robot may have to interact with people of different backgrounds and skill sets. Different people require information adapted to their level of understanding. In this paper, we explore the use of two simple forms of awareness that a robot might use to infer that a person needs further verbal elaboration during a tool selection task. First, the robot could use an eye tracker to infer whether the person is looking at the robot and thus in need of further elaboration. Second, the robot could monitor delays in the individual's task progress, indicating that he or she could use further elaboration. We investigated the effects of these two types of awareness on performance time, selection mistakes, and the number of questions people asked the robot. We did not observe any obvious benefits of our gaze awareness manipulation. Awareness of task delays did reduce the number of questions participants asked compared to our control condition, but did not significantly reduce the number of selection mistakes. The mixed results of our investigation suggest that more research is necessary before we can understand how awareness of gaze and awareness of task delay can be successfully implemented in human-robot dialogue.


Published In

HRI '07: Proceedings of the ACM/IEEE international conference on Human-robot interaction
March 2007
392 pages
ISBN: 9781595936172
DOI: 10.1145/1228716

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. adaptive dialogue
  2. human-robot dialogue
  3. human-robot interaction
  4. social robots


Conference

HRI '07: International Conference on Human-Robot Interaction
March 10-12, 2007
Arlington, Virginia, USA

Acceptance Rates

HRI '07 paper acceptance rate: 22 of 101 submissions (22%)
Overall acceptance rate: 268 of 1,124 submissions (24%)



Cited By

  • (2023) Robots in the Wild: Contextually-Adaptive Human-Robot Interactions in Urban Public Environments. Proceedings of the 35th Australian Computer-Human Interaction Conference, 701-705. DOI: 10.1145/3638380.3638440. Online publication date: 2-Dec-2023.
  • (2022) A survey on the design and evolution of social robots — Past, present and future. Robotics and Autonomous Systems, 156:C. DOI: 10.1016/j.robot.2022.104193. Online publication date: 1-Oct-2022.
  • (2020) Towards Adaptive and Least-Collaborative-Effort Social Robots. Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, 311-313. DOI: 10.1145/3371382.3378249. Online publication date: 23-Mar-2020.
  • (2017) A Systematic Review of Adaptivity in Human-Robot Interaction. Multimodal Technologies and Interaction, 1(3), 14. DOI: 10.3390/mti1030014. Online publication date: 20-Jul-2017.
  • (2017) Looking Coordinated. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 2571-2582. DOI: 10.1145/3025453.3026033. Online publication date: 2-May-2017.
  • (2016) Computational Human-Robot Interaction. Foundations and Trends in Robotics, 4(2-3), 105-223. DOI: 10.1561/2300000049. Online publication date: 20-Dec-2016.
  • (2015) Effective task training strategies for human and robot instructors. Autonomous Robots, 39(3), 313-329. DOI: 10.1007/s10514-015-9461-0. Online publication date: 1-Oct-2015.
  • (2013) How a robot should give advice. Proceedings of the 8th ACM/IEEE international conference on Human-robot interaction, 275-282. DOI: 10.5555/2447556.2447666. Online publication date: 3-Mar-2013.
  • (2013) How a robot should give advice. 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI), 275-282. DOI: 10.1109/HRI.2013.6483599. Online publication date: Mar-2013.
  • (2011) See what i'm saying? Proceedings of the ACM 2011 conference on Computer supported cooperative work, 435-444. DOI: 10.1145/1958824.1958892. Online publication date: 19-Mar-2011.
