ABSTRACT
Head motion occurs naturally and in synchrony with speech during human dialogue, and may carry paralinguistic information such as intentions, attitudes, and emotions. Natural-looking head motion is therefore important for smooth human-robot interaction. Based on rules inferred from analyses of the relationship between head motion and dialogue acts, this paper proposes a model for generating head tilting and nodding, and evaluates the model on three types of humanoid robot: a very human-like android, "Geminoid F"; a typical humanoid robot with fewer facial degrees of freedom, "Robovie R2"; and a robot with a 3-axis rotatable neck and movable lips, "Telenoid R2". Analysis of subjective scores shows that the proposed model, which combines head tilting and nodding, generates head motion perceived as more natural than nodding alone or direct mapping of people's original motions without gaze information. We also find that an upward motion of the face can give robots that lack a mouth the appearance of speaking. Finally, we conduct an experiment in which participants act as visitors to an information desk attended by the robots, and verify that our generation model is perceived as equally natural as direct mapping of people's original motions with gaze information.
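The paper itself provides no code, but the core idea of the generation model, mapping phrase-final dialogue-act labels to head-motion commands via rules, can be illustrated with a minimal sketch. The Python example below is a hypothetical illustration only: the dialogue-act categories, angles, and durations are assumptions for demonstration, not the paper's actual rule set or any robot's real API.

```python
# Minimal sketch of a rule-based head-motion generator driven by dialogue acts.
# All labels, angles, and timings below are illustrative assumptions, not the
# paper's actual rules.

from dataclasses import dataclass


@dataclass
class HeadMotion:
    pitch_deg: float   # nod axis: positive = downward
    roll_deg: float    # tilt axis: positive = toward the shoulder
    duration_s: float  # time to complete the motion


# Hypothetical mapping from dialogue-act categories to head motions.
DIALOGUE_ACT_RULES = {
    "affirmation": HeadMotion(pitch_deg=15.0, roll_deg=0.0, duration_s=0.4),   # clear nod
    "backchannel": HeadMotion(pitch_deg=8.0, roll_deg=0.0, duration_s=0.3),    # light nod
    "question":    HeadMotion(pitch_deg=0.0, roll_deg=10.0, duration_s=0.6),   # head tilt
    "thinking":    HeadMotion(pitch_deg=-5.0, roll_deg=12.0, duration_s=0.8),  # tilt + slight upward
}


def generate_head_motion(dialogue_act: str) -> HeadMotion:
    """Return a head-motion command for a phrase-final dialogue act,
    falling back to a neutral pose when no rule applies."""
    return DIALOGUE_ACT_RULES.get(dialogue_act, HeadMotion(0.0, 0.0, 0.2))


if __name__ == "__main__":
    for act in ("affirmation", "question", "greeting"):
        m = generate_head_motion(act)
        print(f"{act:12s} -> pitch {m.pitch_deg:+.1f} deg, "
              f"roll {m.roll_deg:+.1f} deg, {m.duration_s:.1f} s")
```

In such a design, the rule table would be populated from the corpus analysis of head motion and dialogue acts, and the resulting commands sent to whichever neck joints the target robot provides.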