research-article
DOI: 10.1145/2503713.2503736

Multi-party interaction with a virtual character and a human-like robot

Published: 06 October 2013

ABSTRACT

Research on interactive virtual characters and social robots focuses mainly on one-to-one interactions, while multi-party interaction remains comparatively unexplored. As these characters are being developed to help us in our daily lives as guides, companions, assistants or receptionists, they should be aware of the presence of multiple people, address their requirements in a natural way, and act according to social rules and norms. In contrast with previous work, we are interested in multi-party and multi-modal interactions between 3D virtual characters, real humans and social robots, in which any of these participants can interact with any other. In this paper we present our ongoing work, discuss multi-party interaction, describe the overall system architecture and outline our future work.
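The paper itself presents no code, but as a rough illustration of what awareness of multiple participants can involve, the sketch below shows a minimal, hypothetical participant registry for a multi-party interaction manager. All class, field and method names here are assumptions made purely for illustration; they do not come from the described system.

```python
# Illustrative sketch only: a minimal multi-party participant registry.
# None of these names come from the paper; they show one possible way to
# track who is present, who is speaking, and who is being addressed.

from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Participant:
    """A human, virtual character, or robot taking part in the interaction."""
    name: str
    kind: str                          # e.g. "human", "virtual_character", "robot"
    is_speaking: bool = False
    gaze_target: Optional[str] = None  # name of the participant currently looked at


@dataclass
class InteractionState:
    """Shared state a multi-party interaction manager could track."""
    participants: List[Participant] = field(default_factory=list)

    def join(self, p: Participant) -> None:
        """Register a newly arrived participant."""
        self.participants.append(p)

    def leave(self, name: str) -> None:
        """Remove a participant who has left the scene."""
        self.participants = [p for p in self.participants if p.name != name]

    def current_speaker(self) -> Optional[Participant]:
        """Return the participant currently holding the floor, if any."""
        return next((p for p in self.participants if p.is_speaking), None)

    def addressee_of(self, speaker: Participant) -> Optional[Participant]:
        """Crude heuristic: the addressee is whoever the speaker is gazing at."""
        return next((p for p in self.participants
                     if p.name == speaker.gaze_target), None)


if __name__ == "__main__":
    state = InteractionState()
    state.join(Participant("character_1", "virtual_character"))
    state.join(Participant("robot_1", "robot"))
    state.join(Participant("human_1", "human",
                           is_speaking=True, gaze_target="robot_1"))

    speaker = state.current_speaker()
    if speaker is not None:
        addressee = state.addressee_of(speaker)
        if addressee is not None:
            print(f"{speaker.name} is addressing {addressee.name}")
```

In a deployed system the speaking and gaze fields would be driven by audio and vision sensing, and addressee resolution would combine several cues rather than gaze alone; the sketch only illustrates the kind of state a multi-party setting requires beyond one-to-one dialogue.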


Published in

VRST '13: Proceedings of the 19th ACM Symposium on Virtual Reality Software and Technology
October 2013, 271 pages
ISBN: 9781450323796
DOI: 10.1145/2503713
Copyright © 2013 ACM


Publisher: Association for Computing Machinery, New York, NY, United States
