DOI: 10.1145/1322192.1322242
ACM Conferences · ICMI-MLMI Conference Proceedings · Poster

Speech-driven embodied entrainment character system with hand motion input in mobile environment

Published: 12 November 2007

Abstract

InterActor is a speech-input-driven, CG-embodied interaction character that generates communicative movements and actions for entrained interaction. InterPuppet, on the other hand, is an embodied interaction character driven both by speech input, like InterActor, and by hand motion input, like a puppet. Humans can therefore use InterPuppet to communicate effectively through both deliberate body movements and natural communicative movements and actions. In this paper, an advanced InterPuppet system that uses a cellular-phone-type device is developed for use in a mobile environment. The effectiveness of the system is demonstrated through a sensory evaluation experiment in an actual remote-communication scenario.
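The core idea of a speech-driven entrainment character, as the abstract describes it, is that the character's communicative motion is generated from the speaker's voice rather than from captured motion. The following is a minimal, purely illustrative sketch of that idea: it binarizes frame-level speech power into an on/off pattern and emits a nod cue at the end of each speech burst. All function names and thresholds (`speech_on_off`, `nod_frames`, `threshold=0.1`, `min_burst=3`) are assumptions for illustration, not the actual InterActor/InterPuppet algorithm.

```python
# Illustrative sketch (NOT the paper's algorithm): drive a listener
# character's nods from the on/off pattern of the speaker's voice.

def speech_on_off(power_frames, threshold=0.1):
    """Binarize per-frame speech power into a 1/0 on/off pattern."""
    return [1 if p >= threshold else 0 for p in power_frames]

def nod_frames(on_off, min_burst=3):
    """Emit a nod cue index at the end of each speech burst that lasts
    at least `min_burst` frames (a crude stand-in for an entrainment
    model that reacts to the rhythm of the speaker's utterances)."""
    cues, run = [], 0
    for i, on in enumerate(on_off):
        if on:
            run += 1
        else:
            if run >= min_burst:
                cues.append(i)  # nod just after the burst ends
            run = 0
    if run >= min_burst:        # burst running at the end of input
        cues.append(len(on_off))
    return cues

# Two utterance bursts -> two nod cues, each just after a burst ends.
power = [0.0, 0.3, 0.4, 0.5, 0.2, 0.0, 0.0, 0.6, 0.7, 0.8, 0.0]
print(nod_frames(speech_on_off(power)))  # -> [5, 10]
```

In a real system the cue stream would feed an animation controller for the CG character; here it simply marks the frames at which a nod would be triggered.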


Cited By

  • (2012) Development of a context-enhancing surface based on the entrainment of embodied rhythms and actions sharing via interaction. Proceedings of the 2012 ACM International Conference on Interactive Tabletops and Surfaces, 363–366. DOI: 10.1145/2396636.2396701. Online publication date: 11 November 2012.
  • (2010) An embodied entrainment character cell phone by speech and head motion inputs. 19th International Symposium in Robot and Human Interactive Communication, 298–303. DOI: 10.1109/ROMAN.2010.5598702. Online publication date: September 2010.


Published In

ICMI '07: Proceedings of the 9th International Conference on Multimodal Interfaces
November 2007, 402 pages
ISBN: 9781595938176
DOI: 10.1145/1322192

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. cellular phone
  2. embodied communication
  3. embodied interaction
  4. human communication
  5. human interaction

Qualifiers

  • Poster

Conference

ICMI '07: International Conference on Multimodal Interfaces
November 12–15, 2007
Nagoya, Aichi, Japan

Acceptance Rates

Overall Acceptance Rate 453 of 1,080 submissions, 42%
