DOI: 10.1145/3136755.3143028
Short paper

Modulating the non-verbal social signals of a humanoid robot

Published: 03 November 2017

Abstract

In this demonstration, we present a repertoire of social signals generated by the humanoid robot Pepper in the context of the EU-funded project MuMMER. The aim of this research is to provide the robot with the expressive capabilities required to interact with people in real-world public spaces such as shopping malls, and being able to control the non-verbal behaviour of such a robot is key to engaging with humans effectively. We propose an approach to modulating the robot's non-verbal social signals by systematically varying the amplitude and speed of its joint motions and gathering user evaluations of the resulting gestures. We anticipate that humans' perception of the robot's behaviour will be influenced by these modulations.
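The modulation approach described in the abstract can be sketched in code. The following is a minimal illustrative example, not the authors' implementation: the keyframe format, the joint name, and the use of a single gain `alpha` to scale both amplitude (around a neutral pose) and speed (by compressing keyframe times) are assumptions for illustration.

```python
# Illustrative sketch of gesture modulation: scale joint-motion amplitude
# and speed by a gain alpha. A keyframe is (time_s, {joint_name: angle_rad}).
# Assumed format and joint names; not the MuMMER project's actual code.

def modulate_gesture(keyframes, neutral, alpha):
    """Exaggerate (alpha > 1) or attenuate (alpha < 1) a gesture.

    Amplitudes are scaled by alpha around the neutral pose, and keyframe
    times are divided by alpha so larger alpha also means faster motion.
    """
    modulated = []
    for t, pose in keyframes:
        new_pose = {joint: neutral[joint] + alpha * (angle - neutral[joint])
                    for joint, angle in pose.items()}
        modulated.append((t / alpha, new_pose))
    return modulated

# Example: a one-joint arm raise, exaggerated with alpha = 1.5
neutral = {"RShoulderRoll": 0.0}
gesture = [(0.0, {"RShoulderRoll": 0.0}),
           (0.5, {"RShoulderRoll": 0.4})]
print(modulate_gesture(gesture, neutral, 1.5))
```

With alpha = 1.5 the peak angle grows from 0.4 rad to 0.6 rad while the keyframe time shrinks from 0.5 s to about 0.33 s; sweeping alpha in this way yields the systematically varied gestures that users then evaluate.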

Supplementary Material

Auxiliary Video (icmi17-demo-113-aux.mp4)
Pepper robot performing the gain-attention animation with three different alpha values.



Published In

ICMI '17: Proceedings of the 19th ACM International Conference on Multimodal Interaction
November 2017
676 pages
ISBN:9781450355438
DOI:10.1145/3136755

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  • social signals
  • human-robot interaction
  • social robotics


Conference

ICMI '17

Acceptance Rates

ICMI '17 Paper Acceptance Rate: 65 of 149 submissions, 44%
Overall Acceptance Rate: 453 of 1,080 submissions, 42%

