DOI: 10.5555/1734454.1734473

Perception of affect elicited by robot motion

Published: 02 March 2010

Abstract

Nonverbal behaviors serve as a rich source of information in human-human communication. In particular, motion cues can reveal details about a person's current physical and mental state. Research has shown that people interpret not only the motion cues of humans in these terms, but also the motion of animals and of inanimate devices such as robots. To successfully integrate mobile robots into domestic environments, designers therefore have to take into account how the device will be perceived by the user.
In this study we analyzed the relationship between the motion characteristics of a robot and perceived affect. Based on a literature study we selected two motion characteristics, acceleration and curvature, which appear to be the most influential in how motion is perceived. We systematically varied these motion parameters and recorded participants' interpretations of the affective content. Our results suggest a strong relation between motion parameters and the attribution of affect, whereas the type of embodiment had no effect. Furthermore, we found that the level of acceleration can be used to predict perceived arousal, and that valence information is at least partly encoded in an interaction between acceleration and curvature. These findings are important for the design of behaviors for future autonomous household robots.
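The abstract's central finding has a compact structure: arousal tracks acceleration alone, while valence depends in part on an acceleration-curvature interaction. The Python sketch below illustrates only that structure; the linear form, the normalized 0-1 parameter ranges, and the weights `w_arousal` and `w_valence` are illustrative assumptions, not the authors' fitted model.

```python
# Illustrative sketch of the relationship reported in the abstract:
# perceived arousal is predicted by acceleration alone, while valence
# depends (in part) on an acceleration x curvature interaction.
# The linear form and the weights are hypothetical placeholders,
# NOT the authors' fitted values.

def predicted_affect(acceleration: float, curvature: float) -> tuple[float, float]:
    """Map normalized motion parameters (0..1) to (arousal, valence) estimates."""
    w_arousal = 1.0  # hypothetical weight: higher acceleration -> higher arousal
    w_valence = 0.8  # hypothetical weight on the interaction term
    arousal = w_arousal * acceleration
    valence = w_valence * acceleration * curvature  # interaction of the two cues
    return arousal, valence

if __name__ == "__main__":
    # Sweep the two parameters the study varied and print the predictions.
    for acc in (0.2, 0.8):
        for curv in (0.2, 0.8):
            arousal, valence = predicted_affect(acc, curv)
            print(f"acceleration={acc:.1f} curvature={curv:.1f} "
                  f"-> arousal={arousal:.2f} valence={valence:.2f}")
```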




Published In

HRI '10: Proceedings of the 5th ACM/IEEE international conference on Human-robot interaction
March 2010
400 pages
ISBN: 9781424448937

Publisher

IEEE Press


Author Tags

  1. affective communication
  2. expressive robotic behavior
  3. nonverbal communication

Qualifiers

  • Research-article

Conference

HRI '10

Acceptance Rates

HRI '10 Paper Acceptance Rate: 26 of 124 submissions, 21%
Overall Acceptance Rate: 268 of 1,124 submissions, 24%



Article Metrics

  • Downloads (last 12 months): 7
  • Downloads (last 6 weeks): 1
Reflects downloads up to 17 Feb 2025

Cited By

  • (2024) Power in Human-Robot Interaction. Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, pp. 269-282. DOI: 10.1145/3610977.3634949. Online publication date: 11-Mar-2024.
  • (2022) How to Make People Think You're Thinking if You're a Drawing Robot. Proceedings of the 2022 ACM/IEEE International Conference on Human-Robot Interaction, pp. 1190-1191. DOI: 10.5555/3523760.3523969. Online publication date: 7-Mar-2022.
  • (2021) Social Robot Co-Design Canvases: A Participatory Design Framework. ACM Transactions on Human-Robot Interaction, 11(1), pp. 1-39. DOI: 10.1145/3472225. Online publication date: 18-Oct-2021.
  • (2021) Investigation of Model for Initial Phase of Communication. ACM Transactions on Human-Robot Interaction, 10(2), pp. 1-27. DOI: 10.1145/3439719. Online publication date: 9-Feb-2021.
  • (2020) Using the Geneva Emotion Wheel to Measure Perceived Affect in Human-Robot Interaction. Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, pp. 491-498. DOI: 10.1145/3319502.3374834. Online publication date: 9-Mar-2020.
  • (2020) "Are You Sad, Cozmo?" Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, pp. 461-470. DOI: 10.1145/3319502.3374814. Online publication date: 9-Mar-2020.
  • (2020) MoveAE. Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, pp. 481-489. DOI: 10.1145/3319502.3374807. Online publication date: 9-Mar-2020.
  • (2019) Differences in Haptic and Visual Perception of Expressive 1DoF Motion. ACM Symposium on Applied Perception 2019, pp. 1-9. DOI: 10.1145/3343036.3343136. Online publication date: 19-Sep-2019.
  • (2019) Animation Techniques in Human-Robot Interaction User Studies. ACM Transactions on Human-Robot Interaction, 8(2), pp. 1-22. DOI: 10.1145/3317325. Online publication date: 3-Jun-2019.
  • (2019) Communicating Dominance in a Nonanthropomorphic Robot Using Locomotion. ACM Transactions on Human-Robot Interaction, 8(1), pp. 1-14. DOI: 10.1145/3310357. Online publication date: 6-Mar-2019.
