DOI: 10.1145/1228716.1228755

Non-facial/non-verbal methods of affective expression as applied to robot-assisted victim assessment

Published: 10 March 2007

Abstract

This work applies a previously developed set of heuristics for determining when to use non-facial/non-verbal methods of affective expression to the domain of a robot used for victim assessment in the aftermath of a disaster. Robot-assisted victim assessment places a robot approximately three meters or less from a victim, and the robot's path traverses three proximity zones: intimate (contact -- 0.46 m), personal (0.46 -- 1.22 m), and social (1.22 -- 3.66 m). Robot- and victim-eye views of an Inuktun robot were collected as it followed a path around the victim; the path was derived from observations of a prior robot-assisted medical reachback study. The victim's-eye views of the robot from seven points of interest on the path illustrate the appropriateness of each of the five primary non-facial/non-verbal methods of affective expression (body movement, posture, orientation, illuminated color, and sound), offering support for the heuristics as a design aid. Beyond supporting the heuristics, the investigation identified three open research questions on acceptable motions and on the impact of the surroundings on robot affect.
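The zone boundaries quoted in the abstract can be read as a simple distance classifier. The sketch below is illustrative only (the function name `proximity_zone` and the "public" fallback label are assumptions, not part of the paper); it shows how a robot controller might map its current distance from the victim onto the three zones the study uses.

```python
def proximity_zone(distance_m: float) -> str:
    """Classify a robot-victim distance (in meters) into the proxemic
    zones used in the study: intimate (contact -- 0.46 m), personal
    (0.46 -- 1.22 m), and social (1.22 -- 3.66 m)."""
    if distance_m < 0.46:
        return "intimate"   # contact to 0.46 m
    elif distance_m < 1.22:
        return "personal"   # 0.46 m to 1.22 m
    elif distance_m <= 3.66:
        return "social"     # 1.22 m to 3.66 m
    # Beyond 3.66 m falls outside the zones considered in this study.
    return "public"
```

A path planner could call this at each waypoint to decide which affective-expression methods are appropriate at the robot's current proximity to the victim.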




Published In

HRI '07: Proceedings of the ACM/IEEE international conference on Human-robot interaction
March 2007
392 pages
ISBN:9781595936172
DOI:10.1145/1228716

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. affective computing
  2. human-robot interaction
  3. non-verbal communication
  4. proxemics
  5. robotic design guidelines

Qualifiers

  • Article

Conference

HRI '07: ACM/IEEE International Conference on Human-Robot Interaction
March 10-12, 2007
Arlington, Virginia, USA

Acceptance Rates

HRI '07 Paper Acceptance Rate: 22 of 101 submissions, 22%
Overall Acceptance Rate: 268 of 1,124 submissions, 24%


Cited By

  • (2024) "The Effect of Adding Japanese Honorifics When Naming a Driving-Review Robot." Journal of Robotics and Mechatronics 36(6):1577-1591. DOI: 10.20965/jrm.2024.p1577
  • (2024) "Effects of Gait Onomatopoeia on the Impression of Robots." 2024 Joint 13th International Conference on Soft Computing and Intelligent Systems and 25th International Symposium on Advanced Intelligent Systems (SCIS&ISIS), pages 1-6. DOI: 10.1109/SCISISIS61014.2024.10759934
  • (2021) "Emotional Expressions of Real Humanoid Robots and Their Influence on Human Decision-Making in a Finite Iterated Prisoner's Dilemma Game." International Journal of Social Robotics 13(7):1777-1786. DOI: 10.1007/s12369-021-00758-w
  • (2019) "Emotion Expression in HRI." Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction, pages 29-38. DOI: 10.5555/3378680.3378687
  • (2019) "Personalized Synthesis of Intentional and Emotional Non-Verbal Sounds for Social Robots." 2019 8th International Conference on Affective Computing and Intelligent Interaction (ACII), pages 1-7. DOI: 10.1109/ACII.2019.8925487
  • (2018) "Bioluminescence-Inspired Human-Robot Interaction." Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, pages 224-232. DOI: 10.1145/3171221.3171249
  • (2017) "Emotional Expression in Simple Line Drawings of a Robot's Face Leads to Higher Offers in the Ultimatum Game." Frontiers in Psychology 8. DOI: 10.3389/fpsyg.2017.00724
  • (2017) "Affective Grounding in Human-Robot Interaction." Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction, pages 263-273. DOI: 10.1145/2909824.3020224
  • (2014) "Evaluation of Head Gaze Loosely Synchronized With Real-Time Synthetic Speech for Social Robots." IEEE Transactions on Human-Machine Systems 44(6):767-778. DOI: 10.1109/THMS.2014.2342035
  • (2014) "Evaluation of Proxemic Scaling Functions for Social Robotics." IEEE Transactions on Human-Machine Systems 44(3):374-385. DOI: 10.1109/THMS.2014.2304075
