DOI: 10.1145/2909824.3020239

Research Article · Open Access

Expressing Emotions through Color, Sound, and Vibration with an Appearance-Constrained Social Robot

Published: 6 March 2017

ABSTRACT

Many researchers are now studying interactive modalities such as facial expressions, natural language, and gestures, which make communication between robots and people more natural. However, many robots currently in use are appearance-constrained and unable to produce facial expressions or gestures. Moreover, although humanoid-oriented techniques are promising, they are time-consuming and costly, which poses technical difficulties for most research studies. To increase interactive efficiency and decrease costs, we instead focus on three interaction modalities and their combinations: color, sound, and vibration. We conducted a structured study to evaluate the effects of these three modalities on humans' emotional perception of our simple-shaped robot "Maru." Our findings offer insights into human-robot affective interaction that are particularly useful for appearance-constrained social robots. The contribution of this work lies not so much in the explicit parameter settings as in a deeper understanding of how to express emotions through the simple modalities of color, sound, and vibration, together with a set of recommended expressions that HRI researchers and practitioners can readily employ.
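To make the modality-combination idea concrete, the sketch below shows how an expression layer for such a robot might look in code. It is illustrative only: the `Robot` interface, the emotion set, and every parameter value are hypothetical assumptions for the sake of the example, not the recommended settings evaluated in the paper.

```python
from dataclasses import dataclass

@dataclass
class Expression:
    """One multimodal emotional expression: LED color, sound, and vibration."""
    color_rgb: tuple[int, int, int]  # LED color
    pitch_hz: float                  # base frequency of a simple beep
    beep_rate_hz: float              # how often the beep repeats per second
    vibration_level: float           # motor intensity in [0.0, 1.0]

# Hypothetical placeholder mappings; the paper derives its recommended
# expressions empirically from a user study.
EXPRESSIONS = {
    "joy":     Expression((255, 200, 0), 880.0, 4.0, 0.6),  # warm, bright, lively
    "sadness": Expression((0, 0, 180),   220.0, 0.5, 0.1),  # cool, low, slow
    "anger":   Expression((255, 0, 0),   440.0, 6.0, 0.9),  # red, harsh, strong
    "calm":    Expression((0, 180, 120), 330.0, 1.0, 0.2),  # green, soft, gentle
}

class Robot:
    """Stub for an appearance-constrained robot's actuators (hypothetical API)."""
    def set_led(self, rgb: tuple[int, int, int]) -> None:
        print(f"LED -> {rgb}")
    def play_beep(self, pitch_hz: float, rate_hz: float) -> None:
        print(f"beep at {pitch_hz} Hz, {rate_hz} beeps/s")
    def vibrate(self, level: float) -> None:
        print(f"vibration level {level}")

def express(robot: Robot, emotion: str) -> None:
    """Drive all three modalities at once for the requested emotion."""
    e = EXPRESSIONS[emotion]
    robot.set_led(e.color_rgb)
    robot.play_beep(e.pitch_hz, e.beep_rate_hz)
    robot.vibrate(e.vibration_level)

if __name__ == "__main__":
    express(Robot(), "joy")
```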


Published in

HRI '17: Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction
March 2017, 510 pages
ISBN: 9781450343367
DOI: 10.1145/2909824

Copyright © 2017 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States


Acceptance Rates

HRI '17 paper acceptance rate: 51 of 211 submissions, 24%. Overall acceptance rate: 242 of 1,000 submissions, 24%.
