ABSTRACT
Many researchers now study interactive modalities such as facial expressions, natural language, and gestures, which make communication between robots and people more natural. However, many robots currently in use are appearance-constrained and cannot perform facial expressions or gestures. Moreover, although humanoid-oriented techniques are promising, they are time-consuming and costly, which creates technical difficulties for much of this research. To increase interactive efficiency and reduce cost, we instead focus on three interaction modalities and their combinations: color, sound, and vibration. We conduct a structured study to evaluate the effects of these three modalities on a human's emotional perception of our simple-shaped robot "Maru." Our findings offer insights into human-robot affective interaction that are particularly useful for appearance-constrained social robots. The contribution of this work lies not so much in the explicit parameter settings as in a deeper understanding of how to express emotions through the simple modalities of color, sound, and vibration, together with a set of recommended expressions that HRI researchers and practitioners can readily employ.
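As a rough illustration of how such recommended expressions might be operationalized on an appearance-constrained robot, the sketch below maps a few emotions onto color, sound, and vibration parameters. This is a minimal, hypothetical example: the robot interface (set_led_color, play_tone, vibrate) and all numeric values are illustrative placeholders, not the parameter settings or recommendations reported in the paper.

```python
# Minimal sketch of an emotion-to-modality mapping for an
# appearance-constrained robot. The robot API used here
# (set_led_color, play_tone, vibrate) and every parameter value
# are hypothetical illustrations, not the paper's settings.
from dataclasses import dataclass


@dataclass
class Expression:
    rgb: tuple[int, int, int]  # LED color
    pitch_hz: float            # tone frequency for a semantic-free utterance
    vibration: float           # vibration intensity, 0.0 to 1.0


# Illustrative placements inspired by the valence-arousal circumplex.
EXPRESSIONS = {
    "joy":     Expression(rgb=(255, 200, 0), pitch_hz=660.0, vibration=0.6),
    "anger":   Expression(rgb=(255, 0, 0),   pitch_hz=220.0, vibration=0.9),
    "sadness": Expression(rgb=(0, 0, 180),   pitch_hz=180.0, vibration=0.1),
    "calm":    Expression(rgb=(0, 200, 120), pitch_hz=330.0, vibration=0.0),
}


def express(robot, emotion: str) -> None:
    """Drive all three modalities at once for the given emotion."""
    e = EXPRESSIONS[emotion]
    robot.set_led_color(*e.rgb)
    robot.play_tone(e.pitch_hz)
    robot.vibrate(e.vibration)
```

In practice, the mapping table is the part a study like this would calibrate; combining modalities is then a single lookup followed by simultaneous actuation, as in express() above.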