ABSTRACT
Social interaction between players is an important feature of online games, where text and voice chat are the standard ways to communicate. To express emotions, players can type emotes: text-based commands that play animations on the player's avatar. This paper presents a perceptual evaluation investigating whether expressing emotions with the face instead, in real time through a web camera, is perceived as more realistic and is preferred over typing emote-based text commands. A user study with 24 participants was conducted in which the two methods were compared. For both methods, participants ranked the realism of facial expressions based on the seven universal emotions proposed by the American psychologist Paul Ekman: happiness, anger, fear, sadness, disgust, surprise, and contempt. Participants also ranked their perceived efficiency with each method and selected the method they preferred. Analysis of the realism rankings showed a significant difference among the individual emotions: happiness was perceived as the most realistic with both methods, while disgust and sadness were rated poorly when performed with the face. Overall, however, neither realism nor preference differed significantly between the two methods, although participants performed better when typing emotes. Real-time facial capture technology also needs improvement to recognize and track facial features in the human face more reliably.
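The emote mechanism described above can be sketched as a simple lookup from chat input to an avatar animation. This is a minimal illustration only; the command and animation names below are hypothetical, loosely modeled on Ekman's seven universal emotions, and real games (e.g., EverQuest 2's SOEmote system) define their own command sets.

```python
# Hypothetical mapping from emote commands to avatar animation
# identifiers; all names here are illustrative, not from any real game.
EMOTE_ANIMATIONS = {
    "/happy": "anim_happiness",
    "/angry": "anim_anger",
    "/fear": "anim_fear",
    "/sad": "anim_sadness",
    "/disgust": "anim_disgust",
    "/surprise": "anim_surprise",
    "/contempt": "anim_contempt",
}

def handle_chat_input(text):
    """Return the animation to play if the input is a known emote
    command, otherwise None (the text is treated as ordinary chat)."""
    stripped = text.strip()
    if not stripped:
        return None
    command = stripped.split()[0].lower()
    return EMOTE_ANIMATIONS.get(command)

print(handle_chat_input("/happy"))  # anim_happiness
print(handle_chat_input("hello"))   # None
```

In contrast, the facial-capture method replaces this explicit command step with automatic expression recognition, which is what the user study compares against.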
REFERENCES
- Sofien Bouaziz, Yangang Wang, and Mark Pauly. 2013. Online Modeling for Realtime Facial Animation. ACM Trans. Graph. 32, 4, Article 40 (July 2013), 10 pages.
- Chen Cao, Hongzhi Wu, Yanlin Weng, Tianjia Shao, and Kun Zhou. 2016. Real-time Facial Animation with Image-based Dynamic Avatars. ACM Trans. Graph. 35, 4, Article 126 (July 2016), 12 pages.
- Daybreak Game Company. 2017. EverQuest 2 official website. (2017). Retrieved May 16, 2017 from https://www.everquest2.com/home
- Paul Ekman and Wallace V. Friesen. 2003. Unmasking The Face: A Guide to Recognizing Emotions from Facial Expressions. Malor Books, Los Altos, CA.
- Cathy Ennis, Ludovic Hoyet, Arjan Egges, and Rachel McDonnell. 2013. Emotion Capture: Emotionally Expressive Characters for Games. In Proceedings of Motion in Games (MIG '13). ACM, New York, NY, USA, Article 31, 8 pages.
- Epic Games. 2017. Advanced Social System on Unreal Engine's Marketplace. (2017). Retrieved April 24, 2017 from https://www.unrealengine.com/marketplace/advanced-social-system
- Faceware Tech. 2017. Faceware Technologies website. (2017). Retrieved May 16, 2017 from http://facewaretech.com/products/software/realtime-live
- Image Metrics. 2017. Image Metrics website. (2017). Retrieved May 16, 2017 from http://image-metrics.com/
- Kakao Games. 2017. Black Desert Online official website. (2017). Retrieved May 16, 2017 from https://www.blackdesertonline.com
- Elena Kokkinara and Rachel McDonnell. 2015. Animation Realism Affects Perceived Character Appeal of a Self-virtual Face. In Proceedings of the 8th ACM SIGGRAPH Conference on Motion in Games (MIG '15). ACM, New York, NY, USA, 221--226.
- Jacob Nahin. 2012. Image Metrics Launches Live Driver. (Feb. 2012). Retrieved May 16, 2017 from http://www.businesswire.com/news/home/20120208005505/en/Image-Metrics-Launches-Live-Driver
- Mark Pauly. 2013. Realtime Performance-Based Facial Avatars for Immersive Gameplay. In Proceedings of Motion in Games (MIG '13). ACM, New York, NY, USA, Article 23, 6 pages.
- Paul Viola and Michael J. Jones. 2004. Robust Real-Time Face Detection. Int. J. Comput. Vision 57, 2 (May 2004), 137--154.
- Thibaut Weise, Sofien Bouaziz, Hao Li, and Mark Pauly. 2011. Realtime Performance-based Facial Animation. ACM Trans. Graph. 30, 4, Article 77 (July 2011), 10 pages.
- Wikipedia. 2016. Wikipedia article about Emotes in Games. (Nov. 2016). Retrieved May 16, 2017 from https://en.wikipedia.org/wiki/Emote
- Jihun Yu and Jungwoon Park. 2016. Real-time Facial Tracking in Virtual Reality. In SIGGRAPH ASIA 2016 VR Showcase (SA '16). ACM, New York, NY, USA, Article 4, 1 page.
- ZAM Network. 2012. ZAM article about SOEmote in EverQuest 2. (Aug. 2012). Retrieved May 16, 2017 from http://eq2.zam.com/wiki/SOEmote
- Zenimax Media. 2017. The Elder Scrolls Online official website. (2017). Retrieved May 16, 2017 from http://www.elderscrollsonline.com
- Ce Zhan, Wanqing Li, Philip Ogunbona, and Farzad Safaei. 2008. A Real-time Facial Expression Recognition System for Online Games. Int. J. Comput. Games Technol. 2008, Article 10 (Jan. 2008), 7 pages.
Index Terms
- A perceptual evaluation of social interaction with emotes and real-time facial motion capture