DOI: 10.1145/3136457.3136461
research-article

A perceptual evaluation of social interaction with emotes and real-time facial motion capture

Published: 08 November 2017

ABSTRACT

Social interaction between players is an important feature in online games, where text and voice chat are standard ways to communicate. To express emotions, players can type emotes: text-based commands that play animations on the player avatar. This paper presents a perceptual evaluation investigating whether expressing emotions with the face instead, captured in real time with a web camera, is perceived as more realistic and is preferred over typing emote-based text commands. A user study with 24 participants was conducted in which the two methods were compared. For both methods, participants ranked the realism of facial expressions based on the seven universal emotions proposed by the American psychologist Paul Ekman: happiness, anger, fear, sadness, disgust, surprise, and contempt. Participants also ranked their perceived efficiency with each method and selected the method they preferred. Analyzing the realism rankings revealed a significant difference between expressions: happiness was perceived as the most realistic with both methods, while disgust and sadness were rated poorly when performed with the face. Overall, realism and preference showed no significant difference between the two methods; however, participants performed better when typing emotes. Real-time facial capture technology also needs improvement to achieve better recognition and tracking of facial features.
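The emote mechanism the abstract describes, a typed chat command that triggers an avatar animation, can be sketched as follows. This is a minimal illustration, not the paper's implementation; all command and animation names are hypothetical.

```python
from typing import Optional

# Hypothetical mapping from emote commands to avatar animation clips,
# covering Ekman's seven universal emotions listed in the abstract.
EMOTE_ANIMATIONS = {
    "/happy": "anim_happiness",
    "/angry": "anim_anger",
    "/fear": "anim_fear",
    "/sad": "anim_sadness",
    "/disgust": "anim_disgust",
    "/surprise": "anim_surprise",
    "/contempt": "anim_contempt",
}


def handle_chat_input(text: str) -> Optional[str]:
    """Return the animation to play if the input is an emote command,
    otherwise None (the text is treated as ordinary chat)."""
    command = text.strip().lower()
    return EMOTE_ANIMATIONS.get(command)


print(handle_chat_input("/happy"))  # -> anim_happiness
print(handle_chat_input("hello"))   # -> None
```

By contrast, the face-tracking condition in the study replaces this lookup with a webcam-driven expression classifier, so no typing is required.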

References

  1. Sofien Bouaziz, Yangang Wang, and Mark Pauly. 2013. Online Modeling for Realtime Facial Animation. ACM Trans. Graph. 32, 4, Article 40 (July 2013), 10 pages.
  2. Chen Cao, Hongzhi Wu, Yanlin Weng, Tianjia Shao, and Kun Zhou. 2016. Real-time Facial Animation with Image-based Dynamic Avatars. ACM Trans. Graph. 35, 4, Article 126 (July 2016), 12 pages.
  3. Daybreak Game Company. 2017. EverQuest 2 official website. (2017). Retrieved May 16, 2017 from https://www.everquest2.com/home
  4. Paul Ekman and Wallace V. Friesen. 2003. Unmasking the Face: A Guide to Recognizing Emotions from Facial Expressions. Malor Books, Los Altos, CA.
  5. Cathy Ennis, Ludovic Hoyet, Arjan Egges, and Rachel McDonnell. 2013. Emotion Capture: Emotionally Expressive Characters for Games. In Proceedings of Motion on Games (MIG '13). ACM, New York, NY, USA, Article 31, 8 pages.
  6. Epic Games. 2017. Advanced Social System on Unreal Engine's Marketplace. (2017). Retrieved April 24, 2017 from https://www.unrealengine.com/marketplace/advanced-social-system
  7. Faceware Tech. 2017. Faceware Technologies website. (2017). Retrieved May 16, 2017 from http://facewaretech.com/products/software/realtime-live
  8. Image Metrics. 2017. Image Metrics website. (2017). Retrieved May 16, 2017 from http://image-metrics.com/
  9. Kakao Games. 2017. Black Desert Online official website. (2017). Retrieved May 16, 2017 from https://www.blackdesertonline.com
  10. Elena Kokkinara and Rachel McDonnell. 2015. Animation Realism Affects Perceived Character Appeal of a Self-virtual Face. In Proceedings of the 8th ACM SIGGRAPH Conference on Motion in Games (MIG '15). ACM, New York, NY, USA, 221--226.
  11. Jacob Nahin. 2012. Image Metrics Launches Live Driver. (Feb. 2012). Retrieved May 16, 2017 from http://www.businesswire.com/news/home/20120208005505/en/Image-Metrics-Launches-Live-Driver
  12. Mark Pauly. 2013. Realtime Performance-Based Facial Avatars for Immersive Gameplay. In Proceedings of Motion on Games (MIG '13). ACM, New York, NY, USA, Article 23, 6 pages.
  13. Paul Viola and Michael J. Jones. 2004. Robust Real-Time Face Detection. Int. J. Comput. Vision 57, 2 (May 2004), 137--154.
  14. Thibaut Weise, Sofien Bouaziz, Hao Li, and Mark Pauly. 2011. Realtime Performance-based Facial Animation. ACM Trans. Graph. 30, 4, Article 77 (July 2011), 10 pages.
  15. Wikipedia. 2016. Wikipedia article about Emotes in Games. (Nov. 2016). Retrieved May 16, 2017 from https://en.wikipedia.org/wiki/Emote
  16. Jihun Yu and Jungwoon Park. 2016. Real-time Facial Tracking in Virtual Reality. In SIGGRAPH ASIA 2016 VR Showcase (SA '16). ACM, New York, NY, USA, Article 4, 1 page.
  17. ZAM Network. 2012. ZAM article about SOEmote in EverQuest 2. (Aug. 2012). Retrieved May 16, 2017 from http://eq2.zam.com/wiki/SOEmote
  18. Zenimax Media. 2017. The Elder Scrolls Online official website. (2017). Retrieved May 16, 2017 from http://www.elderscrollsonline.com
  19. Ce Zhan, Wanqing Li, Philip Ogunbona, and Farzad Safaei. 2008. A Real-time Facial Expression Recognition System for Online Games. Int. J. Comput. Games Technol. 2008, Article 10 (Jan. 2008), 7 pages.

Published in

MIG '17: Proceedings of the 10th International Conference on Motion in Games
November 2017, 128 pages
ISBN: 9781450355414
DOI: 10.1145/3136457

Copyright © 2017 ACM

          Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

          Publisher

          Association for Computing Machinery

          New York, NY, United States



Acceptance Rates

Overall Acceptance Rate: 9 of 9 submissions, 100%
