DOI: 10.1145/2522628.2522633

Emotion Capture: Emotionally Expressive Characters for Games

Published: 06 November 2013

ABSTRACT

It has been shown that humans are sensitive to the portrayal of emotion by virtual characters. However, previous work in this area has often examined this sensitivity using extreme examples of facial or body animation. Less is known about how attuned people are to emotions as they are expressed during conversational communication. To determine whether body or facial motion is the better indicator of emotional expression for game characters, we conduct a perceptual experiment using synchronized full-body and facial motion-capture data. We find that people can recognize emotions from either modality alone, but that combining facial and body motion is preferable for creating more expressive characters.


Published in

MIG '13: Proceedings of Motion on Games
November 2013
30 pages
ISBN: 9781450325462
DOI: 10.1145/2522628

Copyright © 2013 ACM

        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

        Publisher

        Association for Computing Machinery

        New York, NY, United States


        Qualifiers

        • tutorial
        • Research
        • Refereed limited

        Acceptance Rates

MIG '13 Paper Acceptance Rate: 9 of 9 submissions, 100%. Overall Acceptance Rate: 9 of 9 submissions, 100%.

