Research article. DOI: 10.5555/1402383.1402415

A model of gaze for the purpose of emotional expression in virtual embodied agents

Published: 12 May 2008

Abstract

State-of-the-art virtual agents currently lack the ability to display emotion as seen in actual humans, or even in hand-animated characters. One reason for this emotional inexpressiveness is the absence of an emotionally expressive gaze manner. For virtual agents to express emotion that observers can empathize with, they must generate gaze (including eye, head, and torso movement) toward arbitrary targets while displaying arbitrary emotional states. Our previous work [18] describes the Gaze Warping Transformation, a method, derived from human movement data, for generating emotionally expressive head and torso movement during gaze shifts. An evaluation showed that applying different transformations to the same gaze shift changed the affective state that human observers perceived in the transformed shift. In this paper we propose a model of realistic, emotionally expressive gaze that builds on the Gaze Warping Transformation by improving the transformation's implementation and by adding a model of eye movement drawn from the visual neuroscience literature. We describe how to generate a gaze shift to an arbitrary target while displaying an arbitrary emotional behavior. Finally, we propose an evaluation to determine which emotions human observers attribute to the generated gaze shifts. Once this work is complete, virtual agents will have access to a new channel for emotionally expressive behavior.
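The abstract does not specify the Gaze Warping Transformation itself, but the general family of motion-warping techniques it draws on (cf. Witkin and Popović [34]) can be sketched: a captured joint-angle trajectory is retimed and rescaled so that the same gaze shift reads with a different expressive quality. The sketch below is a hypothetical illustration of that generic idea only, not the paper's actual method; `warp_motion`, its parameters, and the sample data are all assumptions.

```python
import numpy as np

def warp_motion(frames, time_scale, amplitude_scale):
    """Warp a 1-D joint-angle trajectory (e.g. head pitch during a gaze
    shift) by stretching its timing and scaling its displacement from the
    starting pose. Illustrative only; the paper derives its transformations
    from captured human movement data."""
    frames = np.asarray(frames, dtype=float)
    n = len(frames)
    m = max(2, int(round(n * time_scale)))  # frame count on the new timeline
    # Resample the curve onto the stretched (or compressed) timeline.
    src = np.linspace(0.0, n - 1, m)
    resampled = np.interp(src, np.arange(n), frames)
    # Scale the movement's magnitude about the initial pose.
    return resampled[0] + amplitude_scale * (resampled - resampled[0])

# A neutral 5-frame head-pitch curve (degrees), warped to be slower and
# larger in amplitude -- the kind of change one might read as more emphatic.
neutral = [0.0, 5.0, 10.0, 12.0, 12.0]
expressive = warp_motion(neutral, time_scale=2.0, amplitude_scale=1.5)
```

Separating timing from amplitude in this way is what lets the same underlying gaze shift be rendered with different perceived qualities, which is the property the paper's evaluation probes.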

References

[1] Amaya, K., Bruderlin, A., Calvert, T. 1996. Emotion from Motion. Proceedings of Graphics Interface '96. 222--229.
[2] Argyle, M., Cook, M. 1976. Gaze and Mutual Gaze. Cambridge University Press.
[3] Bickmore, T., Cassell, J. 2004. Social Dialogue with Embodied Conversational Agents. In Bernsen, N. (ed.) Natural, Intelligent and Effective Interaction with Multimodal Dialogue Systems. Kluwer Academic. 23--54.
[4] Brand, M., Hertzmann, A. 2000. Style Machines. Proceedings of SIGGRAPH. ACM Press, New York.
[5] Busso, C., Deng, Z., Neumann, U., Narayanan, S. S. 2005. Natural Head Motion Synthesis Driven by Acoustic Prosodic Features. Computer Animation and Virtual Worlds, Vol. 16, No. 3--4. 283--290.
[6] Carney, D. 2005. Beliefs About the Nonverbal Expression of Social Power. Journal of Nonverbal Behavior, Vol. 29, No. 2. 105--123.
[7] Coulson, M. 2004. Attributing Emotion to Static Body Postures: Recognition Accuracy, Confusions, and Viewpoint Dependence. Journal of Nonverbal Behavior, Vol. 28, No. 2.
[8] Douglas-Cowie, E., Campbell, A., Cowie, R., Roach, P. 2003. Emotional Speech: Towards a New Generation of Databases. Speech Communication, Vol. 40, No. 1--2. 33--60.
[9] Exline, R. 1974. Visual Interaction: The Glances of Power and Preference. In Weitz, S. (ed.) Nonverbal Communication: Readings with Commentary. Oxford University Press, Oxford.
[10] Fukayama, A., Ohno, T., Mukawa, N., Sawaki, M., Hagita, N. 2002. Messages Embedded in Gaze of Interface Agents: Impression Management with Agent's Gaze. Proceedings of SIGCHI.
[11] Gebhard, P. 2005. ALMA: A Layered Model of Affect. Proceedings of AAMAS. ACM Press, New York.
[12] Gresty, M. A. 1974. Coordination of Head and Eye Movements to Fixate Continuous and Intermittent Targets. Vision Research, Vol. 14. 395--403.
[13] Guitton, D., Volle, M. 1987. Gaze Control in Humans: Eye-Head Coordination During Orienting Movements to Targets Within and Beyond the Oculomotor Range. Journal of Neurophysiology, Vol. 58, No. 3. 427--459.
[14] Itti, L., Dhavale, N., Pighin, F. 2003. Realistic Avatar Eye and Head Animation Using a Neurobiological Model of Visual Attention. SPIE 48th Annual International Symposium on Optical Science and Technology.
[15] Kipp, M., Neff, M., Kipp, K. H., Albrecht, I. 2007. Towards Natural Gesture Synthesis: Evaluating Gesture Units in a Data-Driven Approach to Gesture Synthesis. Proceedings of Intelligent Virtual Agents. 15--28.
[16] Kleinke, C. 1986. Gaze and Eye Contact: A Research Review. Psychological Bulletin, Vol. 100, No. 1. 78--100.
[17] Lance, B., Marsella, S., Koizumi, D. 2004. Towards Expressive Gaze Manner in Embodied Virtual Agents.
[18] Lance, B., Marsella, S. 2007. Emotionally Expressive Head and Body Movement During Gaze Shifts. Proceedings of Intelligent Virtual Agents. 72--85.
[19] Lee, S., Badler, J., Badler, N. 2002. Eyes Alive. ACM Transactions on Graphics, Vol. 21, No. 3. 637--644.
[20] Leigh, R. J., Zee, D. 2006. The Neurology of Eye Movements, 4th ed. Oxford University Press.
[21] Liu, C. K., Hertzmann, A., Popović, Z. 2005. Learning Physics-Based Motion Style with Nonlinear Inverse Optimization. Proceedings of SIGGRAPH. ACM Press, New York.
[22] Mehrabian, A. 1981. Silent Messages: Implicit Communication of Emotions and Attitudes, 2nd ed. Wadsworth Publishing Company.
[23] Mehrabian, A., Ksionzky, S. 1972. Some Determiners of Social Interaction. Sociometry, Vol. 35. 588--609.
[24] de Meijer, M. 1989. The Contribution of General Features of Body Movement to the Attribution of Emotions. Journal of Nonverbal Behavior, Vol. 13, No. 4. 247--268.
[25] Mignault, A., Chaudhuri, A. 2003. The Many Faces of a Neutral Face: Head Tilt and Perception of Dominance and Emotion. Journal of Nonverbal Behavior, Vol. 27, No. 2.
[26] Paterson, H., Pollick, F., Sanford, A. 2001. The Role of Velocity in Affect Discrimination. Proceedings of the 23rd Annual Conference of the Cognitive Science Society.
[27] Pelachaud, C., Bilvi, M. 2003. Modelling Gaze Behavior for Conversational Agents. Proceedings of Intelligent Virtual Agents 2003. LNAI series, Springer, Heidelberg.
[28] Pelachaud, C., Poggi, I. 2002. Subtleties of Facial Expressions in Embodied Agents. The Journal of Visualization and Computer Animation, Vol. 13, No. 5. 301--312.
[29] Rickel, J., Johnson, W. L. 1999. Animated Agents for Procedural Training in Virtual Reality: Perception, Cognition, and Motor Control. Applied Artificial Intelligence, Vol. 13, No. 4--5. 343--382.
[30] Sobczynski, P. 2004. Review of "The Polar Express". http://www.efilmcritic.com/review.php?movie=10918&reviewer=389. November 2004.
[31] Stark, L., Zangemeister, W. H., Edwards, J., Grinberg, J., Jones, A., Lehman, S., Lubock, P., Narayan, V., Nystrom, M. 1980. Head Rotation Trajectories Compared with Eye Saccades by Main Sequence Relationships. Investigative Ophthalmology and Visual Science, Vol. 19, No. 8. 986--988.
[32] Thomas, F., Johnston, O. 1981. The Illusion of Life: Disney Animation. Walt Disney Productions.
[33] Wallbott, H. G. 1998. Bodily Expression of Emotion. European Journal of Social Psychology, Vol. 28. 879--896.
[34] Witkin, A., Popović, Z. 1995. Motion Warping. Proceedings of SIGGRAPH. ACM Press, New York.
[35] Zee, D. 1976. Disorders of Eye-Head Coordination. In Brooks, B. A., Bajandas, F. J. (eds.) Eye Movements: ARVO Symposium. Plenum Press. 9--40.
[36] Zhao, J., Badler, N. 1994. Inverse Kinematics Positioning Using Nonlinear Programming for Highly Articulated Figures. ACM Transactions on Graphics, Vol. 13, No. 4. 313--336.
[37] Zhao, L., Badler, N. 2005. Acquiring and Validating Motion Qualities from Live Limb Gestures. Graphical Models, Vol. 67, No. 1. 1--16.



Published In

AAMAS '08: Proceedings of the 7th International Joint Conference on Autonomous Agents and Multiagent Systems - Volume 1. May 2008. 565 pages. ISBN 9780981738109.

Publisher: International Foundation for Autonomous Agents and Multiagent Systems, Richland, SC.
    Author Tags

    1. animation
    2. emotional expression
    3. gaze
    4. motion capture
    5. nonverbal
    6. posture
    7. virtual agent

Conference

AAMAS '08. Overall acceptance rate: 1,155 of 5,036 submissions (23%).

Cited By

  • (2021) Agent's Internal State Expression Related to Desire and Suppress Based on Behavior and Physiological Expression. Proceedings of the 9th International Conference on Human-Agent Interaction. DOI: 10.1145/3472307.3484683. 417--422. Published 9 Nov 2021.
  • (2019) EVA: Generating Emotional Behavior of Virtual Agents using Expressive Features of Gait and Gaze. ACM Symposium on Applied Perception 2019. DOI: 10.1145/3343036.3343129. 1--10. Published 19 Sep 2019.
  • (2019) Guiding gaze. Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications. DOI: 10.1145/3314111.3319848. 1--9. Published 25 Jun 2019.
  • (2015) Eye movement synthesis with 1/f pink noise. Proceedings of the 8th ACM SIGGRAPH Conference on Motion in Games. DOI: 10.1145/2822013.2822014. 47--56. Published 16 Nov 2015.
  • (2012) Taming Mona Lisa. ACM Transactions on Interactive Intelligent Systems, Vol. 1, No. 2. DOI: 10.1145/2070719.2070724. 1--25. Published 13 Jan 2012.
  • (2009) Real-time expressive gaze animation for virtual humans. Proceedings of the 8th International Conference on Autonomous Agents and Multiagent Systems - Volume 1. DOI: 10.5555/1558013.1558057. 321--328. Published 10 May 2009.
  • (2009) Emotional gaze behavior generation in human-agent interaction. CHI '09 Extended Abstracts on Human Factors in Computing Systems. DOI: 10.1145/1520340.1520556. 3691--3696. Published 4 Apr 2009.
