A first evaluation study of a database of kinetic facial expressions (DaFEx)

ABSTRACT
In this paper we present DaFEx (Database of Facial Expressions), a database created to provide a benchmark for evaluating the facial expressivity of Embodied Conversational Agents (ECAs). DaFEx consists of 1008 short videos containing emotional facial expressions of the six Ekman emotions plus the neutral expression. The expressions were recorded by eight professional actors (male and female) in two acting conditions ("utterance" and "no-utterance") and at three intensity levels (high, medium, low). The properties of DaFEx were studied by having 80 subjects classify the emotion expressed in each video. High accuracy rates were obtained for most of the emotions displayed. We also tested the effects of intensity level, of the articulatory movements due to speech, and of the actors' and subjects' gender on classification accuracy. The results showed that decoding accuracy decreases as the intensity of the emotion decreases; that the presence of articulatory movements negatively affects the recognition of fear, surprise, and the neutral expression, while improving the recognition of anger; and that facial expressions seem to be recognized (slightly) better when acted by actresses than by actors.
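The evaluation procedure described above amounts to computing a per-emotion decoding accuracy: each subject labels each video with one of the seven categories, and the recognition rate for an emotion is the fraction of judgments that match the intended expression. The sketch below illustrates this computation on hypothetical data; the record format and the tiny sample of judgments are assumptions for illustration, not taken from the DaFEx study itself.

```python
from collections import defaultdict

# Hypothetical judgment records: (intended_emotion, judged_emotion).
# In a DaFEx-style study, each record would be one subject's label for
# one video; the label set is Ekman's six emotions plus "neutral".
judgments = [
    ("anger", "anger"), ("anger", "disgust"),
    ("fear", "surprise"), ("fear", "fear"),
    ("happiness", "happiness"), ("neutral", "neutral"),
]

def decoding_accuracy(records):
    """Per-emotion recognition rate: correct judgments / total judgments."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for intended, judged in records:
        total[intended] += 1
        if judged == intended:
            correct[intended] += 1
    return {emotion: correct[emotion] / total[emotion] for emotion in total}

print(decoding_accuracy(judgments))
# anger and fear: 0.5 (one of two correct); happiness, neutral: 1.0
```

The same per-emotion tallies, broken down further by intensity level, acting condition, or gender, would support the factor analyses the abstract reports.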