ABSTRACT
Several vision-based systems for automatic emotion recognition have been proposed in the literature. However, most of these systems are evaluated only under controlled laboratory conditions, which poorly represent the constraints of real-world ecological situations. This paper describes two studies. In the first, we evaluate whether two robust vision-based measures (approach-avoidance detection and quantity of motion) can discriminate between different emotions in a dataset of acted facial expressions recorded under uncontrolled conditions. In the second, we evaluate, on the same dataset, the accuracy of commercially available software designed for automatic emotion recognition under controlled conditions. Results show that the evaluated measures can discriminate between different emotions under uncontrolled conditions. We also report the accuracy of the evaluated commercial software.
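To make the two measures named above concrete, the following is a minimal sketch of how they are commonly computed in vision-based affect research: quantity of motion as the fraction of pixels changing between consecutive grayscale frames, and approach-avoidance as the trend of the detected face area over time (a growing face suggests approach toward the camera, a shrinking one avoidance). This is an illustrative implementation under assumed definitions, not the paper's actual pipeline; all function names, the threshold value, and the slope-based classification rule are assumptions.

```python
import numpy as np

def quantity_of_motion(prev_frame, frame, threshold=15):
    """Fraction of pixels whose grayscale intensity changed by more than
    `threshold` between two consecutive frames. A common proxy for overall
    movement energy; the threshold value is illustrative."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    moving = diff > threshold
    return moving.sum() / moving.size

def approach_avoidance(face_areas):
    """Classify a sequence of per-frame face bounding-box areas as
    'approach' (face grows over time) or 'avoidance' (face shrinks),
    using the sign of a degree-1 least-squares fit. Assumed heuristic."""
    slope = np.polyfit(np.arange(len(face_areas)), face_areas, 1)[0]
    return "approach" if slope > 0 else "avoidance"

# Synthetic example: a bright square shifts down by two pixels.
prev = np.zeros((64, 64), dtype=np.uint8)
prev[10:20, 10:20] = 255
curr = np.zeros((64, 64), dtype=np.uint8)
curr[12:22, 10:20] = 255
qom = quantity_of_motion(prev, curr)  # small positive value
trend = approach_avoidance([100.0, 120.0, 145.0])  # 'approach'
```

In practice the face area would come from a face detector and the frames from a camera stream; the point of both measures is that they degrade gracefully under the uncontrolled lighting and pose conditions the paper studies, unlike fine-grained facial-feature trackers.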
Index Terms
- Evaluation of vision-based real-time measures for emotions discrimination under uncontrolled conditions