ABSTRACT
A growing number of studies in the area of Human-Computer Interaction (HCI) attest to the importance of taking emotional factors into account in interactions with computer systems. By learning the emotions of users, artificial agents can influence human feelings with a view to stimulating users in particular or everyday activities. Thus, one of the great challenges of the HCI area is to enable computer systems to recognize and interpret users' feelings. This article sets out a functional Ensemble model for classifying emotions on the basis of users' facial motor expressions. The results described in this article show that the proposed Ensemble classification can achieve higher accuracy in classifying feelings than can be obtained with a single classifier.
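To illustrate the general idea behind the abstract's claim — combining several base classifiers tends to outperform any single one — here is a minimal, hypothetical sketch using scikit-learn's `VotingClassifier`. It is not the authors' actual model: the feature vectors are synthetic stand-ins for facial-expression measurements, and the choice of base learners (decision tree, naive Bayes, logistic regression) is an assumption for illustration only.

```python
# Illustrative sketch of ensemble vs. single-classifier comparison.
# Synthetic data stands in for facial-expression feature vectors;
# this is NOT the paper's model or dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Three "emotion" classes over ten numeric features (hypothetical).
X, y = make_classification(n_samples=400, n_features=10, n_informative=6,
                           n_classes=3, n_clusters_per_class=1,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Baseline: a single classifier.
single = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)

# Ensemble: soft voting averages the per-class probabilities
# of heterogeneous base learners.
ensemble = VotingClassifier(
    estimators=[("tree", DecisionTreeClassifier(random_state=0)),
                ("nb", GaussianNB()),
                ("lr", LogisticRegression(max_iter=1000))],
    voting="soft",
).fit(X_tr, y_tr)

print(f"single tree accuracy:     {single.score(X_te, y_te):.3f}")
print(f"voting ensemble accuracy: {ensemble.score(X_te, y_te):.3f}")
```

Soft voting is used here because it lets confident base learners outweigh uncertain ones; hard (majority) voting is the simpler alternative when base classifiers do not expose calibrated probabilities.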
Index Terms
- Exploiting the Use of Ensemble Classifiers to Enhance the Precision of User's Emotion Classification