Abstract
The increasing number of people playing games on touch-screen mobile phones raises the question of whether touch behaviors reflect players’ emotional states. If so, touch behavior would be valuable not only as an evaluation indicator for game designers, but also as a basis for real-time personalization of the game experience. Psychology studies on acted touch behavior show the existence of discriminative affective profiles. In this article, finger-stroke features during gameplay on an iPod were extracted and their discriminative power analyzed. Machine learning algorithms were used to build systems for automatically discriminating between four emotional states (Excited, Relaxed, Frustrated, Bored), two levels of arousal, and two levels of valence. Accuracy reached 69% to 77% for the four emotional states, and higher accuracy (~89%) was obtained for discriminating between the two levels of arousal and the two levels of valence. We conclude by discussing the factors relevant to generalizing these results to applications other than games.
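The pipeline the abstract describes begins by reducing each finger stroke to a small vector of descriptive features before classification. The sketch below is illustrative only: the `Stroke` sample format and the four features chosen (duration, path length, mean speed, mean pressure) are assumptions for exposition, not the paper's actual feature set.

```python
import math

def stroke_features(samples):
    """Compute simple descriptive features for one finger stroke.

    `samples` is a list of (x, y, t, pressure) tuples, one per touch
    event, ordered by time. The four features returned here are a
    plausible minimal set; the study's actual features may differ.
    """
    xs, ys, ts, ps = zip(*samples)
    duration = ts[-1] - ts[0]
    # Path length: sum of Euclidean distances between consecutive points.
    length = sum(
        math.hypot(x2 - x1, y2 - y1)
        for (x1, y1), (x2, y2) in zip(zip(xs, ys), zip(xs[1:], ys[1:]))
    )
    return {
        "duration": duration,
        "length": length,
        "mean_speed": length / duration if duration > 0 else 0.0,
        "mean_pressure": sum(ps) / len(ps),
    }
```

Feature vectors of this form would then be fed to a standard classifier (e.g., one class per emotional state, or a binary split for arousal or valence), trained on labeled gameplay sessions.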
What Does Touch Tell Us about Emotions in Touchscreen-Based Gameplay?