
What Does Touch Tell Us about Emotions in Touchscreen-Based Gameplay?

Published: 01 December 2012

Abstract

The increasing number of people playing games on touch-screen mobile phones raises the question of whether touch behaviors reflect players’ emotional states. Such information would be valuable not only as an evaluation indicator for game designers, but also for real-time personalization of the game experience. Psychology studies on acted touch behavior show the existence of discriminative affective profiles. In this article, finger-stroke features during gameplay on an iPod were extracted and their discriminative power analyzed. Machine learning algorithms were used to build systems for automatically discriminating between four emotional states (Excited, Relaxed, Frustrated, Bored), two levels of arousal, and two levels of valence. Accuracy reached between 69% and 77% for the four emotional states, and higher results (~89%) were obtained for discriminating between two levels of arousal and between two levels of valence. We conclude by discussing the factors relevant to generalizing the results to applications other than games.
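The abstract does not spell out what a finger-stroke feature is. As a rough illustration, a stroke logged as a sequence of timestamped touch points can be reduced to a fixed-length descriptor covering, for example, its length, duration, speed, and pressure. The sketch below is a minimal, hypothetical version of such feature extraction; the TouchPoint fields and the particular feature set are assumptions for illustration, not the authors’ actual pipeline.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class TouchPoint:
    t: float  # timestamp (seconds)
    x: float  # screen x coordinate (pixels)
    y: float  # screen y coordinate (pixels)
    p: float  # contact pressure or area, normalized to 0..1

def stroke_features(points):
    """Reduce one finger stroke (a list of TouchPoints) to a fixed-length
    vector: [length, duration, mean speed, mean pressure, peak pressure].
    Hypothetical feature set, chosen only to illustrate the idea."""
    length = sum(hypot(b.x - a.x, b.y - a.y)
                 for a, b in zip(points, points[1:]))
    duration = points[-1].t - points[0].t
    speed = length / duration if duration > 0 else 0.0
    pressures = [pt.p for pt in points]
    return [length, duration, speed,
            sum(pressures) / len(pressures), max(pressures)]

# Example: a short, fast swipe sampled at three touch events.
swipe = [TouchPoint(0.00, 10.0, 10.0, 0.4),
         TouchPoint(0.05, 40.0, 12.0, 0.6),
         TouchPoint(0.10, 90.0, 15.0, 0.5)]
print(stroke_features(swipe))
```

One such vector per stroke is what a downstream classifier would consume.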



    Reviews

    Michael G. Murphy

    Touch is an important means of expressing emotion in everyday life, but how do we express emotions in games played using a touchscreen? In this monograph, Gao et al. carefully investigate this question. The popularity of gaming in a touch-based environment, such as a smartphone or tablet, means that the results of this investigation may affect game design in general, and may enable reactive personalization for the player in particular. The authors use machine learning techniques to capture finger-stroke factors and assess four emotional states (excited, relaxed, frustrated, and bored), two arousal levels (low and high), and two valence levels (positive and negative). After providing introductory motivation, the authors present a careful review of the literature on touch behavior and its connection to emotion. This is followed by the experimental protocol for collecting and labeling data to build and test their emotion recognition system. An analysis of touch features and how they relate to player emotion leads into a discussion of the building and testing of models for automated recognition. The underlying classification algorithms include "discriminant analysis (DA), artificial neural network (ANN) with back propagation, and support vector machine (SVM) classifiers." The paper concludes with a discussion of the results and how these techniques might apply in other contexts. Nine figures and nine tables provide details on the many technical aspects of this study, and 88 references indicate the careful preparation that went into this monograph. Readers interested in human-computer interfaces, games, and app design will benefit from this well-organized, insightful, and technically precise publication.

    Online Computing Reviews Service
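For readers who want a concrete picture of the classification step the review describes, the sketch below applies one of the three classifier families named, an SVM, to per-stroke feature vectors. The data is synthetic and every configuration detail (feature dimensionality, kernel, train/test split) is an assumption made for illustration, not the setup the authors report.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
labels = ["Excited", "Relaxed", "Frustrated", "Bored"]

# Synthetic stand-in for per-stroke feature vectors (e.g. length,
# duration, speed, pressure): 50 strokes per emotional state, each
# state centered differently so the toy problem is learnable.
X = np.vstack([rng.normal(loc=i, scale=0.8, size=(50, 4))
               for i in range(len(labels))])
y = np.repeat(labels, 50)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

# Standardize features, then fit an RBF-kernel SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_tr, y_tr)
print(f"Toy 4-class accuracy: {clf.score(X_te, y_te):.2f}")
```

Substituting sklearn.discriminant_analysis.LinearDiscriminantAnalysis or sklearn.neural_network.MLPClassifier (a backpropagation-trained network) for SVC covers the other two classifier families the review lists, changing only the final pipeline stage.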


    • Published in

      ACM Transactions on Computer-Human Interaction, Volume 19, Issue 4
      December 2012, 236 pages
      ISSN: 1073-0516
      EISSN: 1557-7325
      DOI: 10.1145/2395131

      Copyright © 2012 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      • Published: 1 December 2012
      • Accepted: 1 September 2012
      • Revised: 1 June 2012
      • Received: 1 March 2012


      Qualifiers

      • research-article
      • Research
      • Refereed
