The FaceReader: measuring instant fun of use

Published: 14 October 2006

Abstract

Recently, increasing attention has been paid to emotions in the domain of Human-Computer Interaction: when evaluating a product, one can no longer ignore the emotions it induces. This paper examines the value of a new instrument for measuring emotions, the FaceReader, and assesses the extent to which it is useful when conducting usability evaluations. To do so, we compare the data obtained from the FaceReader with two other sources: user questionnaires and the researcher's loggings. Preliminary analysis shows that the FaceReader is an effective tool for measuring instant emotions and fun of use; however, it needs to be combined with another observation method (e.g. the researcher's loggings). As regards the user questionnaire, our results indicate that it reflects the content of the application or the outcome of a task rather than an accurate self-report of how the user felt while accomplishing the task.
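The comparison the abstract describes, aligning the FaceReader's moment-by-moment emotion classifications with the researcher's task loggings, can be sketched roughly as follows. This is an illustrative reconstruction, not the paper's actual pipeline: the data, the names (`frame_labels`, `task_log`, `dominant_emotion_per_task`), and the one-label-per-second format are all assumptions made for the example.

```python
# Hypothetical sketch (not the paper's pipeline): align per-second
# FaceReader-style emotion labels with logged task windows, then report
# the dominant observed emotion per task for comparison against a
# questionnaire rating.
from collections import Counter

# FaceReader classifies facial expressions into Ekman's basic emotions
# plus "neutral"; here we fabricate a label stream for illustration.
frame_labels = [
    (0, "neutral"), (1, "neutral"), (2, "happy"), (3, "happy"),
    (4, "surprised"), (5, "happy"), (6, "angry"), (7, "angry"), (8, "neutral"),
]

# Researcher's loggings: task name with start/end timestamps (seconds).
task_log = [("task1", 0, 5), ("task2", 6, 8)]

def dominant_emotion_per_task(labels, log):
    """Return the most frequent emotion label within each task window."""
    result = {}
    for task, start, end in log:
        window = [emo for t, emo in labels if start <= t <= end]
        result[task] = Counter(window).most_common(1)[0][0]
    return result

print(dominant_emotion_per_task(frame_labels, task_log))
# → {'task1': 'happy', 'task2': 'angry'}
```

If the dominant emotion during a task diverges from the questionnaire answer given afterwards, that is exactly the kind of discrepancy the abstract reports: the questionnaire tracking task outcome rather than the emotion felt during the task.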



    Published In

    NordiCHI '06: Proceedings of the 4th Nordic conference on Human-computer interaction: changing roles
    October 2006, 517 pages
    ISBN: 1595933255
    DOI: 10.1145/1182475

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. FaceReader
    2. emotions
    3. instant fun of use
    4. usability


    Acceptance Rates

    Overall Acceptance Rate 379 of 1,572 submissions, 24%


    Cited By

    • (2024) Objective Measurement of Experiences in Tourism and Hospitality: A Systematic Review of Methodological Approaches and Best Practices. Journal of Hospitality & Tourism Research 48(8), 1382-1403. DOI: 10.1177/10963480231226086 (8 Feb 2024)
    • (2024) The Challenges of Generating Emotion-Focused Product Idea. Emotion-Driven Innovation, 1-17. DOI: 10.1007/978-3-031-49877-0_1 (16 Feb 2024)
    • (2023) Detect and Interpret: Towards Operationalization of Automated User Experience Evaluation. Design, User Experience, and Usability, 82-100. DOI: 10.1007/978-3-031-35702-2_6 (9 Jul 2023)
    • (2023) A Quantitative Comparison of Manual vs. Automated Facial Coding Using Real Life Observations of Fathers. Pervasive Computing Technologies for Healthcare, 379-396. DOI: 10.1007/978-3-031-34586-9_25 (11 Jun 2023)
    • (2022) Comparing the Effectiveness of Speech and Physiological Features in Explaining Emotional Responses during Voice User Interface Interactions. Applied Sciences 12(3), 1269. DOI: 10.3390/app12031269 (25 Jan 2022)
    • (2021) Improving Human–Robot Interaction by Enhancing NAO Robot Awareness of Human Facial Expression. Sensors 21(19), 6438. DOI: 10.3390/s21196438 (27 Sep 2021)
    • (2021) Facial Expression-Based Experimental Analysis of Human Reactions and Psychological Comfort on Glass Structures in Buildings. Buildings 11(5), 204. DOI: 10.3390/buildings11050204 (14 May 2021)
    • (2021) Remote Facial Expression and Heart Rate Measurements to Assess Human Reactions in Glass Structures. Advances in Civil Engineering 2021(1). DOI: 10.1155/2021/1978111 (5 Nov 2021)
    • (2020) Automated vs. manual pain coding and heart rate estimations based on videos of older adults with and without dementia. Journal of Rehabilitation and Assistive Technologies Engineering 7. DOI: 10.1177/2055668320950196 (21 Sep 2020)
    • (2020) Facilitating the Child–Robot Interaction by Endowing the Robot with the Capability of Understanding the Child Engagement: The Case of Mio Amico Robot. International Journal of Social Robotics 13(4), 677-689. DOI: 10.1007/s12369-020-00661-w (12 Jun 2020)
