Enhancement of human computer interaction with facial electromyographic sensors

Research article. DOI: 10.1145/1738826.1738914
Published: 23 November 2009

Abstract

In this paper we describe a way to enhance human computer interaction using facial electromyographic (EMG) sensors. Knowing the emotional state of the user enables interaction that adapts to the user's mood, so Human Computer Interaction (HCI) gains in ergonomics and ecological validity. While video-based expression recognition systems need exaggerated facial expressions to reach high recognition rates, the technique we developed using electrophysiological data detects facial expressions faster, even when the movements are subtle. Features were extracted from 8 EMG sensors located around the face. Gaussian models for six basic facial expressions - anger, surprise, disgust, happiness, sadness and neutral - were learnt from these features and yield a mean recognition rate of 92%. Finally, we developed a prototype of one possible application of this system, in which the output of the recognizer was sent to the expression module of a 3D avatar that then mimicked the user's expression.
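The pipeline the abstract describes (multi-channel EMG features, one Gaussian model per expression, maximum-likelihood decision) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature choice (per-channel root-mean-square amplitude) and the diagonal-covariance Gaussians are assumptions, since the abstract does not specify those details.

```python
import numpy as np

N_CHANNELS = 8  # the paper uses 8 EMG sensors around the face
EXPRESSIONS = ["anger", "surprise", "disgust", "happiness", "sadness", "neutral"]

def extract_features(emg_window):
    """Per-channel RMS amplitude of an EMG window (shape: channels x samples).

    RMS is one common EMG feature; the paper's actual features may differ.
    """
    return np.sqrt(np.mean(np.asarray(emg_window, dtype=float) ** 2, axis=1))

class GaussianExpressionClassifier:
    """One diagonal-covariance Gaussian per expression; pick the max likelihood."""

    def fit(self, X, y):
        X, y = np.asarray(X, dtype=float), np.asarray(y)
        self.models = {}
        for label in np.unique(y):
            feats = X[y == label]
            # Small variance floor keeps the log-likelihood finite.
            self.models[label] = (feats.mean(axis=0), feats.var(axis=0) + 1e-6)
        return self

    def predict(self, x):
        x = np.asarray(x, dtype=float)
        def log_likelihood(mu, var):
            return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
        return max(self.models, key=lambda c: log_likelihood(*self.models[c]))

# Toy usage with synthetic feature vectors standing in for real EMG features.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.2, 0.05, (20, N_CHANNELS)),   # "neutral" cluster
               rng.normal(1.0, 0.05, (20, N_CHANNELS))])  # "happiness" cluster
y = ["neutral"] * 20 + ["happiness"] * 20
clf = GaussianExpressionClassifier().fit(X, y)
print(clf.predict(np.full(N_CHANNELS, 0.95)))
```

With real data, each training example would be `extract_features` applied to a labelled window of the 8-channel recording, one model learnt per expression class.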




    Published In

    OZCHI '09: Proceedings of the 21st Annual Conference of the Australian Computer-Human Interaction Special Interest Group: Design: Open 24/7
    November 2009
    445 pages
    ISBN:9781605588544
    DOI:10.1145/1738826

    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. EMG
    2. Gaussian models
    3. facial expressions

    Qualifiers

    • Research-article

    Conference

    OZCHI '09

    Acceptance Rates

    OZCHI '09 Paper Acceptance Rate 32 of 60 submissions, 53%;
    Overall Acceptance Rate 362 of 729 submissions, 50%


    Cited By

    • (2024) Framework for the Classification of Facial Emotions Using Soft Computing Techniques. Current Signal Transduction Therapy 19(1). DOI: 10.2174/0115743624273918240102060402. Mar 2024.
    • (2024) An IoT Based Electromyography Signal Transmission from sEMG Electrodes to Client's Server with ICT Infrastructure. 2024 10th International Conference on Advanced Computing and Communication Systems (ICACCS), 510-514. DOI: 10.1109/ICACCS60874.2024.10717233. 14 Mar 2024.
    • (2023) Facial Motion Capture System Based on Facial Electromyogram and Electrooculogram for Immersive Social Virtual Reality Applications. Sensors 23(7): 3580. DOI: 10.3390/s23073580. 29 Mar 2023.
    • (2021) SonicFace. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 5(4): 1-33. DOI: 10.1145/3494988. 30 Dec 2021.
    • (2021) Investigating User Perceptions Towards Wearable Mobile Electromyography. Human-Computer Interaction - INTERACT 2021, 339-360. DOI: 10.1007/978-3-030-85610-6_20. 30 Aug 2021.
    • (2020) WiFace: Facial Expression Recognition Using Wi-Fi Signals. IEEE Transactions on Mobile Computing. DOI: 10.1109/TMC.2020.3001989. 2020.
    • (2020) Real-Time Recognition of Facial Expressions Using Facial Electromyograms Recorded Around the Eyes for Social Virtual Reality Applications. IEEE Access 8: 62065-62075. DOI: 10.1109/ACCESS.2020.2983608. 2020.
    • (2019) Recognition of Emotion Through Facial Expressions Using EMG Signal. 2019 International Conference on Nascent Technologies in Engineering (ICNTE), 1-6. DOI: 10.1109/ICNTE44896.2019.8945843. Jan 2019.
    • (2018) Robust Facial Expression Recognition for MuCI. IEEE Transactions on Affective Computing 9(1): 102-115. DOI: 10.1109/TAFFC.2016.2569098. 1 Jan 2018.
    • (2018) Machine-learning approaches for recognizing muscle activities involved in facial expressions captured by multi-channels surface electromyogram. Smart Health 5-6: 15-25. DOI: 10.1016/j.smhl.2017.11.002. Jan 2018.
