DOI: 10.1145/985692.985699 — CHI Conference Proceedings
Article

Categorical imperative NOT: facial affect is perceived continuously

Published: 25 April 2004

ABSTRACT

Facial affect (or emotion) recognition is a central issue for many VMC and naturalistic computing applications. Most computational models assume "categorical perception" of facial affect, in which a benign illusion promotes robust recognition of emotional expressions even under severe degradation conditions, including temporal compression. However, this applied interest in human facial affect perception is coming at a time when the evidence for categorical perception is being challenged in the basic research literature, largely on methodological grounds. The research presented here systematically addresses the classic evidence for categorical perception of facial affect, using high-quality digital imaging and display technologies and improved research methods. In doing so, it illustrates a fruitful convergence of basic and applied research. The evidence does NOT support categorical perception of facial affect, which in turn underlines the importance of preserving high-fidelity motion information in portraying emotion. This research provides new human behavioral data on facial affect perception, and underscores the importance of careful consideration of facial affect compression methods.

