
Discerning Ambient/Focal Attention with Coefficient K

Published: 10 May 2016

Abstract

We introduce coefficient K, defined on a novel parametric scale and derived from a conventionally eye-tracked time course of eye movements. Positive and negative ordinates of K indicate focal and ambient viewing, respectively, while the abscissa indicates time, so that K acts as a dynamic indicator of the fluctuation between ambient and focal visual behavior. The coefficient expresses the difference between a fixation's duration and the amplitude of its subsequent saccade in standard deviation units, facilitating parametric statistical testing. To validate K empirically, we test its ability to capture ambient and focal attention during serial and parallel visual search tasks (Study 1). We then show how K quantitatively depicts the difference in scanning behaviors when attention is guided by audio description during the perception of art (Study 2).
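Following the definition above, K contrasts each fixation's duration with the amplitude of the saccade that follows it, with both quantities standardized (z-scored) over the viewing session, so positive values suggest focal viewing (long fixations, short saccades) and negative values ambient viewing. A minimal Python sketch of that computation, assuming durations and amplitudes are pre-paired per fixation (the function name and sample values are illustrative, not from the article):

```python
import statistics


def coefficient_K(fix_durations, sacc_amplitudes):
    """Per-fixation K: z-scored fixation duration minus the z-scored
    amplitude of the following saccade.

    fix_durations[i] (e.g., ms) is paired with sacc_amplitudes[i]
    (e.g., degrees), the saccade immediately after fixation i; the
    caller should drop a final fixation with no subsequent saccade.
    """
    mu_d, sd_d = statistics.mean(fix_durations), statistics.stdev(fix_durations)
    mu_a, sd_a = statistics.mean(sacc_amplitudes), statistics.stdev(sacc_amplitudes)
    return [
        (d - mu_d) / sd_d - (a - mu_a) / sd_a
        for d, a in zip(fix_durations, sacc_amplitudes)
    ]


# Illustrative data: progressively longer fixations with shorter saccades
# drive K from negative (ambient) toward positive (focal).
ks = coefficient_K([100, 200, 300], [3, 2, 1])
```

Plotting these per-fixation values against time yields the dynamic ambient/focal indicator described in the abstract.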



Published in

ACM Transactions on Applied Perception, Volume 13, Issue 3
May 2016, 137 pages
ISSN: 1544-3558
EISSN: 1544-3965
DOI: 10.1145/2912576

      Copyright © 2016 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      • Published: 10 May 2016
      • Accepted: 1 November 2015
      • Revised: 1 October 2015
      • Received: 1 August 2014

      Qualifiers

      • research-article
      • Research
      • Refereed
