Abstract
We introduce coefficient K, a novel parametric measure derived from the eye-tracked time course of eye movements. Positive and negative values of K indicate focal and ambient viewing, respectively, while the abscissa represents time, so that K acts as a dynamic indicator of fluctuation between ambient and focal visual behavior. The coefficient expresses the difference between each fixation's duration and the amplitude of its subsequent saccade in standard-deviation units, facilitating parametric statistical testing. To validate K empirically, we test its ability to capture ambient and focal attention during serial and parallel visual search tasks (Study 1). We then show how K quantitatively depicts the difference in scanning behaviors when attention is guided by audio description during the perception of art (Study 2).
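The definition above — fixation duration minus subsequent saccade amplitude, each standardized — can be sketched in code. This is a minimal illustration, not the authors' implementation; it assumes fixation i is followed by saccade i, and the function name `coefficient_k` is ours:

```python
import numpy as np

def coefficient_k(durations, amplitudes):
    """Sketch of coefficient K per fixation: z-scored fixation duration
    minus z-scored amplitude of the following saccade.

    durations  -- fixation durations (e.g., in seconds)
    amplitudes -- saccade amplitudes (e.g., in degrees), where
                  amplitudes[i] is the saccade following durations[i]
    """
    d = np.asarray(durations, dtype=float)
    a = np.asarray(amplitudes, dtype=float)
    # Standardize each series to zero mean and unit (sample) variance,
    # then take the per-fixation difference.
    k = (d - d.mean()) / d.std(ddof=1) - (a - a.mean()) / a.std(ddof=1)
    return k
```

Under this formulation, K_i > 0 flags a relatively long fixation followed by a relatively short saccade (focal viewing), K_i < 0 the reverse (ambient viewing), and the sequence of K_i over time yields the dynamic indicator the abstract describes.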