research-article

An empirical pipeline to derive gaze prediction heuristics for 3D action games

Published: 10 November 2010

Abstract

Gaze analysis and prediction in interactive virtual environments, such as games, is challenging: the 3D perspective, variations of the viewpoint, and the current task introduce many variables that affect the distribution of gaze. In this article, we present a novel pipeline to study eye-tracking data acquired from interactive 3D applications. The result of the pipeline is an importance map that scores the amount of gaze spent on each object. This importance map is then used as a heuristic to predict a user's visual attention according to the object properties present at runtime. The novelty of this approach is that the analysis is performed in object space and the importance map is defined in the feature space of high-level properties. High-level properties encode task relevance and other attributes, such as eccentricity, that may have an impact on gaze behavior.
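To make the object-space idea concrete, the sketch below shows one way such an importance map could be built from per-object gaze samples and queried at runtime. This is a minimal Python illustration under stated assumptions; the names (GazeSample, build_importance_map, the choice of task-relevance and eccentricity bins as features) are hypothetical and not taken from the authors' implementation.

    from collections import defaultdict
    from dataclasses import dataclass
    from typing import Dict, Tuple

    @dataclass
    class GazeSample:
        # Hypothetical record of one gaze hit; not the paper's data format.
        object_id: str              # object hit by the gaze ray in this frame
        features: Tuple[int, ...]   # discretized high-level properties, e.g.
                                    # (task_relevance_bin, eccentricity_bin)
        dwell: float                # seconds the gaze rested on the object

    def build_importance_map(samples):
        """Accumulate gaze time per feature combination and normalize it, so the
        map scores how much attention each combination of high-level properties
        attracted over the recorded sessions."""
        gaze_time = defaultdict(float)
        total = 0.0
        for s in samples:
            gaze_time[s.features] += s.dwell
            total += s.dwell
        return {f: t / total for f, t in gaze_time.items()} if total else {}

    def predict_attention(importance_map, visible_objects):
        """At runtime, score each visible object by looking up its current
        feature combination in the learned importance map."""
        return {obj_id: importance_map.get(features, 0.0)
                for obj_id, features in visible_objects.items()}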
The pipeline has been tested with an exemplary study on a first-person shooter game. In particular, a protocol is presented describing the data acquisition procedure, the learning of different importance maps from the data, and finally an evaluation of the performance of the derived gaze predictors. A metric measuring the degree of correlation between attention predicted by the importance map and the actual gaze yielded clearly positive results. The correlation becomes particularly strong when the player is attentive to an in-game task.
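The following is a minimal sketch of the kind of correlation check described above: predicted per-object attention is compared against the measured per-object share of gaze. Using Pearson correlation and these function names is an assumption made for illustration; the paper defines its own metric.

    from math import sqrt

    def pearson(xs, ys):
        """Pearson correlation coefficient between two equal-length sequences."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        var_x = sum((x - mx) ** 2 for x in xs)
        var_y = sum((y - my) ** 2 for y in ys)
        return cov / sqrt(var_x * var_y) if var_x > 0 and var_y > 0 else 0.0

    def evaluate_predictor(predicted, observed):
        """predicted/observed: dicts mapping object id -> predicted score /
        measured share of gaze time; correlate them over the common objects."""
        keys = sorted(predicted.keys() & observed.keys())
        if len(keys) < 2:
            return 0.0
        return pearson([predicted[k] for k in keys],
                       [observed[k] for k in keys])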




        Published In

        ACM Transactions on Applied Perception, Volume 8, Issue 1
        October 2010
        156 pages
        ISSN: 1544-3558
        EISSN: 1544-3965
        DOI: 10.1145/1857893
        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

        Publisher

        Association for Computing Machinery

        New York, NY, United States

        Publication History

        Published: 10 November 2010
        Accepted: 01 December 2009
        Revised: 01 November 2009
        Received: 01 April 2009
        Published in TAP Volume 8, Issue 1


        Author Tags

        1. Gaze analysis
        2. eye-tracking
        3. gaze predictor
        4. high-level properties
        5. importance map
        6. video games
        7. virtual environments
        8. visual attention

        Qualifiers

        • Research-article
        • Research
        • Refereed


        Cited By

        • (2025) Experience, habit, and flow: Games user research and the forgetting of player identity. The Information Society, 1-18. DOI: 10.1080/01972243.2025.2469229. Online publication date: 25-Feb-2025.
        • (2024) Gaze-directed and saliency-guided approaches of stereo camera control in interactive virtual reality. Computers & Graphics 118, 23-32. DOI: 10.1016/j.cag.2023.10.012. Online publication date: Feb-2024.
        • (2021) To See or Not to See. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 5(1), 1-25. DOI: 10.1145/3448123. Online publication date: 30-Mar-2021.
        • (2018) Visual Attention for Rendered 3D Shapes. Computer Graphics Forum 37(2), 191-203. DOI: 10.1111/cgf.13353. Online publication date: 22-May-2018.
        • (2016) Reconstructing User’s Attention on the Web through Mouse Movements and Perception-Based Content Identification. ACM Transactions on Applied Perception 13(3), 1-21. DOI: 10.1145/2912124. Online publication date: 28-May-2016.
        • (2016) Gaze prediction using machine learning for dynamic stereo manipulation in games. 2016 IEEE Virtual Reality (VR), 113-120. DOI: 10.1109/VR.2016.7504694. Online publication date: Mar-2016.
        • (2016) eMot-iCan: Design of an assessment game for emotion recognition in players with Autism. 2016 IEEE International Conference on Serious Games and Applications for Health (SeGAH), 1-7. DOI: 10.1109/SeGAH.2016.7586228. Online publication date: May-2016.
        • (2015) Anticipatory Gaze Shifts during Navigation in a Naturalistic Virtual Environment. Proceedings of the 2015 Annual Symposium on Computer-Human Interaction in Play, 277-283. DOI: 10.1145/2793107.2793136. Online publication date: 5-Oct-2015.
        • (2015) Adapting virtual camera behaviour through player modelling. User Modeling and User-Adapted Interaction 25(2), 155-183. DOI: 10.1007/s11257-015-9156-4. Online publication date: 1-Jun-2015.
        • (2015) Point of Regard from Eye Velocity in Stereoscopic Virtual Environments Based on Intersections of Hypothesis Surfaces. Artificial Life and Computational Intelligence, 125-141. DOI: 10.1007/978-3-319-14803-8_10. Online publication date: 2015.
