
How late can you update gaze-contingent multiresolutional displays without detection?

Published: 12 December 2007

Abstract

This study investigated perceptual disruptions in gaze-contingent multiresolutional displays (GCMRDs) due to delays in updating the center of highest resolution after an eye movement. GCMRDs can be used to save processing resources and transmission bandwidth in many types of single-user display applications, such as virtual reality, video-telephony, simulators, and remote piloting. The current study found that image update delays as late as 60 ms after an eye movement did not significantly increase the detectability of image blur and/or motion transients due to the update. This is good news for designers of GCMRDs, since 60 ms is ample time to update many GCMRDs after an eye movement without disrupting perception. The study also found that longer eye movements led to greater blur and/or transient detection due to moving the eyes further into the low-resolution periphery, effectively reducing the image resolution at fixation prior to the update. In GCMRD applications where longer saccades are more likely (e.g., displays with relatively large distances between objects), this problem could be overcome by increasing the size of the region of highest resolution.
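To make the 60 ms budget concrete, the sketch below shows how a GCMRD rendering loop might recenter its region of highest resolution when a saccade lands and check whether it met that deadline. This is a minimal illustration, not the apparatus used in the study; the SaccadeEnd event, the render_update callback, and the constant names are hypothetical stand-ins for an eye tracker's saccade-offset signal and a multiresolutional renderer.

import time
from dataclasses import dataclass

# Hypothetical stand-ins for an eye tracker's saccade-offset event and a
# multiresolutional renderer; neither interface comes from the paper.

@dataclass
class SaccadeEnd:
    x: float          # landing position on the display (pixels)
    y: float
    timestamp: float  # seconds, on the same clock as time.monotonic()

UPDATE_DEADLINE_S = 0.060  # 60 ms post-saccade window reported as safe in this study

def recenter_high_res(saccade: SaccadeEnd, render_update) -> bool:
    """Move the region of highest resolution to the new fixation point.

    Returns True if the update finished within 60 ms of saccade offset,
    False if it was late (the update still happens, but the blur and/or
    motion transient may become detectable).
    """
    render_update(saccade.x, saccade.y)  # recenter the high-resolution region
    latency = time.monotonic() - saccade.timestamp
    return latency <= UPDATE_DEADLINE_S

if __name__ == "__main__":
    # Toy usage with a no-op renderer, as if a saccade just landed at (512, 384).
    event = SaccadeEnd(x=512.0, y=384.0, timestamp=time.monotonic())
    print("within 60 ms window:", recenter_high_res(event, lambda x, y: None))

A real system would subscribe to the tracker's saccade-offset events rather than poll, but the timing check is the same: a late return value flags that the update may have become noticeable rather than aborting it.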

Published In

ACM Transactions on Multimedia Computing, Communications, and Applications, Volume 3, Issue 4
December 2007
147 pages
ISSN: 1551-6857
EISSN: 1551-6865
DOI: 10.1145/1314303
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 12 December 2007
Accepted: 01 August 2007
Received: 01 August 2007
Published in TOMM Volume 3, Issue 4

Author Tags

  1. Gaze-contingent
  2. area of interest
  3. bandwidth
  4. blur detection
  5. contrast thresholds
  6. display updates
  7. eye movements
  8. eye tracking
  9. foveated
  10. foveation
  11. level-of-detail
  12. multiresolution
  13. perceptual compression
  14. peripheral vision
  15. saccades
  16. saccadic suppression
  17. visual perception

Qualifiers

  • Research-article
  • Research
  • Refereed
