DOI: 10.1145/1385569.1385589

The inspection of very large images by eye-gaze control

Published: 28 May 2008

Abstract

The increasing availability and accuracy of eye gaze detection equipment has encouraged its use for both investigation and control. In this paper we present novel methods for navigating and inspecting extremely large images solely or primarily using eye gaze control. We investigate the relative advantages and comparative properties of four related methods: Stare-to-Zoom (STZ), in which control of the image position and resolution level is determined solely by the user's gaze position on the screen; Head-to-Zoom (HTZ) and Dual-to-Zoom (DTZ), in which gaze control is augmented by head or mouse actions; and Mouse-to-Zoom (MTZ), using conventional mouse input as an experimental control.
The need to inspect large images occurs in many disciplines, such as mapping, medicine, astronomy and surveillance. Here we consider the inspection of very large aerial images, exemplified by Google Earth, which is the image source employed in our study. We perform comparative search and navigation tasks with each of the methods described, and record user opinions using the Swedish User-Viewer Presence Questionnaire. We conclude that, while gaze methods are effective for image navigation, they as yet lag behind more conventional methods, and interaction designers may well consider combining these techniques for greatest effect.
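The abstract describes the Stare-to-Zoom idea only at a high level: gaze position alone drives both panning and the resolution level. The paper's actual control law is not given here, so the following is a hypothetical minimal sketch of one way such a gaze-contingent pan/zoom loop could work, assuming a tracker that delivers gaze points in normalised screen coordinates (all names, gains and the dead-zone heuristic are illustrative, not taken from the paper):

```python
from dataclasses import dataclass
import math

@dataclass
class View:
    cx: float   # image-space x of the view centre
    cy: float   # image-space y of the view centre
    zoom: float # magnification factor (1.0 = fully zoomed out)

def stare_to_zoom_step(view, gaze, dt, pan_gain=1.0, zoom_rate=0.5, dead_zone=0.05):
    """One update of a hypothetical stare-to-zoom controller.

    gaze: gaze point in normalised screen coordinates, (0, 0) = screen centre.
    Off-centre gaze pans the view so the gazed-at content drifts toward the
    centre; a stare near the centre (inside the dead zone) zooms in instead.
    """
    gx, gy = gaze
    if math.hypot(gx, gy) < dead_zone:
        # Staring at the centre: increase magnification smoothly over time.
        view.zoom *= math.exp(zoom_rate * dt)
    else:
        # Gaze is off-centre: pan toward it; divide by zoom so that panning
        # speed in image space shrinks as magnification grows.
        view.cx += pan_gain * gx * dt / view.zoom
        view.cy += pan_gain * gy * dt / view.zoom
    return view
```

Under this sketch, Head-to-Zoom or Dual-to-Zoom variants would replace the dead-zone test with an explicit head movement or mouse action as the zoom trigger, leaving the gaze-driven panning branch unchanged.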


Published In

AVI '08: Proceedings of the working conference on Advanced visual interfaces
May 2008
483 pages
ISBN: 9781605581415
DOI:10.1145/1385569

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. eye-gaze control
  2. image space navigation
  3. user interaction studies
  4. visual interaction

Qualifiers

  • Research-article

Conference

AVI '08

Acceptance Rates

Overall Acceptance Rate 128 of 490 submissions, 26%

Cited By

  • (2018) Interaction for Immersive Analytics. In: Immersive Analytics, 95-138. DOI: 10.1007/978-3-030-01388-2_4
  • (2017) Enhancing Zoom and Pan in Ultrasound Machines with a Multimodal Gaze-based Interface. Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems, 1648-1654. DOI: 10.1145/3027063.3053174
  • (2016) Towards cognitively grounded gaze-controlled interfaces. Personal and Ubiquitous Computing, 20(6), 1035-1047. DOI: 10.1007/s00779-016-0970-4
  • (2015) Look & Pedal. Proceedings of the 2015 ACM International Conference on Multimodal Interaction, 123-130. DOI: 10.1145/2818346.2820751
  • (2015) Sequential and simultaneous tactile stimulation with multiple actuators on head, neck and back for gaze cuing. 2015 IEEE World Haptics Conference (WHC), 333-338. DOI: 10.1109/WHC.2015.7177734
  • (2014) U-Remo. CHI '14 Extended Abstracts on Human Factors in Computing Systems, 1609-1614. DOI: 10.1145/2559206.2581215
  • (2013) A review of eye-tracking applications as tools for training. Cognition, Technology and Work, 15(3), 313-327. DOI: 10.1007/s10111-012-0234-7
  • (2012) Computer Control by Gaze. In: Gaze Interaction and Applications of Eye Tracking, 78-102. DOI: 10.4018/978-1-61350-098-9.ch009
  • (2012) Look & touch. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2981-2990. DOI: 10.1145/2207676.2208709
  • (2012) Investigating gaze-supported multimodal pan and zoom. Proceedings of the Symposium on Eye Tracking Research and Applications, 357-360. DOI: 10.1145/2168556.2168636
