DOI: 10.1145/1647314.1647379

GaZIR: gaze-based zooming interface for image retrieval

Published: 02 November 2009

Abstract

We introduce GaZIR, a gaze-based interface for browsing and searching images. The system computes online predictions of image relevance from implicit feedback, and when the user zooms in, the images predicted to be most relevant are brought forward. The key novelty is that relevance feedback is inferred from implicit cues obtained in real time from the user's gaze pattern, using an estimator learned during a separate training phase. The natural zooming interface can be connected to any content-based information retrieval engine that operates on user feedback. Experiments on one such engine show that gaze patterns carry a sufficient amount of information to make the estimated relevance feedback a viable complement to, or even replacement for, explicit point-and-click feedback.
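The pipeline described in the abstract, as a rough illustration, maps raw gaze data to per-image relevance scores and then uses those scores to choose which images to bring forward on zoom. The sketch below is not the authors' code: the feature set (dwell time and fixation count), the logistic estimator, and its weights are illustrative assumptions standing in for the estimator the paper learns in its training phase.

```python
# Hypothetical sketch of gaze-based implicit relevance feedback,
# in the spirit of the GaZIR pipeline (not the authors' implementation).
# Assumption: fixations arrive as (image_id, duration_ms) pairs.
import math
from collections import defaultdict

def gaze_features(fixations):
    """Aggregate per-image gaze features: [total dwell ms, fixation count]."""
    feats = defaultdict(lambda: [0.0, 0])
    for image_id, duration_ms in fixations:
        feats[image_id][0] += duration_ms
        feats[image_id][1] += 1
    return feats

def predict_relevance(feats, w_dwell=0.004, w_count=0.3, bias=-2.0):
    """Map gaze features to a pseudo-relevance score in (0, 1) with a
    logistic model. The weights are placeholders; in the paper this
    estimator is learned offline in a separate training phase."""
    scores = {}
    for image_id, (dwell, count) in feats.items():
        z = bias + w_dwell * dwell + w_count * count
        scores[image_id] = 1.0 / (1.0 + math.exp(-z))
    return scores

def images_to_zoom(scores, k=2):
    """On zoom-in, bring forward the k images predicted most relevant."""
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Toy gaze log: the user dwells longest on "img_cat".
fixations = [("img_cat", 420), ("img_car", 90), ("img_cat", 380),
             ("img_dog", 150), ("img_car", 60), ("img_cat", 200)]
ranked = images_to_zoom(predict_relevance(gaze_features(fixations)))
print(ranked)  # most-dwelled image ranked first
```

These pseudo-relevance scores would then be fed to the retrieval engine exactly as explicit point-and-click feedback would be, which is why the interface can sit in front of any feedback-driven content-based retrieval engine.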




Published In

ICMI-MLMI '09: Proceedings of the 2009 international conference on Multimodal interfaces
November 2009
374 pages
ISBN:9781605587721
DOI:10.1145/1647314
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. gaze-based interface
  2. image retrieval
  3. implicit feedback
  4. zooming interface

Qualifiers

  • Poster

Conference

ICMI-MLMI '09

Acceptance Rates

Overall acceptance rate: 453 of 1,080 submissions (42%)


Cited By

  • (2022) Lessons Learned from an Eye Tracking Study for Targeted Advertising in the Wild. 2022 IEEE International Conference on Pervasive Computing and Communications Workshops and other Affiliated Events (PerCom Workshops), pages 539-544. DOI: 10.1109/PerComWorkshops53856.2022.9767470
  • (2021) Is This Really Relevant? A Guide to Best Practice Gaze-based Relevance Prediction Research. Adjunct Proceedings of the 29th ACM Conference on User Modeling, Adaptation and Personalization, pages 220-228. DOI: 10.1145/3450614.3464476
  • (2021) Conditioning Gaze-Contingent Systems for the Real World: Insights from a Field Study in the Fast Food Industry. Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems, pages 1-7. DOI: 10.1145/3411763.3451658
  • (2021) Exploring Gaze-Based Prediction Strategies for Preference Detection in Dynamic Interface Elements. Proceedings of the 2021 Conference on Human Information Interaction and Retrieval, pages 129-139. DOI: 10.1145/3406522.3446013
  • (2021) The Subconscious Director: Dynamically Personalizing Videos Using Gaze Data. Proceedings of the 26th International Conference on Intelligent User Interfaces, pages 98-108. DOI: 10.1145/3397481.3450679
  • (2020) Detecting Relevance during Decision-Making from Eye Movements for UI Adaptation. ACM Symposium on Eye Tracking Research and Applications, pages 1-11. DOI: 10.1145/3379155.3391321
  • (2019) Deep Spatio-Temporal Modeling for Object-Level Gaze-Based Relevance Assessment. 2019 27th European Signal Processing Conference (EUSIPCO), pages 1-5. DOI: 10.23919/EUSIPCO.2019.8902990
  • (2017) Gaze movement-driven random forests for query clustering in automatic video annotation. Multimedia Tools and Applications, 76(2):2861-2889. DOI: 10.5555/3048787.3048837
  • (2017) Assessing the Usability of Gaze-Adapted Interface against Conventional Eye-Based Input Emulation. 2017 IEEE 30th International Symposium on Computer-Based Medical Systems (CBMS), pages 793-798. DOI: 10.1109/CBMS.2017.155
  • (2016) Catching Relevance in One Glimpse. Proceedings of the International Working Conference on Advanced Visual Interfaces, pages 324-325. DOI: 10.1145/2909132.2926078
