DOI: 10.1145/1785455.1785464

KIBITZER: a wearable system for eye-gaze-based mobile urban exploration

Published: 02 April 2010

Abstract

Due to the vast amount of georeferenced information now available, novel techniques for interacting with such content more intuitively and efficiently are increasingly required. In this paper, we introduce KIBITZER, a lightweight wearable system that enables the browsing of urban surroundings for annotated digital information. KIBITZER exploits its user's eye-gaze as a natural indicator of attention to identify objects-of-interest and offers speech- and non-speech auditory feedback. It thus provides the user with a sixth sense for digital georeferenced information. We describe our system's architecture and interaction technique, and outline experiences from first functional trials.
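The paper does not publish source code, but the core interaction it describes, mapping the user's gaze onto georeferenced objects-of-interest, can be sketched in outline: combine the user's position with a compass bearing of the gaze direction and select the point-of-interest whose bearing lies closest to it. The following is a minimal illustration only; the POI names, coordinates, and the angular threshold are hypothetical, not taken from the KIBITZER system.

```python
import math

# Hypothetical georeferenced points-of-interest: (name, latitude, longitude).
POIS = [
    ("Stephansdom", 48.2086, 16.3731),
    ("Rathaus", 48.2108, 16.3572),
]

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle compass bearing from point 1 to point 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0

def select_poi(user_lat, user_lon, gaze_bearing, pois, threshold_deg=10.0):
    """Return the POI whose bearing best matches the gaze direction,
    or None if no POI falls within the angular threshold (in degrees)."""
    best, best_diff = None, threshold_deg
    for name, lat, lon in pois:
        # Smallest signed angular difference, wrapped to [-180, 180].
        diff = abs((bearing_deg(user_lat, user_lon, lat, lon)
                    - gaze_bearing + 180.0) % 360.0 - 180.0)
        if diff <= best_diff:
            best, best_diff = name, diff
    return best
```

A system along these lines would feed `gaze_bearing` from head-mounted eye-tracker and compass data, then trigger speech or non-speech auditory feedback for the returned object; elevation, distance weighting, and sensor noise are deliberately ignored in this sketch.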




    Published In

    AH '10: Proceedings of the 1st Augmented Human International Conference
    April 2010
    175 pages
    ISBN:9781605588254
    DOI:10.1145/1785455
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. eye-gaze
    2. mobile spatial interaction
    3. wearable computing

    Qualifiers

    • Research-article

    Conference

    AH '10

    Acceptance Rates

    AH '10 Paper Acceptance Rate 25 of 46 submissions, 54%;
    Overall Acceptance Rate 121 of 306 submissions, 40%

    Article Metrics

    • Downloads (Last 12 months)10
    • Downloads (Last 6 weeks)0
    Reflects downloads up to 17 Feb 2025


Cited By

• (2024) Eye Tracking in Virtual Reality. Encyclopedia of Computer Graphics and Games, 681-688. DOI: 10.1007/978-3-031-23161-2_170. Online publication date: 5 Jan 2024.
• (2022) Emerging Wearable Biosensor Technologies for Stress Monitoring and Their Real-World Applications. Biosensors, 12(12), 1097. DOI: 10.3390/bios12121097. Online publication date: 30 Nov 2022.
• (2022) A Systematic Review of Visualization Techniques and Analysis Tools for Eye-Tracking in 3D Environments. Frontiers in Neuroergonomics, 3. DOI: 10.3389/fnrgo.2022.910019. Online publication date: 13 Jul 2022.
• (2022) Augmented CBRNE Responder - Directions for Future Research. 13th Augmented Human International Conference, 1-4. DOI: 10.1145/3532525.3532533. Online publication date: 26 May 2022.
• (2021) Gaze-driven placement of items for proactive visual exploration. Journal of Visualization, 25(3), 613-633. DOI: 10.1007/s12650-021-00808-5. Online publication date: 11 Nov 2021.
• (2020) Automatic Museum Audio Guide. Sensors, 20(3), 779. DOI: 10.3390/s20030779. Online publication date: 31 Jan 2020.
• (2018) Eye-tracking (not only) in cognitive cartography [Eye-tracking (nejen) v kognitivní kartografii]. DOI: 10.5507/prf.18.24453132. Online publication date: 2018.
• (2018) Mobile consumer shopping journey in fashion retail. Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, 1-3. DOI: 10.1145/3204493.3208335. Online publication date: 14 Jun 2018.
• (2018) Eye Tracking in Virtual Reality. Encyclopedia of Computer Graphics and Games, 1-8. DOI: 10.1007/978-3-319-08234-9_170-1. Online publication date: 24 Nov 2018.
• (2017) Points of Interest Density Based Zooming Interface for Map Exploration on Smart Glass. Human Interface and the Management of Information: Information, Knowledge and Interaction Design, 208-216. DOI: 10.1007/978-3-319-58521-5_16. Online publication date: 18 May 2017.
