Article
DOI: 10.1145/1095034.1095043

ViewPointer: lightweight calibration-free eye tracking for ubiquitous handsfree deixis

Published: 23 October 2005

Abstract

We introduce ViewPointer, a wearable eye contact sensor that detects deixis towards ubiquitous computers embedded in real-world objects. ViewPointer consists of a small wearable camera no more obtrusive than a common Bluetooth headset. It allows any real-world object to be augmented with eye contact sensing simply by embedding a small infrared (IR) tag. The headset camera detects when a user is looking at a tag by determining whether the tag's reflection on the cornea of the user's eye appears sufficiently central to the pupil. ViewPointer not only allows any object to become an eye contact sensing appliance; it also allows identification of users and transmission of data to the user through the object. We present a novel encoding scheme for uniquely identifying ViewPointer tags, as well as a method for transmitting URLs over tags. We present a number of application scenarios and an analysis of design principles. We conclude that eye contact sensing input is best utilized to provide context to action.
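The core test the abstract describes — declaring eye contact when a tag's corneal reflection appears sufficiently central to the pupil — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the coordinate inputs, and the tolerance value are all assumptions, and the upstream pupil/glint detection is taken as given.

```python
import math

def is_eye_contact(pupil_center, glint_center, pupil_radius, tolerance=0.5):
    """Hypothetical sketch of ViewPointer's centrality test.

    pupil_center, glint_center: (x, y) image coordinates of the pupil
    centre and of the IR tag's corneal reflection (glint), as produced
    by some upstream detector. Eye contact is declared when the glint
    lies within tolerance * pupil_radius of the pupil centre; the 0.5
    tolerance is an illustrative assumption, not a value from the paper.
    """
    dx = glint_center[0] - pupil_center[0]
    dy = glint_center[1] - pupil_center[1]
    return math.hypot(dx, dy) <= tolerance * pupil_radius

# A glint 2 px from a 10 px-radius pupil counts as eye contact;
# one 20 px away does not.
print(is_eye_contact((100, 100), (102, 100), 10))  # True
print(is_eye_contact((100, 100), (120, 100), 10))  # False
```

Because the decision depends only on the relative position of the glint and the pupil in the headset camera's image, no per-user calibration or world-coordinate gaze estimation is needed, which is what makes the approach lightweight.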



    Published In

UIST '05: Proceedings of the 18th Annual ACM Symposium on User Interface Software and Technology
October 2005, 270 pages
ISBN: 1595932712
DOI: 10.1145/1095034

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. attentive user interface
    2. eye tracking


Acceptance Rates

    UIST '05 paper acceptance rate: 31 of 159 submissions (19%)
    Overall acceptance rate: 561 of 2,567 submissions (22%)



Cited By

    • (2023) Assessing Eye Tracking for Continuous Central Field Loss Monitoring. Proceedings of the 22nd International Conference on Mobile and Ubiquitous Multimedia, pp. 54–64. DOI: 10.1145/3626705.3627776
    • (2022) A new robust multivariate mode estimator for eye-tracking calibration. Behavior Research Methods 55(2), pp. 516–553. DOI: 10.3758/s13428-022-01809-4
    • (2022) Visual Typer for the Handicapped. 2022 Advances in Science and Engineering Technology International Conferences (ASET), pp. 1–6. DOI: 10.1109/ASET53988.2022.9735081
    • (2022) Learnability evaluation of the markup language for designing applications controlled by gaze. International Journal of Human-Computer Studies 165(C). DOI: 10.1016/j.ijhcs.2022.102863
    • (2019) SMAC. Proceedings of the ACM on Human-Computer Interaction 3(EICS), pp. 1–47. DOI: 10.1145/3300961
    • (2019) Visual-Based Eye Contact Detection in Multi-Person Interactions. 2019 International Conference on Content-Based Multimedia Indexing (CBMI), pp. 1–6. DOI: 10.1109/CBMI.2019.8877471
    • (2018) A Study on the Use of Eye Tracking to Adapt Gameplay and Procedural Content Generation in First-Person Shooter Games. Multimodal Technologies and Interaction 2(2), 23. DOI: 10.3390/mti2020023
    • (2018) Robust eye contact detection in natural multi-person interactions using gaze and speaking behaviour. Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, pp. 1–10. DOI: 10.1145/3204493.3204549
    • (2017) Eyemirror. Proceedings of the 16th International Conference on Mobile and Ubiquitous Multimedia, pp. 279–291. DOI: 10.1145/3152832.3152839
    • (2017) Everyday Eye Contact Detection Using Unsupervised Gaze Target Discovery. Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology, pp. 193–203. DOI: 10.1145/3126594.3126614
