DOI: 10.1145/2501988.2501998
Research article, UIST Conference Proceedings

Pursuit calibration: making gaze calibration less tedious and more flexible

Published: 08 October 2013

Abstract

Eye gaze is a compelling interaction modality but requires user calibration before interaction can commence. State-of-the-art procedures require the user to fixate on a succession of calibration markers, a task that is often experienced as difficult and tedious. We present pursuit calibration, a novel approach that, unlike existing methods, is able to detect the user's attention to a calibration target. This is achieved by using moving targets and correlating eye movement with the target trajectory, implicitly exploiting smooth pursuit eye movement. Calibration data are then sampled only when the user is attending to the target. Because of its ability to detect user attention, pursuit calibration can be performed implicitly, which enables more flexible designs of the calibration task. We demonstrate this in application examples and user studies, and show that pursuit calibration is tolerant to interruption, blends naturally with applications, and is able to calibrate users without their awareness.
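The attention test described in the abstract can be sketched as follows. Correlating uncalibrated gaze coordinates with the moving target's trajectory follows the authors' related Pursuits work; the function name, window size, and the 0.8 threshold below are illustrative assumptions, not parameters from the paper. Pearson correlation is invariant to affine scaling and offset, which is why it works on raw, not-yet-calibrated gaze data.

```python
import numpy as np

def pursuit_attention(gaze_x, gaze_y, tgt_x, tgt_y, threshold=0.8):
    """Return True if the gaze trajectory appears to follow the target.

    Correlates raw (uncalibrated) gaze coordinates with the moving
    calibration target over a window of samples; high correlation on
    both axes suggests the user is smoothly pursuing the target, so
    samples from this window can be used for calibration.
    """
    rx = np.corrcoef(gaze_x, tgt_x)[0, 1]
    ry = np.corrcoef(gaze_y, tgt_y)[0, 1]
    return bool(rx > threshold and ry > threshold)

# Simulated window: gaze tracks a target moving along a circle, with
# sensor noise and an unknown affine distortion (uncalibrated tracker).
t = np.linspace(0, 2 * np.pi, 120)
tx, ty = np.cos(t), np.sin(t)
rng = np.random.default_rng(0)
gx = 0.5 * tx + 3.0 + rng.normal(0, 0.02, t.size)  # scaled/offset gaze
gy = 0.5 * ty - 1.0 + rng.normal(0, 0.02, t.size)
print(pursuit_attention(gx, gy, tx, ty))  # correlation is high despite the offset
```

In the paper's setting this check would run over a sliding window, and only windows that pass it contribute gaze/target sample pairs to the calibration mapping.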

Supplementary Material

suppl.mov (uist147.mp4)
Supplemental video





    Published In

    UIST '13: Proceedings of the 26th annual ACM symposium on User interface software and technology
    October 2013
    558 pages
    ISBN:9781450322683
    DOI:10.1145/2501988


    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. eye gaze calibration
    2. eye tracking
    3. gaze interaction
    4. gaze interfaces
    5. smooth pursuit eye movement

    Qualifiers

    • Research-article

    Conference

    UIST '13: The 26th Annual ACM Symposium on User Interface Software and Technology
    October 8-11, 2013
    St. Andrews, Scotland, United Kingdom

    Acceptance Rates

    UIST '13 paper acceptance rate: 62 of 317 submissions (20%)
    Overall acceptance rate: 561 of 2,567 submissions (22%)


    Article Metrics

    • Downloads (last 12 months): 73
    • Downloads (last 6 weeks): 3
    Reflects downloads up to 10 Feb 2025

    Cited By
    • (2025) The fundamentals of eye tracking part 4: Tools for conducting an eye tracking study. Behavior Research Methods 57(1). DOI: 10.3758/s13428-024-02529-7. Online: 6-Jan-2025
    • (2024) CalibRead: Unobtrusive Eye Tracking Calibration from Natural Reading Behavior. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8(4), 1-30. DOI: 10.1145/3699737. Online: 21-Nov-2024
    • (2024) UnitEye: Introducing a User-Friendly Plugin to Democratize Eye Tracking Technology in Unity Environments. Proceedings of Mensch und Computer 2024, 1-10. DOI: 10.1145/3670653.3670655. Online: 1-Sep-2024
    • (2024) Where Do You Look When Unlocking Your Phone?: A Field Study of Gaze Behaviour During Smartphone Unlock. Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, 1-7. DOI: 10.1145/3613905.3651094. Online: 11-May-2024
    • (2024) A Meta-Bayesian Approach for Rapid Online Parametric Optimization for Wrist-based Interactions. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-38. DOI: 10.1145/3613904.3642071. Online: 11-May-2024
    • (2024) Filtering on the Go: Effect of Filters on Gaze Pointing Accuracy During Physical Locomotion in Extended Reality. IEEE Transactions on Visualization and Computer Graphics 30(11), 7234-7244. DOI: 10.1109/TVCG.2024.3456153. Online: 1-Nov-2024
    • (2024) Self-Calibrating Gaze Estimation With Optical Axes Projection for Head-Mounted Eye Tracking. IEEE Transactions on Industrial Informatics 20(2), 1397-1407. DOI: 10.1109/TII.2023.3276322. Online: Feb-2024
    • (2024) Eye-tracking AD: Cutting-Edge Web Advertising on Smartphone Aligned with User's Gaze. 2024 IEEE International Conference on Pervasive Computing and Communications Workshops and other Affiliated Events (PerCom Workshops), 469-474. DOI: 10.1109/PerComWorkshops59983.2024.10502602. Online: 11-Mar-2024
    • (2024) Optimising virtual object position for efficient eye-gaze interaction in Hololens2. Computer Methods in Biomechanics and Biomedical Engineering: Imaging & Visualization 12(1). DOI: 10.1080/21681163.2024.2337765. Online: 14-Apr-2024
    • (2024) Use of eye tracking in medical education. Medical Teacher 46(11), 1502-1509. DOI: 10.1080/0142159X.2024.2316863. Online: 21-Feb-2024
