Research article
DOI: 10.1145/1344471.1344527

Remote gaze estimation with a single camera based on facial-feature tracking without special calibration actions

Published: 26 March 2008

Abstract

We propose a real-time gaze estimation method based on facial-feature tracking with a single video camera that requires no special user action for calibration. Many gaze estimation methods have already been proposed; however, most conventional gaze tracking algorithms can only be applied in experimental environments due to their complex calibration procedures and lack of usability. In this paper, we propose a gaze estimation method that can be applied in daily-life situations. Gaze directions are determined as 3D vectors connecting the eyeball and iris centers. Since the eyeball center and radius cannot be observed directly in images, the geometric relationship between the eyeball centers, the facial features, and the eyeball radius (the face/eye model) is calculated in advance. The 2D positions of the eyeball centers can then be determined by tracking the facial features. While conventional methods require users to perform special actions, such as looking at several reference points, during calibration, the proposed method requires no such actions; it is realized by combining 3D eye-model-based gaze estimation with a circle-based algorithm for eye-model calibration. Experimental results show that the gaze estimation accuracy of the proposed method is 5° horizontally and 7° vertically. The proposed method enables applications that require gaze information in daily-life situations, such as gaze-communicative robots and gaze-based interactive signboards.
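The eye-model formulation described in the abstract can be sketched in code. This is a minimal illustration under simplifying assumptions, not the authors' implementation: it assumes a weak-perspective camera, a precalibrated eyeball radius expressed in pixels, and that the 2D eyeball center has already been recovered by tracking facial features with the face/eye model; the function name and signature are hypothetical.

```python
import numpy as np

def gaze_from_eye_model(iris_center_2d, eyeball_center_2d, eyeball_radius_px):
    """Estimate gaze from the 2D iris and eyeball centers (weak perspective).

    The iris center is the projection of a point on a sphere of radius r
    around the eyeball center, so the image displacement (dx, dy) gives
    the gaze angles: sin(theta_h) = dx / r, sin(theta_v) = dy / r.
    """
    dx, dy = (np.asarray(iris_center_2d, dtype=float)
              - np.asarray(eyeball_center_2d, dtype=float))
    # Clip to [-1, 1] so tracking noise cannot push arcsin out of domain.
    sx = np.clip(dx / eyeball_radius_px, -1.0, 1.0)
    sy = np.clip(dy / eyeball_radius_px, -1.0, 1.0)
    theta_h = np.arcsin(sx)  # horizontal gaze angle (radians)
    theta_v = np.arcsin(sy)  # vertical gaze angle (radians)
    # Unit 3D gaze vector in the camera frame (negative z toward the camera).
    gaze_vec = np.array([sx, sy, -np.sqrt(max(0.0, 1.0 - sx**2 - sy**2))])
    return theta_h, theta_v, gaze_vec
```

For example, an iris center 10 px to the right of an eyeball center with a 20 px eyeball radius yields a horizontal gaze angle of arcsin(0.5) ≈ 30°. The clipping step is a design choice: with noisy facial-feature tracking the measured displacement can slightly exceed the eyeball radius, and clipping keeps the estimate defined rather than raising a domain error.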

References

[1]
Baluja, S., and Pomerleau, D. 1994. Non-intrusive gaze tracking using artificial neural networks. Tech. Rep. CMU-CS-94-102, CMU.
[2]
Ishikawa, T., Baker, S., Matthews, I., and Kanade, T. 2004. Passive driver gaze tracking with active appearance models. In Proc. 11th World Congress on Intelligent Transportation Systems.
[3]
Kawato, S., Tetsutani, N., and Hosaka, K. 2005. Scale-adaptive face detection and tracking in real time with SSR filters and support vector machine. IEICE Trans. on Info. and Sys. E88-D, 12, 2857--2863.
[4]
Lucas, B., and Kanade, T. 1981. An iterative image registration technique with an application to stereo vision. In Proc. Int'l Joint Conf. Artificial Intelligence, 674--679.
[5]
Matsumoto, Y., and Zelinsky, A. 2000. An algorithm for real-time stereo vision implementation of head pose and gaze direction measurement. In Proc. Int. Conf. Automatic Face and Gesture Recognition, 499--504.
[6]
Miyake, T., Haruta, S., and Horihata, S. 2002. Image based eye-gaze estimation irrespective of head direction. In Proc. IEEE Int. Symp. Industrial Electronics, vol. 1, 332--336.
[7]
Morency, L. P., Christoudias, C., and Darrell, T. 2006. Recognizing gaze aversion gestures in embodied conversational discourse. In Proc. ICMI'06, 287--294.
[8]
Morimoto, C. H., and Mimica, M. R. M. 2005. Eye gaze tracking techniques for interactive applications. Computer Vision and Image Understanding 98, 1, 4--24.
[9]
Ohno, T., Mukawa, N., and Kawato, S. 2003. Just blink your eyes: A head-free gaze tracking system. In Proc. CHI2003, 950--951.
[10]
Ono, Y., Okabe, T., and Sato, Y. 2006. Gaze estimation from low resolution images. In Proc. IEEE Pacific-Rim Symp. on Image and Video Technology (PSIVT'06), 178--188.
[11]
Poelman, C., and Kanade, T. 1992. A paraperspective factorization method for shape and motion recovery. Tech. Rep. CMU-CS-92-208, CMU.
[12]
Quan, L. 1996. Self-calibration of an affine camera from multiple views. Int'l Journal of Computer Vision 19, 93--105.
[13]
Schiele, B., and Waibel, A. 1995. Gaze tracking based on face-color. In Proc. Int'l Workshop on Automatic Face and Gesture Recognition, 344--349.
[14]
Takegami, T., Gotoh, T., and Ohyama, G. 2002. An algorithm for an eye tracking system with self-calibration. Systems and Computers in Japan 33, 10, 10--20.
[15]
Utsumi, A., Kawato, S., Susami, K., Kuwahara, N., and Kuwabara, K. 2004. Face-orientation detection and monitoring for networked interaction therapy. In Proc. of SCIS & ISIS 2004.
[16]
Wang, J., and Sung, E. 2001. Gaze determination via images of irises. Image and Vision Computing 19, 12, 891--911.
[17]
Wu, H., Chen, Q., and Wada, T. 2004. Conic-based algorithm for visual line estimation from image. In Proc. Automatic Face and Gesture Recognition, 260--265.
[18]
Yonezawa, T., Yamazoe, H., Utsumi, A., and Abe, S. 2007. Gaze-communicative behavior of stuffed-toy robot with joint attention and eye contact based on ambient gaze-tracking. 140--145.
[19]
Yonezawa, T., Yamazoe, H., Utsumi, A., and Abe, S. 2007. Gazecoppet: Hierarchical gaze-communication in ambient space.


Published In

ETRA '08: Proceedings of the 2008 symposium on Eye tracking research & applications
March 2008
285 pages
ISBN:9781595939821
DOI:10.1145/1344471

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. daily-life situations
  2. non-intrusive
  3. remote gaze tracking

Qualifiers

  • Research-article

Conference

ETRA '08
ETRA '08: Eye Tracking Research and Applications
March 26 - 28, 2008
Savannah, Georgia, USA

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%


Article Metrics

  • Downloads (Last 12 months)30
  • Downloads (Last 6 weeks)0
Reflects downloads up to 17 Feb 2025

Cited By
  • (2025)Gam360: sensing gaze activities of multi-persons in 360 degreesCCF Transactions on Pervasive Computing and Interaction10.1007/s42486-024-00168-7Online publication date: 6-Jan-2025
  • (2024)Appearance-Based Gaze Estimation With Deep Learning: A Review and BenchmarkIEEE Transactions on Pattern Analysis and Machine Intelligence10.1109/TPAMI.2024.339357146:12(7509-7528)Online publication date: Dec-2024
  • (2024)Gaze Estimation by Attention-Induced Hierarchical Variational Auto-EncoderIEEE Transactions on Cybernetics10.1109/TCYB.2023.331239254:4(2592-2605)Online publication date: Apr-2024
  • (2024)Fine-grained gaze estimation based on the combination of regression and classification lossesApplied Intelligence10.1007/s10489-024-05778-3Online publication date: 3-Sep-2024
  • (2023)Research on Gaze Estimation Method Combined with Head Motion ChangesInternational Journal of Advanced Network, Monitoring and Controls10.2478/ijanmc-2022-00407:4(89-96)Online publication date: 26-May-2023
  • (2023)LiteGaze: Neural architecture search for efficient gaze estimationPLOS ONE10.1371/journal.pone.028481418:5(e0284814)Online publication date: 1-May-2023
  • (2023)Continuous Gaze Tracking With Implicit Saliency-Aware Calibration on Mobile DevicesIEEE Transactions on Mobile Computing10.1109/TMC.2022.318513422:10(5816-5828)Online publication date: 1-Oct-2023
  • (2023)HAZE-Net: High-Frequency Attentive Super-Resolved Gaze Estimation in Low-Resolution Face ImagesComputer Vision – ACCV 202210.1007/978-3-031-26348-4_9(142-160)Online publication date: 9-Mar-2023
  • (2023)‘Labelling the Gaps’: A Weakly Supervised Automatic Eye Gaze EstimationComputer Vision – ACCV 202210.1007/978-3-031-26316-3_44(745-763)Online publication date: 2-Mar-2023
  • (2022)Neural 3D Gaze: 3D Pupil Localization and Gaze Tracking based on Anatomical Eye Model and Neural Refraction Correction2022 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)10.1109/ISMAR55827.2022.00053(375-383)Online publication date: Oct-2022
