ABSTRACT
In the study of eye movements during natural tasks, where subjects are free to move about their environment, it is desirable to capture video of the subject's surroundings that is not limited to the small field of view of an eye tracker's scene camera. Moreover, recovering the head movements can provide additional information about the type of eye movement that was carried out, the overall gaze change in world coordinates, and insight into higher-order perceptual strategies. Algorithms for the classification of eye movements in such natural tasks could also benefit from the additional head movement data.

We propose to use an omnidirectional vision sensor consisting of a small CCD video camera and a hyperbolic mirror. The camera is mounted on an ASL eye tracker and records an image sequence at 60 Hz. Several algorithms for the extraction of rotational motion from this image sequence were implemented, and their performance was compared against the measurements of a Fasttrack magnetic tracking system. Using data from the eye tracker together with the data obtained by the omnidirectional image sensor, a new algorithm for the classification of different types of eye movements, based on a hidden Markov model, was developed.
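To illustrate the hidden-Markov-model classification idea mentioned in the abstract, the following is a minimal sketch of Viterbi decoding over joint eye- and head-velocity observations. The state names, Gaussian emission parameters, and transition probabilities below are illustrative assumptions for the sketch, not the authors' actual model.

```python
import math

# Illustrative states; the paper's actual state set may differ.
STATES = ("fixation", "saccade", "pursuit")

# Assumed Gaussian emission parameters per state:
# ((mean_eye, std_eye), (mean_head, std_head)) in deg/s.
EMIT = {
    "fixation": ((1.0, 2.0), (1.0, 2.0)),
    "saccade":  ((200.0, 80.0), (5.0, 10.0)),
    "pursuit":  ((15.0, 10.0), (15.0, 10.0)),
}

# Assumed "sticky" transition log-probabilities and uniform prior.
TRANS = {s: {t: math.log(0.9) if s == t else math.log(0.05) for t in STATES}
         for s in STATES}
PRIOR = {s: math.log(1.0 / len(STATES)) for s in STATES}


def log_gauss(x, mu, sd):
    """Log density of a univariate Gaussian."""
    return -0.5 * ((x - mu) / sd) ** 2 - math.log(sd * math.sqrt(2 * math.pi))


def emission_logp(state, eye_v, head_v):
    # Eye and head velocities are treated as conditionally independent.
    (me, se), (mh, sh) = EMIT[state]
    return log_gauss(eye_v, me, se) + log_gauss(head_v, mh, sh)


def viterbi(observations):
    """Most likely state sequence for a list of (eye_vel, head_vel) samples."""
    # Each trellis column maps state -> (best log-prob, predecessor state).
    trellis = [{s: (PRIOR[s] + emission_logp(s, *observations[0]), None)
                for s in STATES}]
    for eye_v, head_v in observations[1:]:
        prev = trellis[-1]
        col = {}
        for s in STATES:
            best = max(STATES, key=lambda p: prev[p][0] + TRANS[p][s])
            col[s] = (prev[best][0] + TRANS[best][s]
                      + emission_logp(s, eye_v, head_v), best)
        trellis.append(col)
    # Backtrack from the best final state.
    state = max(STATES, key=lambda s: trellis[-1][s][0])
    path = [state]
    for col in reversed(trellis[1:]):
        state = col[state][1]
        path.append(state)
    return path[::-1]
```

With a synthetic sequence of slow samples interrupted by two high-eye-velocity samples, the decoder labels the burst as a saccade and the surrounding samples as fixation; the sticky transitions suppress single-sample label flicker, which is the main reason an HMM is preferred over per-sample velocity thresholds here.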
Index Terms
- Head movement estimation for wearable eye tracker