ABSTRACT
Handheld mobile devices with touch screens are widely used but are awkward to operate with one hand. To address this problem, we propose MobiGaze, a user interface that lets the user operate a handheld mobile device by gaze. Stereo cameras detect the user's line of sight in 3D, enabling the user to interact with the mobile device by gaze alone. We have built a prototype of MobiGaze consisting of two cameras with IR-LEDs, a Windows-based notebook PC, and an iPod touch, and we have developed several applications for it.
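The core geometric step implied by the abstract is intersecting the estimated 3D gaze ray with the plane of the device's screen to obtain an on-screen gaze point. The following is a minimal sketch of that ray-plane intersection, not code from the MobiGaze system; all names and the coordinate setup are hypothetical.

```python
import math

def gaze_point_on_screen(eye, direction, plane_point, normal):
    """Intersect a 3D gaze ray with a screen plane.

    eye:         3D position of the eye (ray origin)
    direction:   gaze direction vector (need not be normalized)
    plane_point: any point lying on the screen plane
    normal:      the screen plane's normal vector (unit length)
    Returns the 3D gaze point on the screen, or None if the gaze
    is parallel to the screen or the screen is behind the eye.
    """
    # Normalize the gaze direction.
    length = math.sqrt(sum(c * c for c in direction))
    d = tuple(c / length for c in direction)

    # Ray-plane intersection: solve (eye + t*d - plane_point) . normal = 0.
    denom = sum(n * c for n, c in zip(normal, d))
    if abs(denom) < 1e-9:
        return None  # gaze parallel to the screen plane
    t = sum(n * (p - e) for n, p, e in zip(normal, plane_point, eye)) / denom
    if t < 0:
        return None  # intersection behind the eye
    return tuple(e + t * c for e, c in zip(eye, d))

# Example: eye 30 cm in front of a screen lying in the z = 0 plane,
# looking slightly to the right.
point = gaze_point_on_screen(
    eye=(0.0, 0.0, 0.3),
    direction=(0.1, 0.0, -1.0),
    plane_point=(0.0, 0.0, 0.0),
    normal=(0.0, 0.0, 1.0),
)
# point is (0.03, 0.0, 0.0): 3 cm to the right of the screen center
```

Mapping the resulting 3D point into screen pixel coordinates would additionally require the device's pose and screen dimensions, which the prototype presumably obtains from its stereo camera setup.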
MobiGaze: development of a gaze interface for handheld mobile devices