ABSTRACT
This work explores the combination of gaze and speech to interact with objects in the environment. A head-mounted wireless gaze tracker, in the form of gaze tracking glasses, is used for mobile monitoring of a subject's point of regard in the surrounding environment. In our proposed system, a mobile subject gazes at an object of interest in the environment, which opens an interaction window with the object being gazed upon; a specific interaction command is then given to the object using speech. The gaze tracking glasses were built from low-cost hardware consisting of a safety-glasses frame and wireless eye tracking and scene cameras. An open source gaze estimation algorithm is used for eye tracking and estimation of the user's gaze. The Windows Speech Recognition engine is used to recognize voice commands, and a visual marker recognition library identifies objects in the environment through the scene camera. Combining these elements, the resulting system permits a subject to move freely in an environment, select the object they want to interact with using gaze (identification), and transmit a control command to it by uttering a speech command (control).
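The two-stage pipeline described above (gaze selects an object, speech controls it) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: all names (`MarkerObject`, `select_by_gaze`, `dispatch_speech`) and the bounding-box hit test are hypothetical stand-ins for the actual gaze estimation, marker recognition, and Windows Speech Recognition components.

```python
from dataclasses import dataclass


@dataclass
class MarkerObject:
    """An environment object identified by a visual marker in the scene camera."""
    marker_id: int
    name: str
    commands: tuple  # speech commands this object accepts


def select_by_gaze(gaze_xy, markers):
    """Identification step: return the object whose marker bounding box
    contains the estimated 2D gaze point on the scene-camera image."""
    x, y = gaze_xy
    for (x0, y0, x1, y1), obj in markers:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return obj
    return None  # no object under gaze; no interaction window opens


def dispatch_speech(obj, utterance):
    """Control step: forward a recognized speech command to the gazed object."""
    if obj is not None and utterance in obj.commands:
        return f"{obj.name}: {utterance}"
    return None


# Example: the subject gazes at a lamp's marker, then says "turn on".
lamp = MarkerObject(marker_id=7, name="lamp", commands=("turn on", "turn off"))
markers = [((100, 100, 200, 200), lamp)]
selected = select_by_gaze((150, 120), markers)
print(dispatch_speech(selected, "turn on"))  # lamp: turn on
```

In the real system the gaze point comes from the glasses' eye camera via the gaze estimation algorithm, and the marker bounding boxes come from the visual marker library running on the scene-camera feed; this sketch only shows how identification and control are decoupled.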