DOI: 10.1145/2686612.2686676
Research article, OzCHI Conference Proceedings

Interacting with objects in the environment using gaze tracking glasses and speech

Published: 02 December 2014

ABSTRACT

This work explores the combination of gaze and speech to interact with objects in the environment. A head-mounted wireless gaze tracker in the form of gaze tracking glasses is used for mobile monitoring of a subject's point of regard in the surrounding environment. In the proposed system, a mobile subject gazes at an object of interest, which opens an interaction window with that object, and a specific interaction command is then issued to the object by speech. The gaze tracking glasses were built from low-cost hardware consisting of a safety-glasses frame and wireless eye tracking and scene cameras. An open source gaze estimation algorithm is used for eye tracking and estimation of the user's gaze. The Windows Speech Recognition engine is used for recognition of voice commands. A visual marker recognition library identifies objects in the environment through the scene camera. Combining these elements, the resulting system permits a subject to move freely in an environment, select the object they want to interact with using gaze (identification) and transmit a control command to it by uttering a speech command (control).
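The abstract describes a two-stage pipeline: identification (the gaze point, mapped into the scene-camera image, is matched against detected visual markers to select an object) and control (a recognized speech command is dispatched to the selected object). The Python sketch below illustrates that flow under stated assumptions: the gaze estimator, the marker recognition library and the Windows Speech Recognition engine are replaced by hypothetical stubs (get_gaze_point, detect_markers, listen_for_command), and the object names and command vocabulary are invented for illustration; the paper itself does not specify these interfaces.

```python
# Minimal sketch of the gaze-plus-speech interaction loop described in the
# abstract. The gaze tracker, scene-camera marker detector and speech
# recognizer are hypothetical stubs; only the selection and dispatch logic
# is illustrated.

from dataclasses import dataclass
from typing import Dict, List, Optional, Tuple


@dataclass
class Marker:
    """A visual marker detected in the scene-camera frame."""
    object_id: str                    # e.g. "lamp", "fan" (invented examples)
    bbox: Tuple[int, int, int, int]   # x, y, width, height in pixels


def get_gaze_point() -> Tuple[int, int]:
    """Hypothetical stand-in for the gaze estimator's point of regard,
    already mapped into scene-camera pixel coordinates."""
    return (320, 240)


def detect_markers() -> List[Marker]:
    """Hypothetical stand-in for the visual marker recognition library."""
    return [Marker("lamp", (280, 200, 100, 100)),
            Marker("fan", (500, 100, 80, 80))]


def listen_for_command() -> Optional[str]:
    """Hypothetical stand-in for the speech recognizer (the paper uses the
    Windows Speech Recognition engine, whose API is not reproduced here)."""
    return "turn on"


def select_gazed_object(gaze: Tuple[int, int],
                        markers: List[Marker]) -> Optional[Marker]:
    """Identification step: return the marker whose bounding box contains
    the current point of regard, if any."""
    gx, gy = gaze
    for m in markers:
        x, y, w, h = m.bbox
        if x <= gx <= x + w and y <= gy <= y + h:
            return m
    return None


# Control step: map recognized utterances to commands per object
# (vocabulary invented for illustration).
COMMANDS: Dict[str, Dict[str, str]] = {
    "lamp": {"turn on": "LAMP_ON", "turn off": "LAMP_OFF"},
    "fan":  {"turn on": "FAN_ON",  "turn off": "FAN_OFF"},
}


def interaction_step() -> None:
    target = select_gazed_object(get_gaze_point(), detect_markers())
    if target is None:
        return  # no object under gaze; stay idle
    utterance = listen_for_command()
    command = COMMANDS.get(target.object_id, {}).get(utterance or "")
    if command:
        print(f"sending {command} to {target.object_id}")  # transmit to the device


if __name__ == "__main__":
    interaction_step()
```

The selection test here is a simple point-in-bounding-box check; a real implementation would also need to handle gaze estimation noise, for example by dwell-time filtering or by choosing the nearest marker within a tolerance.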



Published in

OzCHI '14: Proceedings of the 26th Australian Computer-Human Interaction Conference on Designing Futures: the Future of Design
December 2014, 689 pages
ISBN: 9781450306539
DOI: 10.1145/2686612
Conference Chair: Tuck Leong
        Copyright © 2014 ACM

        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States




Acceptance Rates

OzCHI '14 paper acceptance rate: 85 of 176 submissions (48%). Overall acceptance rate: 362 of 729 submissions (50%).
