DOI: 10.1145/1117309.1117311

Communication through eye-gaze: where we have been, where we are now and where we can go from here

Published: 27 March 2006

Abstract

Throughout the history of gaze tracking, there have been several dimensions along which the evolution of gaze-based communication can be viewed. Perhaps the most important of these is the ease of use, or usability, of systems incorporating eye tracking. Usable communication through eye gaze has been a goal for many years and offers the prospect of effortless and fast communication for able-bodied and disabled users alike. To date, such communication has been hampered by a number of problems limiting its widespread uptake. Systems have evolved over time and can provide effective means of communication within restricted bounds, but these are typically incompatible and limited to a few application areas, and each has suffered from particular usability problems. As a consequence, uptake remains low and the cost of individual eye-tracking systems remains high. However, more is being understood and published about the usability requirements for eye-gaze communication systems, particularly for users with different types of disability. With the advance of research and technology we are now seeing genuinely usable systems that can be used for a broad range of applications and, with this, the prospect of much wider acceptance of gaze as a means of communication.

A second dimension is how we can utilise our communication through eye gaze. Much work has been undertaken addressing the nature of control and the concepts of active and passive control, or command-based and non-command-based interaction. Active control, the giving of commands by eye to on-screen keyboards and other control interfaces, is now well known and has led to greatly improved usability, via compensation for the limitations of eye trackers as a data source and by providing predictive and corrective capabilities in the user interface. Passive monitoring of gaze position leads to the notion of gaze-aware objects, which are capable of responding to user attention in a way appropriate to the specific task context. Early work by Starker and Bolt [1990], for example, assigned objects in a virtual world gaze-based indices of interest, where control was mediated by system evaluation of user interest without the need for active user control. By employing these concepts, current gaze control systems have achieved acceptable ease of use by making on-screen objects gaze-aware, allowing compensation for tracking and manipulation inaccuracies. Gaze-aware interaction is now migrating from the confines of the desktop to the user's task space in the real world within a ubiquitous computing context. Instead of attempting to track gaze position in world space relative to the user, with the many difficulties this presents in inaccuracy and encumbering equipment, gaze tracking can be moved to ubiquitous objects in the real world. Visual contact with, and manipulation of, gaze-aware instrumented objects is now possible by equipping objects with eye-contact sensors that detect infra-red corneal reflection from users looking at them. Alternatively, objects can be equipped with infra-red emitters, and detection of their corneal reflections can be moved to low-cost head-mounted cameras worn by the user. These two approaches to visual contact detection parallel desk-mounted and head-mounted eye-tracking systems.

In the future, we can expect very real benefits from gaze-based communication in a wider set of task domains as ubiquitous systems become more able to make informed decisions about the intent of a user. Such systems will finally liberate eye-gaze communication from the confines of the laboratory and the desktop to the real world, enabling low-cost communication through gaze to be available for all.

This paper gratefully acknowledges the support of the COGAIN (Communication by Gaze Interaction) European Commission FP6 Network of Excellence project.
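To make the notion of a gaze-aware object concrete, the following is a minimal sketch, not taken from the paper: the GazeAwareObject class, its dwell and smoothing parameters, and all threshold values are illustrative assumptions. It shows one common way an on-screen object can respond to sustained attention: raw gaze samples are smoothed to damp tracker jitter, and the object activates only after the gaze has dwelt within a tolerance radius for a fixed time, which is one way of compensating for the tracking inaccuracies discussed above.

import math
import time

class GazeAwareObject:
    """Illustrative gaze-aware target that activates on sustained dwell."""

    def __init__(self, x, y, radius=60, dwell_time=0.8, smoothing=0.3):
        self.x, self.y = x, y          # object centre in screen pixels
        self.radius = radius           # tolerance radius in pixels (assumed value)
        self.dwell_time = dwell_time   # seconds of sustained gaze to activate (assumed value)
        self.smoothing = smoothing     # exponential smoothing factor for gaze samples
        self._sx = self._sy = None     # smoothed gaze position
        self._enter_time = None        # when the smoothed gaze entered the object

    def update(self, gaze_x, gaze_y, now=None):
        """Feed one gaze sample; return True once the dwell threshold is reached."""
        now = time.monotonic() if now is None else now

        # Exponentially smooth raw gaze samples to damp eye-tracker jitter.
        if self._sx is None:
            self._sx, self._sy = gaze_x, gaze_y
        else:
            a = self.smoothing
            self._sx = a * gaze_x + (1 - a) * self._sx
            self._sy = a * gaze_y + (1 - a) * self._sy

        inside = math.hypot(self._sx - self.x, self._sy - self.y) <= self.radius
        if not inside:
            self._enter_time = None
            return False
        if self._enter_time is None:
            self._enter_time = now
        return (now - self._enter_time) >= self.dwell_time

# Example: feed synthetic gaze samples at ~60 Hz aimed near the object.
button = GazeAwareObject(x=400, y=300)
t = 0.0
for _ in range(120):
    if button.update(405, 298, now=t):
        print(f"activated after {t:.2f} s of dwell")
        break
    t += 1 / 60

The same dwell-plus-tolerance idea applies whether the "object" is an on-screen key or a physical object instrumented with an eye-contact sensor; only the source of the gaze samples changes.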

Reference

[1]
Starker, I. and Bolt, R. A. 1990. A Gaze-Responsive Self-Disclosing Display. In Human Factors in Computing Systems: CHI '90 Conference Proceedings. ACM Press, 3-9.

Published In

ETRA '06: Proceedings of the 2006 symposium on Eye tracking research & applications
March 2006
175 pages
ISBN:1595933050
DOI:10.1145/1117309
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Qualifiers

  • Article

Conference

ETRA06
ETRA06: Eye Tracking Research and Applications
March 27 - 29, 2006
San Diego, California

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%
