DOI: 10.1145/1028630.1028647
Article

EyeDraw: a system for drawing pictures with eye movements

Published: 01 September 2003

Abstract

This paper describes the design and development of EyeDraw, a software program that will enable children with severe mobility impairments to use an eye tracker to draw pictures with their eyes so that they can have the same creative developmental experiences as nondisabled children. EyeDraw incorporates computer-control and software application advances that address the special needs of people with motor impairments, with emphasis on the needs of children. The contributions of the project include (a) a new technique for using the eyes to control the computer when accomplishing a spatial task, (b) the crafting of task-relevant functionality to support this new technique in its application to drawing pictures, and (c) a user-tested implementation of the idea within a working computer program. User testing with nondisabled users suggests that we have designed and built an eye-cursor and eye drawing control system that can be used by almost anyone with normal control of their eyes. The core technique will be generally useful for a range of computer control tasks such as selecting a group of icons on the desktop by drawing a box around them.
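
A common way to turn raw gaze data into the kind of explicit eye control the abstract describes is dwell detection: a command (such as starting or stopping a pen stroke) is issued only after the gaze has stayed within a small region for a fixed amount of time, so that ordinary looking is not mistaken for an instruction to draw. The Python sketch below illustrates that general idea under stated assumptions; the thresholds, class names, and gaze-sample format are illustrative and are not taken from EyeDraw's implementation.

    # Minimal sketch of a dwell-based gaze control loop, in the spirit of an
    # eye-drawing mode switch. All names, thresholds, and the sample format
    # are illustrative assumptions, not EyeDraw's actual code.

    from dataclasses import dataclass
    from math import hypot

    @dataclass
    class GazeSample:
        x: float   # screen coordinate in pixels
        y: float
        t: float   # timestamp in seconds

    DWELL_RADIUS_PX = 30.0   # how still the gaze must stay to count as dwelling
    DWELL_TIME_S = 0.5       # how long the dwell must last to toggle drawing

    def detect_dwells(samples):
        """Yield (x, y, t) each time the gaze dwells long enough to toggle a mode."""
        anchor = None
        for s in samples:
            if anchor is None or hypot(s.x - anchor.x, s.y - anchor.y) > DWELL_RADIUS_PX:
                anchor = s                       # gaze moved: restart the dwell timer
            elif s.t - anchor.t >= DWELL_TIME_S:
                yield (anchor.x, anchor.y, s.t)  # dwell detected: toggle look/draw here
                anchor = None                    # require a fresh dwell for the next toggle

    # Example: a synthetic fixation near (100, 100) followed by a brief saccade away.
    if __name__ == "__main__":
        stream = [GazeSample(100 + i % 3, 100, i * 0.05) for i in range(20)]
        stream += [GazeSample(400, 300, 1.0 + i * 0.05) for i in range(5)]
        for event in detect_dwells(stream):
            print("toggle drawing mode at", event)

Running the script prints a single mode toggle for the synthetic fixation and none for the brief saccade, which is the separation between inspecting the canvas and intending to draw that an eye-only drawing interface needs.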

Published In

Assets '04: Proceedings of the 6th international ACM SIGACCESS conference on Computers and accessibility
October 2004
202 pages
ISBN: 158113911X
DOI: 10.1145/1028630

ACM SIGACCESS Accessibility and Computing
Sept. 2003 - Jan. 2004
192 pages
EISSN: 1558-1187
DOI: 10.1145/1029014

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. art
  2. children
  3. drawing
  4. eye tracking
  5. input devices
  6. interaction techniques
  7. universal access

Conference

ASSETS04

Acceptance Rates

Assets '04 Paper Acceptance Rate: 25 of 47 submissions, 53%
Overall Acceptance Rate: 436 of 1,556 submissions, 28%

Cited By

  • (2024) Pro-Tact: Hierarchical Synthesis of Proprioception and Tactile Exploration for Eyes-Free Ray Pointing on Out-of-View VR Menus. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1-11. DOI: 10.1145/3654777.3676324. Online publication date: 13-Oct-2024.
  • (2024) Eye Strokes: An Eye-gaze Drawing System for Mandarin Characters. Proceedings of the ACM on Computer Graphics and Interactive Techniques, 7(2), 1-15. DOI: 10.1145/3654702. Online publication date: 17-May-2024.
  • (2022) Preliminary testing of eye gaze interfaces for controlling a haptic system intended to support play in children with physical impairments: Attentive versus explicit interfaces. Journal of Rehabilitation and Assistive Technologies Engineering, 9. DOI: 10.1177/20556683221079694. Online publication date: 28-Feb-2022.
  • (2022) Methodological Standards in Accessibility Research on Motor Impairments: A Survey. ACM Computing Surveys, 55(7), 1-35. DOI: 10.1145/3543509. Online publication date: 15-Dec-2022.
  • (2022) Lattice Menu: A Low-Error Gaze-Based Marking Menu Utilizing Target-Assisted Gaze Gestures on a Lattice of Visual Anchors. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-12. DOI: 10.1145/3491102.3501977. Online publication date: 29-Apr-2022.
  • (2019) Inducing gaze gestures by static illustrations. Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications, 1-5. DOI: 10.1145/3317956.3318151. Online publication date: 25-Jun-2019.
  • (2019) Wavelet Method for Automatic Detection of Eye-Movement Behaviors. IEEE Sensors Journal, 19(8), 3085-3091. DOI: 10.1109/JSEN.2018.2876940. Online publication date: 15-Apr-2019.
  • (2018) A comparison of eye-head coordination between virtual and physical realities. Proceedings of the 15th ACM Symposium on Applied Perception, 1-7. DOI: 10.1145/3225153.3225157. Online publication date: 10-Aug-2018.
  • (2016) EV: A First Version of the Surface Evolver with a Human-Computer Interface. Journal of Software, 11(11), 1073-1082. DOI: 10.17706/jsw.11.11.1073-1082. Online publication date: Nov-2016.
  • (2015) Human-computer interfaces applied to numerical solution of the Plateau problem. Journal of Physics: Conference Series, 633, 012130. DOI: 10.1088/1742-6596/633/1/012130. Online publication date: 21-Sep-2015.
