DOI: 10.1145/1056808.1057041
Article

Media eyepliances: using eye tracking for remote control focus selection of appliances

Published: 02 April 2005

Abstract

This paper discusses the use of eye contact sensing for focus selection operations in remote controlled media appliances. Focus selection with remote controls tends to be cumbersome as selection buttons place the remote in a device-specific modality. We addressed this issue with the design of Media EyePliances, home theatre appliances augmented with a digital eye contact sensor. An appliance is selected as the focus of remote commands by looking at its sensor. A central server subsequently routes all commands provided by remote, keyboard or voice input to the focus EyePliance. We discuss a calibration-free digital eye contact sensing technique that allows Media EyePliances to determine the user's point of gaze.
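The paper does not include source code; the sketch below is a minimal, hypothetical illustration of the routing scheme the abstract describes: each appliance's eye contact sensor reports when it is being looked at, and a central server forwards all subsequent commands (remote, keyboard, or voice) to whichever appliance last held the user's gaze. All class and method names here are illustrative assumptions, not the authors' API.

```python
# Hypothetical sketch of the Media EyePliances routing scheme: an eye
# contact event selects the focus appliance, and a central server routes
# every later command to it. Names are illustrative, not from the paper.

class EyePliance:
    def __init__(self, name):
        self.name = name
        self.commands = []          # commands this appliance has received

    def handle(self, command):
        self.commands.append(command)


class FocusServer:
    def __init__(self, appliances):
        self.appliances = {a.name: a for a in appliances}
        self.focus = None           # appliance the user last looked at

    def on_eye_contact(self, name):
        # Called when an appliance's eye contact sensor detects gaze.
        self.focus = self.appliances[name]

    def dispatch(self, command):
        # Route any input (remote, keyboard, voice) to the focus appliance.
        if self.focus is not None:
            self.focus.handle(command)


tv = EyePliance("tv")
stereo = EyePliance("stereo")
server = FocusServer([tv, stereo])

server.on_eye_contact("stereo")   # user looks at the stereo...
server.dispatch("volume_up")      # ...so the remote press goes to it
server.on_eye_contact("tv")       # a glance shifts focus to the TV
server.dispatch("channel_up")
```

The key design point, as described in the abstract, is that focus selection is decoupled from command input: the remote stays modeless, and the gaze sensor alone decides which device a button press addresses.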


Cited By

  • ConeSpeech: Exploring Directional Speech Interaction for Multi-Person Remote Communication in Virtual Reality. IEEE Transactions on Visualization and Computer Graphics 29(5), 2023, pp. 2647-2657. DOI: 10.1109/TVCG.2023.3247085
  • FaceOri: Tracking Head Position and Orientation Using Ultrasonic Ranging on Earphones. In Proc. CHI 2022, pp. 1-12. DOI: 10.1145/3491102.3517698
  • Listen Only When Spoken To: Interpersonal Communication Cues as Smart Speaker Privacy Controls. Proceedings on Privacy Enhancing Technologies 2020(2), pp. 251-270. DOI: 10.2478/popets-2020-0026
  • iWink: Exploring Eyelid Gestures on Mobile Devices. In Proc. of the 1st International Workshop on Human-centric Multimedia Analysis, 2020, pp. 83-89. DOI: 10.1145/3422852.3423479
  • GazeConduits: Calibration-Free Cross-Device Collaboration through Gaze and Touch. In Proc. CHI 2020, pp. 1-10. DOI: 10.1145/3313831.3376578
  • BlyncSync: Enabling Multimodal Smartwatch Gestures with Synchronous Touch and Blink. In Proc. CHI 2020, pp. 1-14. DOI: 10.1145/3313831.3376132
  • Assisted Music Score Reading Using Fixed-Gaze Head Movement. Proceedings of the ACM on Human-Computer Interaction 3(EICS), 2019, pp. 1-29. DOI: 10.1145/3300962
  • Motion Correlation. ACM Transactions on Computer-Human Interaction 24(3), 2017, pp. 1-35. DOI: 10.1145/3064937
  • AmbiGaze. In Proc. of the 2016 ACM Conference on Designing Interactive Systems, pp. 812-817. DOI: 10.1145/2901790.2901867
  • Combining Direct and Indirect Touch Input for Interactive Workspaces using Gaze Input. In Proc. of the 3rd ACM Symposium on Spatial User Interaction, 2015, pp. 79-88. DOI: 10.1145/2788940.2788949


    Published In

    CHI EA '05: CHI '05 Extended Abstracts on Human Factors in Computing Systems
    April 2005
    1358 pages
    ISBN: 1595930027
    DOI: 10.1145/1056808


    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. attentive user interfaces
    2. eye tracking
    3. focus selection
    4. input devices

    Qualifiers

    • Article


    Acceptance Rates

    Overall Acceptance Rate: 6,164 of 23,696 submissions (26%)



