DOI: 10.1145/1095034.1095050

eyeLook: using attention to facilitate mobile media consumption

Published: 23 October 2005

Abstract

One of the problems with mobile media devices is that they may distract users during critical everyday tasks, such as navigating the streets of a busy city. We addressed this issue in the design of eyeLook: a platform for attention-sensitive mobile computing. eyeLook appliances use embedded low-cost eyeCONTACT sensors (ECS) to detect when the user looks at the display. We discuss two eyeLook applications, seeTV and seeTXT, that facilitate courteous media consumption in mobile contexts by using the ECS to respond to user attention. seeTV is an attentive mobile video player that automatically pauses content when the user is not looking. seeTXT is an attentive speed reading application that flashes words on the display, advancing text only when the user is looking. By making mobile media devices sensitive to actual user attention, eyeLook allows applications to gracefully transition users between consuming media and managing life.
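To make the attention-gated behavior described above concrete, the sketch below shows how a seeTV-style playback loop might be wired up. This is a minimal illustration in Python, not the authors' implementation: EyeContactSensor, MediaPlayer, attention_gated_playback, and poll_interval are hypothetical names, and the sensor stub stands in for the actual eyeCONTACT sensor (ECS) hardware.

import time


class EyeContactSensor:
    """Hypothetical stand-in for the eyeCONTACT sensor (ECS)."""

    def user_is_looking(self) -> bool:
        # A real implementation would poll the ECS hardware here;
        # this stub simply reports constant attention.
        return True


class MediaPlayer:
    """Hypothetical video player exposing pause/resume control."""

    def __init__(self) -> None:
        self.playing = False

    def play(self) -> None:
        if not self.playing:
            self.playing = True
            print("resuming playback")

    def pause(self) -> None:
        if self.playing:
            self.playing = False
            print("pausing playback")


def attention_gated_playback(sensor: EyeContactSensor,
                             player: MediaPlayer,
                             poll_interval: float = 0.2) -> None:
    """Pause content whenever the user looks away; resume on renewed eye contact."""
    while True:
        if sensor.user_is_looking():
            player.play()
        else:
            player.pause()
        time.sleep(poll_interval)


A seeTXT-style reader could be gated the same way, advancing the flashed text by one word per interval only while user_is_looking() returns True.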

    Published In

    UIST '05: Proceedings of the 18th Annual ACM Symposium on User Interface Software and Technology
    October 2005
    270 pages
    ISBN: 1595932712
    DOI: 10.1145/1095034
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 23 October 2005


    Author Tags

    1. attentive user interfaces
    2. context-aware computing
    3. eye tracking
    4. mobile computing
    5. ubiquitous computing

    Qualifiers

    • Article

    Conference

    UIST05

    Acceptance Rates

    UIST '05 Paper Acceptance Rate: 31 of 159 submissions, 19%
    Overall Acceptance Rate: 561 of 2,567 submissions, 22%

    Cited By

    • (2023) Comparing Dwell time, Pursuits and Gaze Gestures for Gaze Interaction on Handheld Mobile Devices. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, pp. 1-17. DOI: 10.1145/3544548.3580871. Online publication date: 19-Apr-2023.
    • (2023) Reading and Walking with Smart Glasses: Effects of Display and Control Modes on Safety. International Journal of Human–Computer Interaction, 40(23), pp. 7875-7891. DOI: 10.1080/10447318.2023.2276529. Online publication date: 7-Nov-2023.
    • (2022) Multimodal Natural Human–Computer Interfaces for Computer-Aided Design: A Review Paper. Applied Sciences, 12(13), 6510. DOI: 10.3390/app12136510. Online publication date: 27-Jun-2022.
    • (2020) Quantification of Users' Visual Attention During Everyday Mobile Device Interactions. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, pp. 1-14. DOI: 10.1145/3313831.3376449. Online publication date: 21-Apr-2020.
    • (2020) SaferCross: Enhancing Pedestrian Safety Using Embedded Sensors of Smartphone. IEEE Access, 8, pp. 49657-49670. DOI: 10.1109/ACCESS.2020.2980085. Online publication date: 2020.
    • (2018) Accurate Model-Based Point of Gaze Estimation on Mobile Devices. Vision, 2(3), 35. DOI: 10.3390/vision2030035. Online publication date: 24-Aug-2018.
    • (2018) Towards Attentive Speed Reading on Small Screen Wearable Devices. Proceedings of the 20th ACM International Conference on Multimodal Interaction, pp. 278-287. DOI: 10.1145/3242969.3243009. Online publication date: 2-Oct-2018.
    • (2018) The past, present, and future of gaze-enabled handheld mobile devices. Proceedings of the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services, pp. 1-17. DOI: 10.1145/3229434.3229452. Online publication date: 3-Sep-2018.
    • (2018) Chapter 2. Eye gaze as a cue for recognizing intention and coordinating joint action. Eye-tracking in Interaction, pp. 21-46. DOI: 10.1075/ais.10.02ama. Online publication date: 23-Oct-2018.
    • (2017) SmartRSVP. Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems, pp. 1640-1647. DOI: 10.1145/3027063.3053176. Online publication date: 6-May-2017.
