
Imaginary interfaces: spatial interaction with empty hands and without visual feedback

Published: 03 October 2010

Abstract

Screen-less wearable devices allow for the smallest form factor and thus the maximum mobility. Current screen-less devices, however, support only buttons and gestures; pointing is not supported because users have nothing to point at. We challenge the notion that spatial interaction requires a screen and propose a method for bringing spatial interaction to screen-less devices.
We present Imaginary Interfaces, screen-less devices that allow users to perform spatial interaction with empty hands and without visual feedback. Unlike projection-based solutions, such as Sixth Sense, all visual "feedback" takes place in the user's imagination. Users define the origin of an imaginary space by forming an L-shaped coordinate cross with their non-dominant hand. Users then point and draw with their dominant hand in the resulting space.
With three user studies we investigate the question: To what extent can users interact spatially with a user interface that exists only in their imagination? Participants created simple drawings, annotated existing drawings, and pointed at locations described in imaginary space. Our findings suggest that users' visual short-term memory can, in part, replace the feedback conventionally displayed on a screen.
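The core interaction can be read as a coordinate transform: the non-dominant hand's L-pose defines a 2D frame (thumb as x-axis, index finger as y-axis), and the dominant hand's fingertip is expressed in that frame. As a minimal illustrative sketch only, not the authors' implementation, the mapping could look like this (the function name and the 2D point representation are assumptions):

```python
def to_imaginary_coords(origin, thumb_tip, index_tip, pointer):
    """Express a tracked dominant-hand point in the frame defined by
    the non-dominant hand's L-shaped thumb/index coordinate cross.

    origin: corner of the L (thumb/index junction), (x, y)
    thumb_tip, index_tip: fingertip positions defining the two axes
    pointer: dominant-hand fingertip position
    Returns (u, v), where (1, 0) lies at the thumb tip and (0, 1)
    at the index tip.
    """
    # Basis vectors of the imaginary frame
    ax, ay = thumb_tip[0] - origin[0], thumb_tip[1] - origin[1]
    bx, by = index_tip[0] - origin[0], index_tip[1] - origin[1]
    # Pointer relative to the frame origin
    px, py = pointer[0] - origin[0], pointer[1] - origin[1]
    det = ax * by - ay * bx  # zero when thumb and index are collinear
    if det == 0:
        raise ValueError("degenerate L-pose: axes are collinear")
    # Solve the 2x2 system [a b] @ (u, v) = p by Cramer's rule
    u = (px * by - py * bx) / det
    v = (ax * py - ay * px) / det
    return u, v

# Pointing at the center of the quadrant spanned by the L-pose:
print(to_imaginary_coords((0, 0), (10, 0), (0, 10), (5, 5)))  # (0.5, 0.5)
```

Because the frame travels with the non-dominant hand, points expressed this way stay stable even when both hands move together, which is consistent with the bimanual frame of reference discussed in the references.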

References

[1] Ashbrook, D. Enabling Mobile Microinteractions. PhD Thesis (2010).
[2] Baddeley, A. D. and Hitch, G. Working memory. In G. H. Bower (Ed.), The Psychology of Learning and Motivation: Advances in Research and Theory 8 (1974), 47--89.
[3] Baudel, T. and Beaudouin-Lafon, M. Charade: remote control of objects using free-hand gestures. CACM 36, 7 (1993), 28--35.
[4] Baudisch, P. and Chu, G. Back-of-device interaction allows creating very small touch devices. In Proc. CHI (2009), 1923--1932.
[5] Billinghurst, M., Bowskill, J., Dyer, N., and Morphett, J. Spatial information displays on a wearable computer. IEEE CG&A 18 (1998), 24--30.
[6] Butler, A., Izadi, S., and Hodges, S. SideSight: multi-"touch" interaction around small devices. In Proc. UIST (2008), 201--204.
[7] Couvreur, J. Pen Stroke Recognition. Graffiti code download at http://blog.monstuff.com/archives/000012.htm.
[8] Emmorey, K., Tversky, B., and Taylor, H. A. Using space to describe space: perspective in speech, sign, and gesture. Spatial Cognition and Computation 2, 3 (2000), 157--180.
[9] Fitzmaurice, G. W. Situated information spaces and spatially aware palmtop computers. CACM 36 (1993), 39--49.
[10] Fukumoto, M. and Tonomura, Y. Body-coupled FingerRing: wireless wearable keyboard. In Proc. CHI (1997), 147--154.
[11] Graffiti. http://en.wikipedia.org/wiki/Graffiti_(Palm_OS)
[12] Guiard, Y. Asymmetric division of labor in human skilled bimanual action: the kinematic chain as a model. Journal of Motor Behavior 19, 4 (1987), 486--517.
[13] Harrison, C. and Hudson, S. E. Abracadabra: wireless, high-precision, and unpowered finger input for very small mobile devices. In Proc. UIST (2009), 121--124.
[14] Harrison, C. and Hudson, S. Minput: enabling interaction on small mobile devices with high-precision, low-cost, multipoint optical tracking. In Proc. CHI (2010), 1661--1664.
[15] Harrison, C., Tan, D., and Morris, D. Skinput: appropriating the body as an input surface. In Proc. CHI (2010), 453--462.
[16] Hinckley, K., Pausch, R., and Proffitt, D. Attention and visual feedback: the bimanual frame of reference. In Proc. Symposium on Interactive 3D Graphics (1997), 121--126.
[17] Holz, C. and Baudisch, P. The generalized perceived input point model and how to double touch accuracy by extracting fingerprints. In Proc. CHI (2010), 581--590.
[18] Jacob, R. J., Girouard, A., Hirshfield, L. M., Horn, M. S., Shaer, O., Solovey, E. T., and Zigelbaum, J. Reality-based interaction: a framework for post-WIMP interfaces. In Proc. CHI (2008), 201--210.
[19] Kratz, S. and Rohs, M. HoverFlow: expanding the design space of around-device interaction. In Proc. MobileHCI (2009), 1--8.
[20] Li, F. C. Y., Dearman, D., and Truong, K. N. Virtual Shelves: interactions with orientation aware devices. In Proc. UIST (2009), 125--128.
[21] MacKenzie, I. S. and Zhang, S. X. The immediate usability of Graffiti. In Proc. GI (1997), 129--137.
[22] Mistry, P., Maes, P., and Chang, L. WUW - Wear Ur World: a wearable gestural interface. In CHI Extended Abstracts (2009), 4111--4116.
[23] Ni, T. and Baudisch, P. Disappearing mobile devices. In Proc. UIST (2009), 101--110.
[24] Rekimoto, J. GestureWrist and GesturePad: unobtrusive wearable interaction devices. In Proc. ISWC (2001), 21--27.
[25] Starner, T., Auxier, J., Ashbrook, D., and Gandy, M. The Gesture Pendant: a self-illuminating, wearable, infrared computer vision system for home automation control and medical monitoring. In Proc. ISWC (2000), 87--94.
[26] Strachan, S., Murray-Smith, R., and O'Modhrain, S. BodySpace: inferring body pose for natural control of a music player. In CHI Extended Abstracts (2007), 2001--2006.
[27] Sturman, D. J. and Zeltzer, D. A design method for "whole-hand" human-computer interaction. TOIS 11, 3 (1993), 219--238.
[28] Tamaki, E., Miyaki, T., and Rekimoto, J. Brainy hand: an ear-worn hand gesture interaction device. In CHI Extended Abstracts (2009), 4255--4260.
[29] Wilson, A. D. and Oliver, N. GWindows: robust stereo vision for gesture-based control of windows. In Proc. ICMI (2003), 211--218.
[30] Wilson, A. D. Robust computer vision-based detection of pinching for one and two-handed gesture input. In Proc. UIST (2006), 255--258.
[31] Yee, K. Peephole Displays: pen interaction on spatially aware handheld computers. In Proc. CHI (2003), 1--8.



    Published In

    UIST '10: Proceedings of the 23rd annual ACM symposium on User interface software and technology
    October 2010
    476 pages
    ISBN:9781450302715
    DOI:10.1145/1866029

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. bimanual
    2. computer vision
    3. gesture
    4. memory
    5. mobile
    6. screen-less
    7. spatial
    8. wearable

    Qualifiers

    • Research-article

    Conference

    UIST '10

    Acceptance Rates

    Overall Acceptance Rate 561 of 2,567 submissions, 22%



    Article Metrics

    • Downloads (Last 12 months)97
    • Downloads (Last 6 weeks)7
    Reflects downloads up to 12 Feb 2025

    Cited By

    • (2024) Investigating Creation Perspectives and Icon Placement Preferences for On-Body Menus in Virtual Reality. Proceedings of the ACM on Human-Computer Interaction 8(ISS), 236--254. DOI: 10.1145/3698136
    • (2024) Exploring Uni-manual Around Ear Off-Device Gestures for Earables. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8(1), 1--29. DOI: 10.1145/3643513
    • (2024) Reflected Reality. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7(4), 1--28. DOI: 10.1145/3631431
    • (2024) vARitouch: Back of the Finger Device for Adding Variable Compliance to Rigid Objects. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1--20. DOI: 10.1145/3613904.3642828
    • (2024) MAF: Exploring Mobile Acoustic Field for Hand-to-Face Gesture Interactions. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1--20. DOI: 10.1145/3613904.3642437
    • (2024) PalmSpace. International Journal of Human-Computer Studies 184(C). DOI: 10.1016/j.ijhcs.2024.103219
    • (2023) Commonsense Knowledge-Driven Joint Reasoning Approach for Object Retrieval in Virtual Reality. ACM Transactions on Graphics 42(6), 1--18. DOI: 10.1145/3618320
    • (2023) From Natural to Non-Natural Interaction: Embracing Interaction Design Beyond the Accepted Convention of Natural. Proceedings of the 25th International Conference on Multimodal Interaction, 684--688. DOI: 10.1145/3577190.3616122
    • (2023) SparseIMU: Computational Design of Sparse IMU Layouts for Sensing Fine-grained Finger Microgestures. ACM Transactions on Computer-Human Interaction 30(3), 1--40. DOI: 10.1145/3569894
    • (2023) HOOV: Hand Out-Of-View Tracking for Proprioceptive Interaction using Inertial Sensing. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1--16. DOI: 10.1145/3544548.3581468
