ABSTRACT
Large displays are everywhere, yet the computer mouse remains the most common tool for interacting with them. We propose a new approach to fingertip interaction with large display systems using monocular computer vision. By taking into account the location of the user and the available interaction area, we estimate an interaction surface, a virtual touchscreen, between the display and the user. Users can then use their pointing finger to interact with the display as if it were brought forward and presented directly in front of them, while preserving their viewing angle. An interaction model based on the head-hand line method is presented to describe interaction with the virtual touchscreen. Initial results, in the form of a work-in-progress prototype, demonstrate the feasibility of this concept.
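The head-hand line method described above can be understood as a ray-casting problem: a ray is cast from the user's head through the fingertip and intersected with the plane of the virtual touchscreen. The sketch below illustrates that geometry only; the function names, coordinate frame, and plane parameters are illustrative assumptions, not the paper's implementation.

```python
# Hedged sketch of head-hand line pointing: intersect the ray from the
# head position through the fingertip with a plane (the virtual
# touchscreen). Coordinates and names are illustrative assumptions.

def head_hand_intersection(head, finger, plane_point, plane_normal):
    """Return the 3D point where the head->finger ray meets the plane,
    or None if the ray is parallel to the plane or points away from it."""
    # Ray direction: from head through fingertip.
    d = tuple(f - h for h, f in zip(head, finger))
    # Denominator of the standard ray-plane intersection formula.
    denom = sum(n * di for n, di in zip(plane_normal, d))
    if abs(denom) < 1e-9:
        return None  # ray parallel to the plane
    # t such that head + t*d lies on the plane.
    t = sum(n * (p - h) for n, p, h in
            zip(plane_normal, plane_point, head)) / denom
    if t < 0:
        return None  # intersection behind the user
    return tuple(h + t * di for h, di in zip(head, d))

# Example: head 2 m from a display in the z=0 plane, fingertip slightly
# to the right and halfway toward the screen.
hit = head_hand_intersection(
    head=(0.0, 0.0, 2.0),
    finger=(0.1, 0.0, 1.5),
    plane_point=(0.0, 0.0, 0.0),
    plane_normal=(0.0, 0.0, 1.0),
)
```

In this example the ray extends the head-fingertip offset until it reaches the screen plane, landing at (0.4, 0.0, 0.0); the same construction works for a virtual touchscreen plane placed between the user and the display by choosing a different `plane_point`.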
Index Terms
- Estimating virtual touchscreen for fingertip interaction with large displays