ABSTRACT
In this paper, we describe techniques for bare-handed interaction between human and computer. Bare-handed means that no device and no wires are attached to the user, who controls the computer directly with the movements of their hand. Our approach is centered on the needs of the user. We therefore define requirements for real-time bare-handed interaction, derived from application scenarios and usability considerations. Based on these requirements, a finger-finding and hand-posture recognition algorithm is developed and evaluated. To demonstrate the strength of the algorithm, we build three sample applications. Finger tracking and hand-posture recognition are used to paint virtually onto the wall, to control a presentation with hand postures, and to move virtual items on the wall during a brainstorming session. We conclude the paper with user tests, conducted to validate the usability of bare-hand human-computer interaction.
Index Terms
- Bare-hand human-computer interaction