PyGmI: creation and evaluation of a portable gestural interface

ABSTRACT
PyGmI, the Portable Gestural Interface, is a wearable tool for interacting with a system through simple hand gestures. The user wears colored markers on their fingers and a webcam on their chest. The implemented prototype lets the user view and navigate presentation files via a tiny projector fixed to the user's belt. Gesture recognition relies on color segmentation, marker tracking, and the Gesture and Activity Recognition Toolkit (GART). This article presents PyGmI: its setup, the designed gestures, the recognition modules, an application built on it, and finally an evaluation.
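To make the color-segmentation step concrete: the idea is to threshold each frame against a marker's known color and track the centroid of the matching pixels. The sketch below is a hypothetical, minimal illustration of that idea in plain NumPy (the function name `find_marker`, the RGB distance metric, and the tolerance value are assumptions, not the authors' implementation):

```python
import numpy as np

def find_marker(frame, target_rgb, tol=40):
    """Return the (row, col) centroid of pixels whose color is within
    `tol` (per-channel max difference) of target_rgb, or None if no
    pixel matches.  frame: H x W x 3 uint8 array."""
    diff = np.abs(frame.astype(int) - np.array(target_rgb)).max(axis=2)
    mask = diff <= tol            # binary segmentation of the marker color
    if not mask.any():
        return None               # marker not visible in this frame
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

# Toy frame: black background with a red "marker" patch.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[40:50, 60:70] = (255, 0, 0)
print(find_marker(frame, (255, 0, 0)))  # centroid near (44.5, 64.5)
```

Feeding the per-frame centroids of each finger marker into a tracker and then into a recognizer such as GART is the pipeline the abstract describes; a real system would segment in HSV space for robustness to lighting.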
REFERENCES
- Argyros, A. A. and Lourakis, M. I. A. 2006. Vision-based interpretation of hand gestures for remote control of a computer mouse. In Computer Vision in Human-Computer Interaction. Springer-Verlag, 40--51.
- Isard, M. and Blake, A. 1998. CONDENSATION - Conditional Density Propagation for Visual Tracking. Int. J. Comput. Vision 29, 1, 5--28.
- Löchtefeld, M., Gehring, S., Schöning, J., and Krüger, A. 2010. ShelfTorchlight: Augmenting a Shelf using a Camera Projector Unit. In Adjunct Proceedings of the Eighth International Conference on Pervasive Computing.
- Lenman, S., Bretzner, L., and Thuresson, B. 2002. Using marking menus to develop command sets for computer vision based hand gesture interfaces. In NordiCHI '02: Proceedings of the Second Nordic Conference on Human-Computer Interaction. ACM, New York, NY, USA, 239--242.
- Lyons, K., Brashear, H., Westeyn, T. L., Kim, J. S., and Starner, T. 2007. GART: The Gesture and Activity Recognition Toolkit. In HCI (3), J. A. Jacko, Ed. Lecture Notes in Computer Science, vol. 4552. Springer, 718--727.
- Mistry, P., Maes, P., and Chang, L. 2009. WUW - Wear Ur World: A Wearable Gestural Interface. In CHI '09: Proceedings of the 27th International Conference Extended Abstracts on Human Factors in Computing Systems. ACM, New York, NY, USA, 4111--4116.
- Mugellini, E., Abou Khaled, O., Pierroz, S., Carrino, S., and Chabbi Drissi, H. 2009. Generic Framework for Transforming Everyday Objects into Interactive Surfaces. In Proceedings of the 13th International Conference on Human-Computer Interaction, Part III. Springer-Verlag, Berlin, Heidelberg, 473--482.
- Sakata, N., Konishi, T., and Nishida, S. 2009. Mobile Interfaces Using Body Worn Projector and Camera. In VMR '09: Proceedings of the 3rd International Conference on Virtual and Mixed Reality. Springer-Verlag, Berlin, Heidelberg, 106--113.
- Starner, T. 2001. The Challenges of Wearable Computing: Part 1. IEEE Micro 21, 4, 44--52.