ABSTRACT
We present ongoing research on the automatic segmentation of motion gestures tracked by inertial measurement units (IMUs). We postulate that recognizing gesture execution phases from motion data may allow us to automatically delimit user gesture entries. We demonstrate that machine learning classifiers can be trained to recognize three distinct phases of gesture entry: the start, the middle, and the end of a gesture motion. We further demonstrate that this classification can be performed at the level of individual gestures. In addition, we describe how we captured a new data set for data exploration, and we present a tool we developed for manual annotation of gesture phase information. Initial results obtained with the new data set, annotated using our tool, show a precision of 0.95 for recognition of the gesture phase alone and a precision of 0.93 for simultaneous recognition of the gesture phase and the gesture type.
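The abstract does not specify which classifier or features the authors used, so the following is only a minimal sketch of the general setup it describes: windows of 6-axis IMU data are labeled with one of three gesture phases (start, middle, end), a classifier is trained on them, and precision is measured. The synthetic data, window length, and choice of a random-forest classifier are all illustrative assumptions, not the paper's method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for IMU windows: each sample is a short window of
# 6-axis data (3-axis accelerometer + 3-axis gyroscope), flattened.
# Phase labels: 0 = start, 1 = middle, 2 = end of a gesture motion.
def make_windows(n_windows, phase):
    t = np.linspace(0.0, 1.0, 20)  # 20 samples per window (assumption)
    # Give each phase an arbitrary, distinguishable motion profile.
    profiles = {0: t, 1: np.sin(np.pi * t), 2: 1.0 - t}
    signal = np.tile(profiles[phase], 6)  # replicate across the 6 axes
    return signal + rng.normal(0.0, 0.1, size=(n_windows, signal.size))

X = np.vstack([make_windows(200, phase) for phase in (0, 1, 2)])
y = np.repeat([0, 1, 2], 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
prec = precision_score(y_te, clf.predict(X_te), average="macro")
print(f"macro precision for phase recognition: {prec:.2f}")
```

On real IMU recordings, the same pipeline would replace `make_windows` with sliding windows cut from the annotated sensor stream; extending the label set to (phase, gesture type) pairs gives the joint recognition task reported in the abstract.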
Towards Accurate Automatic Segmentation of IMU-Tracked Motion Gestures