ABSTRACT
Gesture-based interaction remains underutilized in mobile contexts despite the considerable attention it has received. Using the accelerometers widely available in mobile devices, we developed M.Gesture, a software system that supports accelerometer-based gesture authoring on single or multiple mobile devices. Its design is grounded in a formative study that showed users' preferences for subtle, simple motions and for synchronized, multi-device gestures. M.Gesture adopts an acceleration data space and interface components based on a mass-spring analogy, combining the strengths of demonstration-based and declarative approaches: a gesture is declared by specifying the trajectory of a virtual mass-spring with planes in the acceleration space, and multi-level feedback supports iterative gesture modification. Evaluative studies showed good usability and higher recognition performance than dynamic time warping for simple gesture authoring. We also discuss the benefits of applying a physical metaphor and a hybrid authoring approach.
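The declaration scheme the abstract describes can be sketched as follows: a virtual mass on a spring is driven by the accelerometer stream, and a gesture condition is expressed as the mass's trajectory crossing a declared plane in acceleration space. This is an illustrative sketch, not the authors' implementation; the spring constant, damping, time step, and plane parameters are all assumptions.

```python
import numpy as np

def mass_spring_trajectory(accel, k=30.0, c=2.0 * np.sqrt(30.0), dt=0.02):
    """Integrate a unit mass on a spring anchored at the origin,
    forced by a stream of 3-axis acceleration samples (assumed 50 Hz).
    Returns the trajectory as an (N, 3) array."""
    pos = np.zeros(3)
    vel = np.zeros(3)
    traj = []
    for a in accel:
        # External forcing minus spring restoring force minus damping.
        force = np.asarray(a, dtype=float) - k * pos - c * vel
        vel += force * dt          # semi-implicit Euler step
        pos += vel * dt
        traj.append(pos.copy())
    return np.array(traj)

def crosses_plane(traj, normal, offset):
    """True if the trajectory crosses the plane n.x = offset
    from the negative side to the positive side."""
    side = traj @ np.asarray(normal, dtype=float) - offset
    return bool(np.any((side[:-1] < 0) & (side[1:] >= 0)))

# Example: a burst of +x acceleration pushes the mass across a
# declared plane at x = 0.05 (hypothetical gesture condition).
samples = [[9.0, 0.0, 0.0]] * 25 + [[0.0, 0.0, 0.0]] * 25
traj = mass_spring_trajectory(samples)
print(crosses_plane(traj, [1.0, 0.0, 0.0], 0.05))  # → True
```

The spring pulls the mass back toward the origin, so only sustained or sharp accelerations push it far enough to trigger a plane crossing, which is one way a physical metaphor can filter incidental motion from intended gestures.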
Index Terms
- M.Gesture: An Acceleration-Based Gesture Authoring System on Multiple Handheld and Wearable Devices