ABSTRACT
In recent years, hand-tracking technologies have been implemented in Virtual Reality applications, allowing users to employ natural hand gestures to interact within the environment. However, little effort has been made to understand users' preferences when they use their hands to interact with the VR world. In this paper, we present the results of a guessability study of hand gestures for operating musical tasks, with the aim of supporting music interaction within an immersive Virtual Environment. A total of 750 gestures were elicited from 15 participants for 50 selected tasks, including 10 musical tasks. Our results enable a smaller gesture set to be derived from the users' proposals. The implications of this work are relevant to hand gesture design, gesture interaction, and the design of gestural interfaces for music interaction, all of which are highlighted in this study.
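The elicitation procedure described above typically reduces the full set of proposals (here, 750 gestures) to a consensus set by scoring the agreement among participants for each task (referent). A minimal sketch of the standard agreement score used in user-defined gesture studies (in the style of Wobbrock et al.) is shown below; the task and gesture labels are hypothetical, not taken from this paper.

```python
from collections import Counter

def agreement_score(proposals):
    """Agreement score for one referent (task):
    A_r = sum over groups of identical gestures of (|P_i| / |P|)^2,
    where P is the multiset of all proposals for the referent and
    P_i is one group of identical proposals within P."""
    total = len(proposals)
    counts = Counter(proposals)  # group identical gesture proposals
    return sum((n / total) ** 2 for n in counts.values())

# Hypothetical proposals for a "play note" referent from 15 participants:
proposals = ["tap"] * 9 + ["pinch"] * 4 + ["swipe"] * 2
score = agreement_score(proposals)  # (9/15)^2 + (4/15)^2 + (2/15)^2
```

For each task, the gesture group with the largest share of proposals is usually chosen for the consensus set, which is how a smaller user-defined gesture set emerges from the raw elicitation data.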
Index Terms
- A User-Defined Gesture Set for Music Interaction in Immersive Virtual Environment