ABSTRACT
Advances in motion tracking and 3D display technologies are reshaping the input and output devices of today's general human-computer interaction framework. Motion sensing devices now provide high tracking accuracy at relatively low prices, making motion-based interaction affordable and popular for general use. Taking into account the full range of interactions required on graphical user interfaces, we propose an integrated framework for motion control that seamlessly supports 2D, 3D, and motion gesture interactions. We categorize the general tasks and define four corresponding operating modes: 2D cursor, 3D manipulation, 3D navigation, and motion gesture. Trade-offs are made among generality, performance, and usability. With a careful mapping design, we believe the generality of motion control can outweigh the compromise in performance.
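The four operating modes above suggest a controller that routes each tracked pose to a mode-specific mapping. The following is a minimal sketch of such a dispatcher, not the paper's implementation; the class, handler names, and the screen-plane mapping for the 2D cursor mode are illustrative assumptions.

```python
from enum import Enum, auto

class Mode(Enum):
    CURSOR_2D = auto()
    MANIPULATION_3D = auto()
    NAVIGATION_3D = auto()
    MOTION_GESTURE = auto()

class MotionController:
    """Hypothetical dispatcher: routes each 6-DOF pose sample
    (position, orientation) to the handler of the active mode."""

    def __init__(self, screen_w=1920, screen_h=1080):
        self.mode = Mode.CURSOR_2D
        self.screen_w, self.screen_h = screen_w, screen_h

    def process(self, position, orientation):
        if self.mode is Mode.CURSOR_2D:
            # Map the x/y components of the tracked position (assumed
            # normalized to [0, 1]) onto screen pixels; depth is ignored.
            x, y, _ = position
            return ("cursor", int(x * self.screen_w), int(y * self.screen_h))
        if self.mode is Mode.MANIPULATION_3D:
            # Full 6 DOF drive the selected object's transform directly.
            return ("manipulate", position, orientation)
        if self.mode is Mode.NAVIGATION_3D:
            # Pose offsets steer the virtual camera instead of an object.
            return ("navigate", position, orientation)
        # Motion gesture mode: samples would be buffered for recognition.
        return ("gesture", position, orientation)

ctrl = MotionController()
print(ctrl.process((0.5, 0.5, 0.3), (0.0, 0.0, 0.0, 1.0)))  # ('cursor', 960, 540)
```

Keeping the mode switch outside the handlers lets one tracking pipeline serve all four interaction styles, which is the kind of integration the abstract argues for.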
In the implementation, a hybrid framework of optical and inertial sensing is used to achieve precise 6-DOF motion tracking. We develop two applications to demonstrate the usability of the integrated motion control framework across the first three operating modes. The motion gesture mode is proposed but not covered in this work.
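The abstract does not detail how the two sensing streams are fused. One common split, sketched here purely as an assumption, is to take drift-free position from the optical tracker (lightly smoothed against jitter) and orientation from the inertial sensor's quaternion, yielding a combined 6-DOF pose; the function and parameter names are hypothetical.

```python
import math

def normalize(q):
    """Renormalize a quaternion to unit length to guard against
    numeric drift in the inertial orientation estimate."""
    n = math.sqrt(sum(c * c for c in q))
    return tuple(c / n for c in q)

def fuse_pose(optical_pos, imu_quat, prev_pos=None, alpha=0.3):
    """Hypothetical optical-inertial fusion for one 6-DOF pose:
    - position from the optical tracker (drift-free but jittery),
      smoothed with an exponential filter (alpha = new-sample weight);
    - orientation from the inertial sensor's quaternion."""
    if prev_pos is None:
        pos = tuple(optical_pos)
    else:
        pos = tuple(alpha * o + (1 - alpha) * p
                    for o, p in zip(optical_pos, prev_pos))
    return pos, normalize(imu_quat)

pose = fuse_pose((0.1, 0.2, 0.3), (0.0, 0.0, 0.0, 2.0))
print(pose)  # ((0.1, 0.2, 0.3), (0.0, 0.0, 0.0, 1.0))
```

The actual system may well use a different estimator (e.g. a Kalman filter) or a different division of labor between the sensors; this sketch only illustrates why a hybrid of the two modalities can deliver both accuracy and responsiveness.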
Index Terms
- An integrated framework for universal motion control