DOI:10.1145/2087756.2087854

An integrated framework for universal motion control

Published: 11 December 2011

ABSTRACT

Advances in motion tracking and 3D display technologies are changing the input and output devices of the general human-computer interaction framework. Motion sensing devices now offer high tracking accuracy at relatively low prices, which makes motion-based interaction affordable and popular for general use. Taking into account the full range of interactions required on graphical user interfaces, we propose an integrated framework for motion control that seamlessly supports 2D, 3D, and motion gesture interactions. We categorize the general tasks and define four corresponding operating modes: 2D cursor, 3D manipulation, 3D navigation, and motion gesture. Trade-offs are made between generality, performance, and usability. With a careful design of the mapping, we believe the generality of motion control can outweigh the compromise in performance.
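As an illustration of the mode-based design described above, the sketch below shows one way a single 6 DOF input stream could be routed to the handler of the currently active operating mode. It is not from the paper; all names (OperatingMode, Pose6DOF, MotionController) and the placeholder handlers are hypothetical.

# Illustrative sketch only -- not from the paper. All names (OperatingMode,
# Pose6DOF, MotionController) are hypothetical.
from dataclasses import dataclass
from enum import Enum, auto

class OperatingMode(Enum):
    CURSOR_2D = auto()        # 2D cursor interaction on the GUI
    MANIPULATION_3D = auto()  # 6 DOF object manipulation
    NAVIGATION_3D = auto()    # viewpoint / camera control
    MOTION_GESTURE = auto()   # gesture mode (proposed, not implemented here)

@dataclass
class Pose6DOF:
    position: tuple      # (x, y, z) in the tracker frame
    orientation: tuple   # unit quaternion (w, x, y, z)

class MotionController:
    """Dispatches each 6 DOF sample to the handler of the active mode."""
    def __init__(self):
        self.mode = OperatingMode.CURSOR_2D

    def switch_mode(self, mode):
        self.mode = mode

    def handle(self, pose):
        handlers = {
            OperatingMode.CURSOR_2D: self.update_cursor,
            OperatingMode.MANIPULATION_3D: self.manipulate_object,
            OperatingMode.NAVIGATION_3D: self.move_camera,
            OperatingMode.MOTION_GESTURE: self.feed_gesture_recognizer,
        }
        handlers[self.mode](pose)

    # Placeholder handlers; a real system would map the pose onto the
    # corresponding 2D or 3D interaction here.
    def update_cursor(self, pose):
        print("2D cursor at", pose.position[:2])

    def manipulate_object(self, pose):
        print("manipulating object with pose", pose)

    def move_camera(self, pose):
        print("navigating with pose", pose)

    def feed_gesture_recognizer(self, pose):
        print("gesture sample", pose)

In such a design, switching modes only changes which handler consumes the pose stream; the tracking pipeline itself stays the same, which is what allows one device to drive 2D, 3D, and gesture interactions.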

In the implementation, a hybrid framework of optical and inertial sensing is used to achieve precise 6 DOF motion tracking. We develop two applications to demonstrate the usability of the integrated motion control framework across the first three operating modes. The motion gesture mode is proposed but not covered in this work.
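The abstract does not detail how the optical and inertial measurements are combined. The following is a minimal complementary-filter-style sketch, under the common assumption that the optical tracker supplies drift-free (but lower-rate) position and orientation while the IMU gyroscope supplies high-rate angular velocity; the function name fuse and the parameter alpha are hypothetical, not from the paper.

# Minimal complementary-filter sketch, not the authors' method: the abstract
# only states that optical and inertial sensing are combined for 6 DOF tracking.
# Assumption: the optical tracker gives drift-free position and orientation,
# while the IMU gyroscope gives high-rate angular velocity.
import numpy as np

def fuse(optical_pos, optical_quat, fused_quat, gyro_rate, dt, alpha=0.02):
    """One fusion step; alpha weights the slow optical correction."""
    # Integrate the gyroscope: q_dot = 0.5 * q (x) (0, omega)
    w, x, y, z = fused_quat
    ox, oy, oz = gyro_rate
    q_dot = 0.5 * np.array([
        -x * ox - y * oy - z * oz,
         w * ox + y * oz - z * oy,
         w * oy + z * ox - x * oz,
         w * oz + x * oy - y * ox,
    ])
    q_gyro = np.asarray(fused_quat, dtype=float) + q_dot * dt
    q_gyro /= np.linalg.norm(q_gyro)

    # Resolve the quaternion sign ambiguity before blending.
    optical_quat = np.asarray(optical_quat, dtype=float)
    if np.dot(q_gyro, optical_quat) < 0:
        optical_quat = -optical_quat

    # Pull the integrated orientation slowly toward the drift-free optical one.
    q_fused = (1.0 - alpha) * q_gyro + alpha * optical_quat
    q_fused /= np.linalg.norm(q_fused)

    # Position is taken directly from the optical tracker.
    return np.asarray(optical_pos, dtype=float), q_fused

In a scheme like this, the optical samples cancel gyroscope drift during slow motion, while the inertial stream keeps the estimate responsive between optical frames.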

Published in

      VRCAI '11: Proceedings of the 10th International Conference on Virtual Reality Continuum and Its Applications in Industry
      December 2011
      617 pages
      ISBN:9781450310604
      DOI:10.1145/2087756

      Copyright © 2011 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      • Published: 11 December 2011

      Qualifiers

      • research-article

      Acceptance Rates

Overall Acceptance Rate: 51 of 107 submissions, 48%
