research-article
DOI: 10.1145/2166966.2166981

An adaptive solution for intra-operative gesture-based human-machine interaction

Published: 14 February 2012

ABSTRACT

Computerized medical systems play a vital role in the operating room; however, sterility requirements and the interventional workflow often make interaction with these devices challenging for surgeons. Typical solutions, such as delegating physical control of keyboard and mouse to assistants, add an undesirable level of indirection. We present a touchless, gesture-based interaction framework for the operating room that lets surgeons define a personalized set of gestures for controlling arbitrary medical computerized systems. Instead of using cameras for capturing gestures, we rely on a few wireless inertial sensors placed on the arms of the surgeon, eliminating the dependence on illumination and line-of-sight. A discriminative gesture recognition approach based on kernel regression allows us to simultaneously classify performed gestures and to track the relative spatial pose within each gesture, giving surgeons fine-grained control of continuous parameters. An extensible software architecture enables a dynamic association of learned gestures with arbitrary intraoperative computerized systems. Our experiments illustrate the performance of our approach and encourage its practical applicability.
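The kernel-regression idea sketched in the abstract (jointly classifying a gesture and estimating the relative pose, or phase, within it) can be illustrated roughly as follows. This is a toy Nadaraya-Watson estimator over hypothetical 1-D feature vectors; the paper's actual sensor features, kernel choice, and training procedure are not specified in this abstract, so everything here is an illustrative assumption:

```python
import numpy as np

def gaussian_kernel(x, X, sigma=0.5):
    # Gaussian similarity between a query vector x and each
    # training sample (one row of X).
    d2 = np.sum((X - x) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def classify_and_track(x, X_train, labels, phases, sigma=0.5):
    """Weight every training sample by its kernel similarity to the
    query, then (1) pick the gesture class with the largest total
    weight and (2) regress the relative phase (0..1) within it."""
    w = gaussian_kernel(x, X_train, sigma)
    classes = np.unique(labels)
    class_weights = np.array([w[labels == c].sum() for c in classes])
    best = classes[np.argmax(class_weights)]
    mask = labels == best
    # Nadaraya-Watson estimate of the phase within the chosen gesture.
    phase = np.dot(w[mask], phases[mask]) / (w[mask].sum() + 1e-12)
    return best, phase

# Toy training set: two 1-D "gestures" sampled along their progress.
X_train = np.array([[0.0], [0.5], [1.0],   # gesture 0
                    [2.0], [2.5], [3.0]])  # gesture 1
labels  = np.array([0, 0, 0, 1, 1, 1])
phases  = np.array([0.0, 0.5, 1.0, 0.0, 0.5, 1.0])

g, p = classify_and_track(np.array([2.4]), X_train, labels, phases)
# g identifies gesture 1; p lies roughly mid-gesture.
```

The appeal of this formulation for the scenario described above is that a single regression simultaneously yields a discrete decision (which gesture) and a continuous output (how far along it is), which is what enables fine-grained control of continuous parameters.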


Supplemental Material

iui-157-file3.avi (AVI, 20.8 MB)


Published in

IUI '12: Proceedings of the 2012 ACM international conference on Intelligent User Interfaces
February 2012, 436 pages
ISBN: 9781450310482
DOI: 10.1145/2166966

Copyright © 2012 ACM

        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

        Publisher

        Association for Computing Machinery

        New York, NY, United States



        Acceptance Rates

Overall Acceptance Rate: 746 of 2,811 submissions, 27%
