ABSTRACT
Computerized medical systems play a vital role in the operating room; however, sterility requirements and the interventional workflow often make interaction with these devices challenging for surgeons. Typical solutions, such as delegating physical control of the keyboard and mouse to assistants, add an undesirable level of indirection. We present a touchless, gesture-based interaction framework for the operating room that lets surgeons define a personalized set of gestures for controlling arbitrary computerized medical systems. Instead of using cameras to capture gestures, we rely on a small number of wireless inertial sensors placed on the surgeon's arms, eliminating any dependence on illumination and line of sight. A discriminative gesture recognition approach based on kernel regression allows us to simultaneously classify performed gestures and track the relative spatial pose within each gesture, giving surgeons fine-grained control of continuous parameters. An extensible software architecture enables a dynamic association of learned gestures with arbitrary intraoperative computerized systems. Our experiments illustrate the performance of our approach and support its practical applicability.
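The abstract's core idea, regressing a continuous gesture phase from inertial-sensor features with a kernel method, can be illustrated with a minimal Nadaraya-Watson sketch. This is not the authors' implementation; the one-dimensional features, the Gaussian bandwidth, and the linear phase labels are illustrative assumptions only.

```python
import numpy as np

def gaussian_kernel(x, xi, sigma=0.5):
    # RBF similarity between a query feature vector and one training sample.
    return np.exp(-np.linalg.norm(x - xi) ** 2 / (2.0 * sigma ** 2))

def kernel_regress(x, X_train, y_train, sigma=0.5):
    """Nadaraya-Watson estimate of a continuous target, e.g. the relative
    phase (0 = start, 1 = end) within a learned gesture."""
    w = np.array([gaussian_kernel(x, xi, sigma) for xi in X_train])
    w = w / (w.sum() + 1e-12)          # normalize weights to sum to one
    return float(w @ y_train)

# Toy training data: scalar "orientation" features sampled along one gesture,
# labelled with a phase that grows linearly from 0 to 1.
X_train = np.linspace(0.0, 1.0, 11).reshape(-1, 1)
y_train = np.linspace(0.0, 1.0, 11)

# Query a new sensor reading; the estimate interpolates nearby phase labels.
phase = kernel_regress(np.array([0.42]), X_train, y_train, sigma=0.1)
```

In the paper's setting, one such regressor per trained gesture would also yield a classification signal: the gesture whose training samples best explain the current sensor stream (largest kernel mass) is the recognized one, while the regressed phase drives a continuous parameter such as a slider or image scroll.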