Gaze+RST: Integrating Gaze and Multitouch for Remote Rotate-Scale-Translate Tasks

ABSTRACT
Our work investigates the use of gaze and multitouch to fluidly perform rotate-scale-translate (RST) tasks on large displays. Specifically, it aims to understand whether gaze can provide a benefit in such tasks, how task complexity affects performance, and how gaze and multitouch can be combined into an integral input structure suited to RST. We present four techniques, each striking a different balance between gaze-based and touch-based translation while supporting concurrent rotation and scaling operations. A 16-participant empirical evaluation revealed that three of the four techniques are viable options for this scenario, and that larger distances and rotation/scaling operations can significantly affect a gaze-based translation configuration. Furthermore, we uncover new insights into multimodal integrality, finding that gaze and touch can be combined into configurations corresponding to either integral or separable input structures.
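To make the interaction concrete, the split the abstract describes can be sketched as follows: rotation and scale are derived from two touch points (standard two-finger RST geometry), while translation is driven by gaze movement. This is a minimal illustrative sketch, not the paper's implementation; all function and parameter names are assumptions.

```python
import math

def rst_update(t1_prev, t2_prev, t1_cur, t2_cur, gaze_prev, gaze_cur):
    """Compute incremental rotate-scale-translate parameters.

    Rotation and scale come from the two touch points, as in standard
    two-finger RST; translation comes from gaze movement, as in a
    gaze-based translation configuration. Illustrative only -- names
    and structure are not taken from the paper.
    """
    # Vector between the two touch points, before and after the move.
    vx0, vy0 = t2_prev[0] - t1_prev[0], t2_prev[1] - t1_prev[1]
    vx1, vy1 = t2_cur[0] - t1_cur[0], t2_cur[1] - t1_cur[1]

    # Rotation: change in the angle of the inter-touch vector (radians).
    rotation = math.atan2(vy1, vx1) - math.atan2(vy0, vx0)

    # Scale: ratio of inter-touch distances.
    scale = math.hypot(vx1, vy1) / math.hypot(vx0, vy0)

    # Translation: driven by gaze rather than touch.
    translation = (gaze_cur[0] - gaze_prev[0], gaze_cur[1] - gaze_prev[1])

    return rotation, scale, translation
```

A purely touch-based variant would instead derive the translation from the motion of the touch centroid; the four techniques in the paper differ precisely in how they divide translation between these two channels.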