Looking my way through the menu: the impact of menu design and multimodal input on gaze-based menu selection

ABSTRACT
This paper reports a study investigating the effectiveness of two approaches to improving gaze-based interaction in realistic, complex menu selection tasks. The first approach identifies hierarchical menu designs that are particularly suitable for gaze-based interaction; the second combines gaze-based interaction with speech as a second input modality. In an experiment with 40 participants, we investigated the impact of menu design, input device, and navigation complexity on accuracy and completion time in a menu selection task, as well as on user satisfaction. The results for both objective task performance and subjective ratings confirmed our expectation that a semi-circle menu is better suited for gaze-based menu selection than either a linear or a full-circle menu. Contrary to our expectations, an input device based solely on eye gaze proved superior to the combined gaze- and speech-based device. Moreover, the drawbacks of a less suitable menu design (i.e., a linear or full-circle menu), as well as those of the multimodal input device, were especially pronounced in more complex navigation tasks.