DOI: 10.1145/3317956.3318152

Impact of variable positioning of text prediction in gaze-based text entry

Published: 25 June 2019

Abstract

Text predictions play an important role in improving the performance of gaze-based text entry systems. However, visual search, scanning, and selection of text predictions require a shift of the user's attention away from the keyboard layout. Hence, the spatial positioning of predictions becomes a critical aspect of the end-user experience. In this work, we investigate the role of spatial positioning by comparing the performance of three keyboards with different positions for text predictions. The experimental results show no significant differences in text entry performance: displaying suggestions closer to the visual fovea did not improve participants' text entry rate, yet participants used more keystrokes and backspace. This suggests that predictions are used unnecessarily when they remain within users' constant visual attention, resulting in an increased cost of correction. Furthermore, we argue that fast saccadic eye movements undermine the benefit of optimizing the spatial distance to the prediction positions.
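The entry rate and keystroke figures referred to in the abstract correspond to the standard metrics used in text entry evaluations: words per minute (WPM) and keystrokes per character (KSPC). The short Python sketch below is not from the paper; the function names and the sample trial values are illustrative only. It simply shows how these two measures are conventionally computed.

    def words_per_minute(transcribed: str, seconds: float) -> float:
        """Entry rate in WPM: (|T| - 1) characters over the trial time, with one 'word' = 5 characters."""
        return ((len(transcribed) - 1) / seconds) * (60.0 / 5.0)

    def keystrokes_per_character(keystrokes: int, transcribed: str) -> float:
        """KSPC: all keystrokes (letters, backspaces, suggestion selections) per character of the final text."""
        return keystrokes / len(transcribed)

    # Hypothetical trial: a 43-character phrase entered in 95 s using 62 keystrokes.
    phrase = "the quick brown fox jumps over the lazy dog"
    print(f"WPM:  {words_per_minute(phrase, 95.0):.2f}")        # ~5.31
    print(f"KSPC: {keystrokes_per_character(62, phrase):.2f}")  # ~1.44

Read against these metrics, the abstract's finding is that WPM was comparable across the three layouts, while KSPC rose (more keystrokes and backspaces) when suggestions stayed in constant view.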




    Published In

    ETRA '19: Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications
    June 2019
    623 pages
    ISBN:9781450367097
    DOI:10.1145/3314111

    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. gaze input
    2. interaction
    3. text entry
    4. text prediction
    5. variable position

    Qualifiers

    • Research-article

    Conference

    ETRA '19

    Acceptance Rates

    Overall Acceptance Rate 69 of 137 submissions, 50%


    Cited By

    • (2024) 40 Years of Eye Typing: Challenges, Gaps, and Emergent Strategies. Proceedings of the ACM on Human-Computer Interaction 8 (ETRA), 1-19. https://doi.org/10.1145/3655596. Online publication date: 28-May-2024.
    • (2024) Eye-Hand Typing: Eye Gaze Assisted Finger Typing via Bayesian Processes in AR. IEEE Transactions on Visualization and Computer Graphics 30, 5, 2496-2506. https://doi.org/10.1109/TVCG.2024.3372106. Online publication date: 19-Mar-2024.
    • (2023) Gaze Speedup: Eye Gaze Assisted Gesture Typing in Virtual Reality. Proceedings of the 28th International Conference on Intelligent User Interfaces, 595-606. https://doi.org/10.1145/3581641.3584072. Online publication date: 27-Mar-2023.
    • (2023) GlanceWriter: Writing Text by Glancing Over Letters with Gaze. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-13. https://doi.org/10.1145/3544548.3581269. Online publication date: 19-Apr-2023.
    • (2021) Nosype: A Novel Nose-tip Tracking-based Text Entry System for Smartphone Users with Clinical Disabilities for Touch-based Typing. Proceedings of the 23rd International Conference on Mobile Human-Computer Interaction, 1-16. https://doi.org/10.1145/3447526.3472054. Online publication date: 27-Sep-2021.
    • (2021) Hummer: Text Entry by Gaze and Hum. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 1-11. https://doi.org/10.1145/3411764.3445501. Online publication date: 6-May-2021.
    • (2020) Leveraging Error Correction in Voice-based Text Entry by Talk-and-Gaze. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1-11. https://doi.org/10.1145/3313831.3376579. Online publication date: 21-Apr-2020.
    • (2020) TAGSwipe: Touch Assisted Gaze Swipe for Text Entry. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1-12. https://doi.org/10.1145/3313831.3376317. Online publication date: 21-Apr-2020.