Abstract
In this paper we present a detailed comparison of algorithms and devices for determining the number of words a person reads in everyday life. We call our system the "Wordometer". Our experiments used three kinds of eye-tracking system: mobile video-oculography (MVoG), stationary video-oculography (SVoG), and electro-oculography (EoG). By analyzing eye movements, we estimated the number of words a user read. Because inexpensive eye trackers have recently appeared on the market, we undertook a large-scale experiment comparing three devices that can be used for daily reading on a screen: the Tobii EyeX (SVoG), the JINS MEME (EoG), and the Pupil (MVoG). We found that, when used with the Wordometer, the accuracy of these everyday-life devices was similar to that of professional devices. We also analyzed the robustness of the systems to special reading behaviors such as rereading and skipping.
With the MVoG, SVoG, and EoG systems, we obtained estimation errors of 7.2%, 13.0%, and 10.6%, respectively, in our main experiment. Across all our experiments, we collected 300 recordings from 14 participants, totaling 109,097 read words.
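To make the word-count idea concrete, the sketch below illustrates one common way such an estimator can be built: group raw gaze samples into fixations with a dispersion-threshold (I-DT) scheme, then count forward (left-to-right) saccades between fixations and scale by an assumed words-per-saccade factor. This is a hypothetical minimal sketch, not the paper's actual algorithm; the function names, thresholds, and the 1.5 words-per-saccade factor are all illustrative assumptions.

```python
# Hypothetical Wordometer-style word-count sketch (not the paper's algorithm).
# Assumption: during normal reading, each forward saccade advances roughly
# one to two words, so forward fixation-to-fixation jumps approximate
# the number of words read.

def detect_fixations(gaze, dispersion_thresh=0.01, min_points=5):
    """Group consecutive gaze samples (x, y) into fixations using a
    simple dispersion-threshold (I-DT) scheme; return fixation centroids."""
    fixations = []
    window = []
    for point in gaze:
        window.append(point)
        xs = [p[0] for p in window]
        ys = [p[1] for p in window]
        # Dispersion = x-range + y-range of the current window.
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) > dispersion_thresh:
            done = window[:-1]  # the stable samples before the jump
            if len(done) >= min_points:
                fixations.append((sum(p[0] for p in done) / len(done),
                                  sum(p[1] for p in done) / len(done)))
            window = [point]  # start a new window at the outlier sample
    if len(window) >= min_points:  # flush the final fixation, if any
        fixations.append((sum(p[0] for p in window) / len(window),
                          sum(p[1] for p in window) / len(window)))
    return fixations

def estimate_words(fixations, words_per_forward_saccade=1.5):
    """Count forward (left-to-right) saccades between successive fixations
    and scale by an assumed words-per-saccade factor."""
    forward = sum(1 for a, b in zip(fixations, fixations[1:]) if b[0] > a[0])
    return round(forward * words_per_forward_saccade)
```

For example, synthetic gaze data with four stable points along one line of text yields four fixations and three forward saccades. In practice, regressions (rereading) and return sweeps at line ends would need dedicated handling, which is part of what distinguishes the systems compared in this paper.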
Wordometer Systems for Everyday Life