ABSTRACT
In safety- and security-critical applications such as video surveillance, it is crucial that human operators detect task-relevant events in continuous video streams and select them for reporting or dissemination to other authorities. The selection is usually performed with a manual input device such as a mouse or a joystick. Because of the visually rich and dynamic input, the sustained attention required, the long working hours, and the difficulty of manually selecting moving objects, relevant events are sometimes missed. To alleviate this problem, we propose a complementary event selection process based on eye-brain input. Our approach combines eye tracking and EEG to provide spatio-temporal event selection without any manual intervention. We report ongoing research that builds on prior work in which we demonstrated the general feasibility of the approach. In this contribution, we test the approach using more advanced and less artificial experimental paradigms that simulate frequently occurring, basic types of real surveillance tasks. These paradigms come much closer to a real surveillance task in terms of the visual stimuli used, the more subtle cues indicating events, and the required viewing behavior. We conducted an experiment (N=10) with non-experts. The results confirm the feasibility of the approach for event selection in the advanced tasks: we achieve spatio-temporal event selection accuracies of up to 77% and 60% for different stages of event indication.
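The core idea — gaze supplies the *spatial* component of a selection (where the operator is looking) and EEG supplies the *temporal* confirmation (that an event was perceived) — can be illustrated with a minimal sketch. Everything below is a placeholder assumption, not the authors' pipeline: the data are synthetic, the fixation detector is a generic dispersion-based (I-DT style) routine, and the EEG "classifier" is reduced to a mean-amplitude threshold where the actual system would use a trained classifier.

```python
# Illustrative sketch only: fuse a dispersion-based fixation detector
# with a placeholder EEG amplitude check to select one event in space
# (fixation centroid) and time (fixation onset). All data and
# thresholds are hypothetical.

def _dispersion(window):
    """Dispersion of a gaze window: x-range plus y-range (pixels)."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixation(gaze, max_dispersion=10.0, min_samples=5):
    """Return (start, end, centroid) of the first fixation, else None."""
    i = 0
    while i + min_samples <= len(gaze):
        j = i + min_samples
        if _dispersion(gaze[i:j]) <= max_dispersion:
            # Grow the window while the samples stay tightly clustered.
            while j < len(gaze) and _dispersion(gaze[i:j + 1]) <= max_dispersion:
                j += 1
            window = gaze[i:j]
            cx = sum(p[0] for p in window) / len(window)
            cy = sum(p[1] for p in window) / len(window)
            return i, j, (cx, cy)
        i += 1
    return None

def select_event(gaze, eeg, eeg_threshold=2.0, epoch_len=5):
    """Select an event: gaze supplies *where*, EEG confirms *when*.

    gaze: list of (x, y) samples; eeg: one amplitude per gaze sample
    (same clock). Returns (onset_index, (x, y)) or None.
    """
    fix = detect_fixation(gaze)
    if fix is None:
        return None
    start, _end, centroid = fix
    epoch = eeg[start:start + epoch_len]
    # Stand-in for a trained classifier: mean amplitude over the
    # post-onset epoch must exceed a threshold.
    if sum(epoch) / len(epoch) > eeg_threshold:
        return start, centroid
    return None

# Demo: four scattered samples, then a tight cluster near (300, 300)
# accompanied by an elevated EEG response.
gaze = [(0.0, 0.0), (120.0, 40.0), (250.0, 10.0), (60.0, 200.0)] + [
    (300.0, 300.0), (301.0, 299.0), (299.0, 301.0), (300.5, 300.5),
    (299.5, 299.5), (300.0, 301.0), (301.0, 300.0), (300.0, 299.0),
    (299.0, 300.0), (300.5, 299.5),
]
eeg = [0.0] * 4 + [5.0] * 10
event = select_event(gaze, eeg)  # -> (4, (300.05, 299.95))
```

Note the design split: if the EEG epoch does not confirm the event, the fixation alone selects nothing, which is what makes the selection intervention-free without turning every glance into a selection.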