ABSTRACT
Gesture-based interactions are commonly used in mobile and ubiquitous environments. Multimodal interaction techniques have used lip gestures to enhance speech recognition or to control mouse movement on the screen. In this paper we extend this previous work and explore LUI: lip gestures as an alternative input technique for controlling user interface elements in a ubiquitous environment. In addition to using lips to control cursor movement, we use lip gestures to control music players and to activate menus. A LUI Motion-Action library is also provided to guide future interaction design using lip gestures.
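The abstract describes mapping recognized lip motions to interface actions (cursor control, music playback, menu activation). As a minimal sketch of how such a Motion-Action mapping could be structured, the snippet below defines a hypothetical dispatch table; the motion names, action names, and `dispatch` function are illustrative assumptions, not the paper's actual library:

```python
from enum import Enum, auto

class LipMotion(Enum):
    """Hypothetical lip-motion classes a recognizer might output."""
    OPEN = auto()
    PUCKER = auto()
    STRETCH_LEFT = auto()
    STRETCH_RIGHT = auto()

# Illustrative Motion-Action table: each recognized motion is bound
# to a named UI action (cursor movement, music control, menu activation).
MOTION_ACTIONS = {
    LipMotion.OPEN: "activate_menu",
    LipMotion.PUCKER: "play_pause_music",
    LipMotion.STRETCH_LEFT: "cursor_left",
    LipMotion.STRETCH_RIGHT: "cursor_right",
}

def dispatch(motion: LipMotion) -> str:
    """Return the UI action bound to a recognized lip motion,
    falling back to a no-op for unmapped motions."""
    return MOTION_ACTIONS.get(motion, "no_op")

print(dispatch(LipMotion.PUCKER))  # play_pause_music
```

A table-driven design like this keeps the recognizer decoupled from the interface layer, so new gesture-action bindings can be added without changing recognition code.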
Index Terms
- LUI: lip in multimodal mobile GUI interaction