ABSTRACT
Nowadays, humans are surrounded by many complex computer systems. When people interact with one another, they use multiple modalities, including voice, body posture, hand gestures, facial expressions, and eye gaze. Computers can currently understand only a small subset of these modalities, but such cues can be captured by a growing number of wearable devices. This research aims to improve traditional human-human and human-machine interaction by augmenting humans with wearable technology and developing novel user interfaces.
More specifically, we (i) investigate and develop systems that enable a group of people in close proximity to interact using in-air hand gestures and to share information effortlessly, and (ii) focus on eye gaze, which can further enrich the interaction between humans and cyber-physical systems.
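To make the gaze-gesture direction concrete, the following is a minimal, illustrative sketch of how a stream of gaze points could be quantized into directional strokes and matched against a small gesture vocabulary. It is not the thesis implementation; all names, thresholds, and the gesture set are assumptions made for illustration only.

```python
# Illustrative sketch only: quantize a gaze-point stream into directional
# strokes and look the sequence up in a gesture vocabulary. All identifiers
# (GESTURES, MIN_SACCADE_PX, quantize, recognize) are hypothetical.
import math

MIN_SACCADE_PX = 80  # assumed threshold: ignore fixational jitter and drift

# Hypothetical gesture vocabulary: each action is bound to a stroke sequence.
GESTURES = {
    ("R", "D"): "leave_message",
    ("L", "U", "R"): "read_message",
}

def quantize(points):
    """Turn a list of (x, y) gaze samples into coarse stroke directions."""
    strokes = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        if math.hypot(dx, dy) < MIN_SACCADE_PX:
            continue  # movement too small to count as a deliberate saccade
    # dominant axis decides the stroke label; screen y grows downward
        if abs(dx) >= abs(dy):
            strokes.append("R" if dx > 0 else "L")
        else:
            strokes.append("D" if dy > 0 else "U")
    return tuple(strokes)

def recognize(points):
    """Return the action bound to the observed stroke sequence, if any."""
    return GESTURES.get(quantize(points))

# Example: a right-then-down gaze sweep triggers 'leave_message'.
print(recognize([(0, 0), (200, 10), (210, 200)]))  # -> leave_message
```

A deployed system would of course sit behind an eye tracker and add fixation detection, noise filtering, and per-user calibration before any such quantization step; the sketch only shows the core mapping from gaze motion to discrete commands.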
Index Terms
- Augmenting human interaction capabilities with proximity, natural gestures, and eye gaze