ABSTRACT
The ability to engage in natural language interaction in physically situated settings hinges on a set of competencies, such as managing conversational engagement, turn taking, understanding, language and behavior generation, and interaction planning. In human-human interaction, these are mixed-initiative, collaborative processes that often involve a wide array of finely coordinated verbal and non-verbal actions. Among these many channels, eye gaze, and more generally attention, play a fundamental role. In this talk, I will discuss a sample of the research we have conducted over the last few years on developing models to support physically situated dialog in relatively unconstrained environments. Throughout, I will highlight the role that gaze and attention play in these models. I will discuss and showcase several prototype systems we have developed, and describe opportunities for reasoning about, interpreting, and producing gaze signals in support of fluid, seamless spoken language interaction.
Index Terms
- Attention and Gaze in Situated Language Interaction