ABSTRACT
Understanding the ability to coordinate with a partner is a key challenge in social signal processing and social robotics. In this paper, we designed a child-adult imitation task to investigate how automatically computable cues on turn-taking and movement can give insight into the high-level perception of coordination. First, we collected human questionnaire ratings to evaluate the perceived coordination of the dyads. Then, we extracted automatically computable cues and information on dialog acts from the video clips. The automatic cues characterized speech and gestural turn-taking and the coordinated movements of the dyad. Finally, we compared the human scores with the automatic cues to determine which cues were informative about the perception of coordination during the task. We found that the adult adjusted his or her behavior to the child's needs, and that a disruption of the gestural turn-taking rhythm was perceived negatively by the judges. We also found that judges rated dyads that talked more negatively, as speech occurred mainly when the child had difficulty imitating. Finally, coherence measures between the partners' movement features appeared better suited than correlation for characterizing their coordination.
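The abstract's last finding, that coherence is better suited than correlation for characterizing dyadic movement coordination, can be illustrated with a minimal sketch. The signals, sampling rate, and lag below are hypothetical (not the authors' actual pipeline): two partners share a rhythm, but the child imitates with a constant delay, which attenuates lag-0 correlation while frequency-domain coherence still captures the coupling.

```python
import numpy as np
from scipy.signal import coherence

# Hypothetical movement-energy time series for two partners,
# sampled at 25 Hz (illustrative values, not the paper's data).
fs = 25.0
rng = np.random.default_rng(0)
t = np.arange(0, 60, 1 / fs)
shared = np.sin(2 * np.pi * 0.5 * t)                  # shared 0.5 Hz rhythm
adult = shared + 0.5 * rng.standard_normal(t.size)
# The child imitates with a 0.4 s delay (10 samples), plus noise.
child = np.roll(shared, 10) + 0.5 * rng.standard_normal(t.size)

# Correlation: a single lag-0 score, penalized by the phase offset.
r = np.corrcoef(adult, child)[0, 1]

# Coherence: frequency-wise coupling, robust to a constant phase lag.
f, cxy = coherence(adult, child, fs=fs, nperseg=256)
peak_coh = cxy[np.argmin(np.abs(f - 0.5))]            # coupling at 0.5 Hz
```

With these settings, the magnitude-squared coherence at the shared 0.5 Hz rhythm stays close to 1 while the lag-0 correlation is much lower, which mirrors the abstract's conclusion for delayed imitation.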
Index Terms
- Characterization of coordination in an imitation task: human evaluation and automatically computable cues