ABSTRACT
Gaze following, or our ability to attend to where others are looking, can be controlled top-down by contextual information about the social relevance of the gaze signal. In particular, it has been shown that gaze signals are followed more strongly when the gazer is believed to have a mind capable of intentional behavior (i.e., human) rather than being pre-programmed (i.e., robot). Perceiving human traits in nonhuman agents (i.e., anthropomorphism) occurs naturally in human-robot interaction, where it has positive effects on performance. It can also attenuate performance, however, if the robot is designed in a way that makes it hard to categorize as human or nonhuman (e.g., humanoid appearance) and thereby inflicts additional working-memory load due to categorical ambiguity. Here, we examine whether gaze signals of ambiguous humanoid agents are followed differently than those of unambiguous human or robot agents, and to what extent gaze following is affected by individual differences in working memory capacity (WMC). We expected participants with high (versus low) WMC to be more capable of suppressing reflexive gaze following toward the cued location in a counterpredictive paradigm (where targets appear with high likelihood at uncued locations), particularly when cued by humanoid gazers, since this top-down control requires cognitive resources. While the analysis showed no effect of categorical ambiguity on top-down control abilities overall, it revealed that participants with low WMC had weaker top-down control than participants with high WMC for the most ambiguous humanoid agent. The results are discussed with regard to the design of social agents and human-robot interaction.
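The dependent measure in such a counterpredictive paradigm is typically the gaze-cueing effect: the mean reaction-time advantage for targets at the gazed-at (cued) location over targets at uncued locations. Because the cue is counterpredictive, any residual advantage at the cued location reflects reflexive gaze following that top-down control failed to suppress. The sketch below illustrates this scoring scheme; the trial structure and all values are invented for illustration and are not the authors' actual data or analysis code.

```python
# Minimal sketch of scoring a gaze-cueing effect in a counterpredictive
# paradigm. Trial data below are invented for illustration only.

def cueing_effect(trials):
    """Mean RT at uncued locations minus mean RT at the cued location (ms).

    Positive values indicate residual reflexive orienting toward the
    gazed-at location; values near zero (or negative) indicate
    successful top-down suppression of gaze following.
    """
    cued = [t["rt"] for t in trials if t["target_at_cued"]]
    uncued = [t["rt"] for t in trials if not t["target_at_cued"]]
    return sum(uncued) / len(uncued) - sum(cued) / len(cued)

# Hypothetical participant who remains faster at the cued location
# even though the gaze cue is counterpredictive.
trials = [
    {"target_at_cued": True, "rt": 310.0},
    {"target_at_cued": True, "rt": 322.0},
    {"target_at_cued": False, "rt": 345.0},
    {"target_at_cued": False, "rt": 351.0},
]
print(cueing_effect(trials))  # positive -> reflexive following not suppressed
```

On this logic, high-WMC participants under a humanoid gazer should show a smaller cueing effect than low-WMC participants, which is the pattern the abstract reports for the most ambiguous agent.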
Differences in Working-Memory Capacity Modulate Top-down Control of Social Attention