DOI: 10.1145/3183654.3183686
Research Article

Differences in Working-Memory Capacity Modulate Top-down Control of Social Attention

Published: 05 April 2018

ABSTRACT

Gaze following, our ability to attend to where others are looking, can be controlled top-down by contextual information about the social relevance of the gaze signal. In particular, it has been shown that gaze signals are followed more strongly when the gazer is believed to have a mind capable of intentional behavior (i.e., a human) than when it is believed to be pre-programmed (i.e., a robot). Perceiving human traits in nonhuman agents (i.e., anthropomorphism) occurs naturally in human-robot interaction, where it has positive effects on performance. It can also attenuate performance if the robot is designed in a way that makes it hard to categorize as human or nonhuman (e.g., a humanoid appearance), because the categorical ambiguity imposes additional working-memory load. Here, we examine whether the gaze signals of ambiguous humanoid agents are followed differently than those of unambiguously human or robot agents, and to what extent gaze following is affected by individual differences in working-memory capacity (WMC). We expected participants with high (versus low) WMC to be better able to suppress reflexive gaze following toward the cued location in a counterpredictive paradigm (where targets appear with high likelihood at uncued locations), particularly when cued by humanoid gazers, since this top-down control requires cognitive resources. While the analysis showed no overall effect of categorical ambiguity on top-down control, it revealed that participants with low WMC exerted weaker top-down control than participants with high WMC for the most ambiguous humanoid agent. The results are discussed with regard to the design of social agents and human-robot interaction.
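
To make the key dependent measure concrete, the following is a minimal Python sketch of how a gaze-cueing (validity) effect in a counterpredictive paradigm could be computed and compared between WMC groups. All reaction times, WMC scores, and the median-split cutoff are hypothetical illustrations, not data or analysis code from the paper.

```python
# Minimal sketch (not the authors' analysis code): computing a gaze-cueing
# "validity effect" in a counterpredictive paradigm and comparing it across
# working-memory-capacity (WMC) groups. All numbers below are hypothetical.
from statistics import mean

# Hypothetical per-participant means:
# (RT at gazed-at/cued location in ms, RT at uncued location in ms, WMC score)
participants = [
    (322, 345, 58),   # still faster at the cued location: cue not fully suppressed
    (331, 329, 71),   # little or no reflexive gaze following
    (318, 352, 49),
    (327, 333, 66),
]

def validity_effect(rt_cued, rt_uncued):
    # Positive values mean faster responses at the cued location, i.e. the
    # reflexive pull of the gaze cue was suppressed less effectively
    # (weaker top-down control in a counterpredictive design).
    return rt_uncued - rt_cued

wmc_cutoff = 60  # hypothetical median split
low_wmc  = [validity_effect(c, u) for c, u, wmc in participants if wmc < wmc_cutoff]
high_wmc = [validity_effect(c, u) for c, u, wmc in participants if wmc >= wmc_cutoff]

print(f"Mean validity effect, low WMC:  {mean(low_wmc):.1f} ms")
print(f"Mean validity effect, high WMC: {mean(high_wmc):.1f} ms")
```

In this framing, a larger positive validity effect for the low-WMC group when cued by the most ambiguous humanoid agent is how the reported weaker top-down control would show up.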

Published in

TechMindSociety '18: Proceedings of the Technology, Mind, and Society
April 2018, 143 pages
ISBN: 978-1-4503-5420-2
DOI: 10.1145/3183654
Copyright © 2018 ACM


Publisher: Association for Computing Machinery, New York, NY, United States

Acceptance Rates

TechMindSociety '18 paper acceptance rate: 17 of 63 submissions (27%).