Research article
DOI: 10.1145/2898375.2898385

Differences in trust between human and automated decision aids

Published: 19 April 2016

ABSTRACT

Humans can easily find themselves in high-cost situations where they must choose between the suggestions of an automated decision aid and those of a conflicting human decision aid. Previous research indicates that humans often rely on automation or on other humans, but rarely on both simultaneously. Expanding on work by Lyons and Stokes (2012), the current experiment measured how trust in automated and human decision aids varies with perceived risk and workload. In a simulated task, 126 participants chose the safest route for a military convoy while presented with conflicting information from an automated tool and a human. Results demonstrated that as workload increased, trust in automation decreased, and as perceived risk increased, trust in the human decision aid increased. Higher dispositional trust correlated with increased trust in both decision aids. These findings can inform training programs for operators who receive information from both human and automated sources, in contexts such as air traffic control, aviation, and signals intelligence.

References

  1. Biros, D., Daly, M., & Gunsch, G. (2004). The influence of task load and automation trust on deception detection. Group Decision and Negotiation, 173--189.
  2. Bisantz, A., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 85--97.
  3. Goldberg, L. (2015). A broad-bandwidth, public-domain, personality inventory measuring the lower-level facets of several five-factor models. Retrieved November 29, 2015, from http://ipip.ori.org/newBroadbandText.htm
  4. Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors: The Journal of the Human Factors and Ergonomics Society, 221--242.
  5. Lee, J., & See, K. (2004). Trust in automation: Designing for appropriate reliance. Human Factors: The Journal of the Human Factors and Ergonomics Society, 50--80.
  6. Lyons, J., & Stokes, C. (2012). Human-human reliance in the context of automation. Human Factors: The Journal of the Human Factors and Ergonomics Society, 112--121.
  7. Mayer, R., Davis, J., & Schoorman, F. (1995). An integrative model of organizational trust. Academy of Management Review, 709--734.
  8. Serva, M., Fuller, M., & Mayer, R. (2005). The reciprocal nature of trust: A longitudinal study of interacting teams. Journal of Organizational Behavior, 625--648.

Published in

HotSos '16: Proceedings of the Symposium and Bootcamp on the Science of Security
April 2016, 138 pages
ISBN: 9781450342773
DOI: 10.1145/2898375

Copyright © 2016 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States



Acceptance Rates

Overall acceptance rate: 34 of 60 submissions, 57%
