DOI: 10.1145/2898375.2898385

Differences in trust between human and automated decision aids

Published: 19 April 2016

Abstract

Humans can easily find themselves in high-cost situations where they must choose between the suggestions of an automated decision aid and a conflicting human decision aid. Previous research indicates that people tend to rely on automation or on other humans, but not on both simultaneously. Expanding on work by Lyons and Stokes (2012), the current experiment measured how trust in automated and human decision aids varies with perceived risk and workload. In a simulated task, 126 participants chose the safest route for a military convoy while receiving conflicting information from an automated tool and a human. Results showed that as workload increased, trust in automation decreased, and as perceived risk increased, trust in the human decision aid increased. Individual differences in dispositional trust were associated with increased trust in both decision aids. These findings can inform training programs for operators who receive information from both human and automated sources, in contexts such as air traffic control, aviation, and signals intelligence.

References

[1] Biros, D., Daly, M., & Gunsch, G. (2004). The influence of task load and automation trust on deception detection. Group Decision and Negotiation, 173--189.
[2] Bisantz, A., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 85--97.
[3] Goldberg, L. (2015). A broad-bandwidth, public-domain personality inventory measuring the lower-level facets of several five-factor models. Retrieved November 29, 2015, from http://ipip.ori.org/newBroadbandText.htm
[4] Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors: The Journal of the Human Factors and Ergonomics Society, 221--242.
[5] Lee, J., & See, K. (2004). Trust in automation: Designing for appropriate reliance. Human Factors: The Journal of the Human Factors and Ergonomics Society, 50--80.
[6] Lyons, J., & Stokes, C. (2011). Human-human reliance in the context of automation. Human Factors: The Journal of the Human Factors and Ergonomics Society, 112--121.
[7] Mayer, R., Davis, J., & Schoorman, F. (1995). An integrative model of organizational trust. Academy of Management Review, 709--734.
[8] Serva, M., Fuller, M., & Mayer, R. (2005). The reciprocal nature of trust: A longitudinal study of interacting teams. Journal of Organizational Behavior, 625--648.



    Published In

    HotSoS '16: Proceedings of the Symposium and Bootcamp on the Science of Security
    April 2016
    138 pages
    ISBN: 978-1-4503-4277-3
    DOI: 10.1145/2898375

    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. automation
    2. decision-making
    3. reliance
    4. risk
    5. strain
    6. trust
    7. workload

    Qualifiers

    • Research-article

    Conference

    HotSoS '16: HotSoS 2016 Science of Security
    April 19 - 21, 2016
    Pittsburgh, Pennsylvania

    Acceptance Rates

    Overall Acceptance Rate 34 of 60 submissions, 57%


    Cited By

    • (2024) Exploring the Effects of User Input and Decision Criteria Control on Trust in a Decision Support Tool for Spare Parts Inventory Management. Proceedings of the International Conference on Mobile and Ubiquitous Multimedia, pp. 313-323. DOI: 10.1145/3701571.3701585. Online publication date: 1-Dec-2024.
    • (2024) When AI Fails, Who Do We Blame? Attributing Responsibility in Human–AI Interactions. IEEE Transactions on Technology and Society, 5(1), 61-70. DOI: 10.1109/TTS.2024.3370095. Online publication date: Mar-2024.
    • (2024) The Proxemic Influence on Trust in Triadic Human-Robot Interaction: Insights for Tele-Operative Sonography Assessment in Human-in-the-Loop Systems. 2024 IEEE International Conference on Systems, Man, and Cybernetics (SMC), pp. 3249-3254. DOI: 10.1109/SMC54092.2024.10831164. Online publication date: 6-Oct-2024.
    • (2023) Collaborative Writing in the Intelligence Community. International Journal of e-Collaboration, 19(1), 1-26. DOI: 10.4018/IJeC.324110. Online publication date: 9-Jun-2023.
    • (2021) How to Evaluate Trust in AI-Assisted Decision Making? A Survey of Empirical Methodologies. Proceedings of the ACM on Human-Computer Interaction, 5(CSCW2), 1-39. DOI: 10.1145/3476068. Online publication date: 18-Oct-2021.
    • (2019) Effects of the source of advice and decision task on decisions to request expert advice. Proceedings of the 24th International Conference on Intelligent User Interfaces, pp. 469-475. DOI: 10.1145/3301275.3302279. Online publication date: 17-Mar-2019.
    • (2019) ‘If You Agree with Me, Do I Trust You?’: An Examination of Human-Agent Trust from a Psychological Perspective. Intelligent Systems and Applications, pp. 994-1013. DOI: 10.1007/978-3-030-29513-4_73. Online publication date: 24-Aug-2019.
    • (2018) Who Should I Trust (Human vs. Automation)? The Effects of Pedigree in a Dual Advisor Context. Proceedings of the 20th Congress of the International Ergonomics Association (IEA 2018), pp. 10-17. DOI: 10.1007/978-3-319-96077-7_2. Online publication date: 7-Aug-2018.
    • (2017) The Effects of Pedigree and Source Type on Trust in a Dual Adviser Context. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 61(1), 319-323. DOI: 10.1177/1541931213601561. Online publication date: 28-Sep-2017.
    • (2017) Personal Influences on Dynamic Trust Formation in Human-Agent Interaction. Proceedings of the 5th International Conference on Human Agent Interaction, pp. 233-243. DOI: 10.1145/3125739.3125749. Online publication date: 17-Oct-2017.
