ABSTRACT
Humans can easily find themselves in high-cost situations where they must choose between a suggestion from an automated decision aid and a conflicting recommendation from a human. Previous research indicates that people tend to rely on either automation or other humans, but not both simultaneously. Expanding on work by Lyons and Stokes (2012), the current experiment measured how trust in automated and human decision aids varies with perceived risk and workload. In a simulated task, 126 participants chose the safest route for a military convoy while receiving conflicting information from an automated tool and a human. Results showed that as workload increased, trust in automation decreased, and as perceived risk increased, trust in the human decision aid increased. Higher dispositional trust correlated with increased trust in both decision aids. These findings can inform training programs for operators who receive information from both human and automated sources, such as those working in air traffic control, aviation, and signals intelligence.
REFERENCES
- Biros, D., Daly, M., & Gunsch, G. (2004). The influence of task load and automation trust on deception detection. Group Decision and Negotiation, 173–189.
- Bisantz, A., & Seong, Y. (2001). Assessment of operator trust in and utilization of automated decision-aids under different framing conditions. International Journal of Industrial Ergonomics, 85–97.
- Goldberg, L. (2015). A broad-bandwidth, public-domain, personality inventory measuring the lower-level facets of several five-factor models. Retrieved November 29, 2015, from http://ipip.ori.org/newBroadbandText.htm
- Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an "aid" can (and should) go unused. Human Factors: The Journal of the Human Factors and Ergonomics Society, 221–242.
- Lee, J., & See, K. (2004). Trust in automation: Designing for appropriate reliance. Human Factors: The Journal of the Human Factors and Ergonomics Society, 50–80.
- Lyons, J., & Stokes, C. (2012). Human-human reliance in the context of automation. Human Factors: The Journal of the Human Factors and Ergonomics Society, 112–121.
- Mayer, R., Davis, J., & Schoorman, F. (1995). An integrative model of organizational trust. Academy of Management Review, 709–734.
- Serva, M., Fuller, M., & Mayer, R. (2005). The reciprocal nature of trust: A longitudinal study of interacting teams. Journal of Organizational Behavior, 625–648.