INFLUENCING TRUST IN HUMAN AND ARTIFICIAL INTELLIGENCE TEAMING THROUGH HEURISTICS
Authors
Thompson, Joel E.
Subjects
heuristic
bias
decision-making
artificial intelligence
AI
human and artificial intelligence team
trust
trustworthy
influence
deception
unjustified trust
unjustified mistrust
Advisors
McGuire, Mollie R.
Date of Issue
2021-06
Publisher
Monterey, CA: Naval Postgraduate School
Abstract
This thesis analyzes potential methods of influencing trust within military units that employ artificial intelligence (AI) systems. AI systems are being developed to enhance human decision-making and, when employed properly, can greatly increase the rate at which actions are taken, a key requirement for generating combat power. Human and AI teams rely on the user’s trust in the AI system, and that trust is influenced by rational, affective, and normative trust factors. This thesis examines those factors and determines that only rational trust factors are directly connected to the trustworthiness of the AI, and that the user’s trust can therefore be influenced independently of the AI’s trustworthiness through affective and normative trust factors. Substituting affective and normative trust factors for rational trust factors produces unjustified trust, because the resulting trust does not depend on the trustworthiness of the AI.
Type
Thesis
Department
Information Sciences (IS)
Distribution Statement
Approved for public release. Distribution is unlimited.
Rights
This publication is a work of the U.S. Government as defined in Title 17, United States Code, Section 101. Copyright protection is not available for this work in the United States.
