Mental models, trust, and reliance: exploring the effect of human perceptions on automation use

Authors
Cassidy, Andrea M.
Advisors
Shattuck, Lawrence G.
Date of Issue
2009-06
Publisher
Monterey, California. Naval Postgraduate School
Abstract
Today's military increasingly uses automation to perform or augment the performance of complex tasks. Automated systems that support or even make important decisions require human operators to understand and trust automation in order to rely on it appropriately. This study examined the effect of varying degrees of information about an automated system's reliability on mental model accuracy, trust in, and reliance on automation. Forty-two participants were divided into three groups based on level of information received about the reliability of a simulated automated target detection aid. One group received little information, one group received accurate information, and one group received inaccurate information about the target detection aid's reliability. Each participant completed a series of 120 tasks in which he or she was required to identify the presence of a threat target and then decide whether to use an automated aid for assistance. Results indicate a significant difference between the groups' trust in and reliance on automation. The experimental group that received little information trusted the automation less but relied on it more. These findings, accompanied by observational data collected regarding the formation of mental models, demonstrate the necessity of continued research in the field of automation trust.
Type
Thesis
Description
Human Systems Integration Report
Department
Human Systems Integration
Operations Research (OR)
Format
xvi, 75 p. : col. ill. ;
Distribution Statement
Approved for public release; distribution is unlimited.