Trust in automated systems: the effect of automation level on trust calibration

Author
Walliser, James C.
Date
2011-06
Advisor
Shattuck, Lawrence G.
Second Reader
Shearer, Robert L.
Abstract
Automated systems perform functions that were previously executed by a human. When using automation, the role of the human changes from operator to supervisor. For effective operation, the human must appropriately calibrate trust in the automated system. Improper trust leads to misuse and disuse of the system. The responsibilities of an automated system can be described by its level of automation. This study examined the effect of varying levels of automation and accuracy on trust calibration. Thirty participants were divided into three groups based on the system's level of automation and provided with an automated identification system. Within the Virtual Battlespace 2 environment, participants controlled the video feed of an unmanned aircraft while they identified friendly and enemy personnel on the ground. Results indicate a significant difference in the ability to correctly identify targets between levels of automation and accuracy. Participants exhibited better calibration at the management by consent level of automation and at the lower accuracy level. These findings demonstrate the necessity of continued research in the field of automation trust.
Description
Human Systems Integration Report
Rights
This publication is a work of the U.S. Government as defined in Title 17, United States Code, Section 101. Copyright protection is not available for this work in the United States.
Related items
Showing items related by title, author, creator, and subject.
- Mental models, trust, and reliance: exploring the effect of human perceptions on automation use
  Cassidy, Andrea M. (Monterey, California: Naval Postgraduate School, 2009-06); Today's military increasingly uses automation to perform or augment the performance of complex tasks. Automated systems that support or even make important decisions require human operators to understand and trust ...
- TECHNOLOGY TRUST: THE IMPACT OF ANTHROPOMORPHIC SYSTEM INFORMATION ON THE ACCEPTANCE OF AUTONOMOUS SYSTEMS USED IN HIGH-RISK APPLICATIONS
  Anderson, Michael G. (Monterey, CA: Naval Postgraduate School, 2021-03); Autonomous systems provide the military with advanced capabilities permitting the execution of increasingly dangerous and difficult missions. A human in the loop is still required to decide how and when to deploy these ...
- AUTOMATION AND ARTIFICIAL INTELLIGENCE FOR NAVAL ISR: U.S. NAVY VS. CHINA’S NAVY
  Hong, Crystal S. (Monterey, CA: Naval Postgraduate School, 2020-06); The U.S. Navy faces challenges as it moves toward automating the maritime battlespace and risks falling behind its rising great power competitor, the People’s Liberation Army Navy (PLAN). How are the U.S. Navy and the PLAN ...