Understanding, Assessing, and Mitigating Safety Risks in Artificial Intelligence Systems
Abstract
Traditional software safety techniques rely on validating software against a deductively defined specification of how the software should behave in particular situations. In the case of AI systems, specifications are often implicit or inductively defined. Data-driven methods are subject to sampling error, since practical datasets cannot provide exhaustive coverage of all possible events in a real physical environment. Traditional software verification and validation approaches therefore may not apply directly to these novel systems, complicating the conduct of systems safety analysis (such as that prescribed by MIL-STD-882). However, AI offers advanced capabilities, and it is desirable to ensure the safety of systems that rely on those capabilities. When AI technology is deployed in a weapon system, robot, or planning system, unwanted events are possible. Several techniques can support the evaluation process for understanding the nature and likelihood of unwanted events in AI systems and for making risk decisions about naval employment. This research surveys the state of the art and evaluates which of these techniques are most likely to be employable, usable, and correct. The techniques considered include software analysis, simulation environments, and mathematical determinations.
Description
Prepared for: Naval Air Warfare Development Center (NAVAIR)
Rights
This publication is a work of the U.S. Government as defined in Title 17, United States Code, Section 101. Copyright protection is not available for this work in the United States.
NPS Report Number
NPS-CS-22-003
Related items
Showing items related by title, author, creator and subject.
- A formal application of safety and risk assessment in software systems
  Williamson, Christopher Loyal (Monterey, California: Naval Postgraduate School, 2004-09); The current state of the art techniques of Software Engineering lack a formal method and metric for measuring the safety index of a software system. The lack of such a methodology has resulted in a series of highly publicized ...
- A framework for software reuse in safety-critical system of systems
  Warren, Bradley R. (Monterey, California: Naval Postgraduate School, 2008-03); This thesis concerns the effective and safe software reuse in safety-critical system-of-systems. Software reuse offers many unutilized benefits such as achieving rapid system development, saving resources and time, and ...
- A formal approach to hazard decomposition in Software Fault Tree Analysis
  Needham, Donald Michael (Monterey, California: Naval Postgraduate School, 1990); As digital control systems are used in life-critical applications, assessment of the safety of these control systems becomes increasingly important. One means of formally performing this assessment is through fault tree ...