NEURAL NETWORK DISTRIBUTIONAL INITIAL CONDITION ROBUSTNESS IN POWER SYSTEMS

Authors
Smith, Philip B.
Advisors
Kang, Wei
Martinsen, Thor
Second Readers
Gannon, Anthony J.
Subjects
neural network
robustness
microgrid
feedforward
Date of Issue
2022-06
Publisher
Monterey, CA; Naval Postgraduate School
Abstract
How can we measure and classify neural network robustness across differently distributed data to avoid misuse of machine learning tools? This thesis adopts several metrics for measuring the initial-condition robustness of feedforward neural networks, allowing the creators of such networks to measure and refine their robustness and performance. Highly robust networks could then be used reliably on data distributions they were not trained on, while less robust networks could be kept from being applied as black boxes in environments for which they are unsuited. We test these robustness measurements on a series of differently sized neural networks trained to detect and classify microgrid power system faults, giving examples of both robust and nonrobust networks along with suggestions for maximizing robustness. The analysis reveals that collecting data from segments along trajectories enhances the robustness of neural networks: in such data sets, the distribution of data points is dominated by the dynamics of the system rather than by the distribution of initial states.
Type
Thesis
Department
Applied Mathematics (MA)
Distribution Statement
Approved for public release. Distribution is unlimited.
Rights
This publication is a work of the U.S. Government as defined in Title 17, United States Code, Section 101. Copyright protection is not available for this work in the United States.