NEURAL NETWORK MODEL INTERPRETABILITY FOR COMPUTER NETWORK OPERATIONS AND DEFENSE
Authors
Huang, Alexander
Advisors
Barton, Armon C.
McClure, Patrick
Subjects
classification
anomaly detection
deep learning
artificial neural networks
network traffic analysis
computer networks
Date of Issue
2023-12
Publisher
Monterey, CA; Naval Postgraduate School
Abstract
Effective cyberspace defense and incident response on Navy networks are predicated on the ability to quickly identify, characterize, classify, and respond to network events. A vast amount of network data is collected on shore and enterprise networks alike, but the quantity of data hinders rapid analysis and identification of key events of consequence for network defenders. This research uses machine learning methods to aid in automated decision-making and incident response for network administrators and security operators. We build and test deterministic and Bayesian neural network models as classifiers to discriminate benign traffic from traffic that should be blocked by a firewall network security device. Using Bayesian methods for neural networks, we demonstrate ways to capture and visualize uncertainty and confidence metrics that are not attainable from deterministic models. Finally, we propose class expected saliency maps and class expected Hessians as novel approaches to use machine learning to enhance network traffic analysis and better understand how and why models make predictions. This work provides a proof-of-concept for how model uncertainty and interpretability might be considered in the context of network security and defense.
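The "class expected saliency map" idea the abstract describes can be illustrated in miniature: compute the gradient of a target class's predicted probability with respect to the input, then average that gradient over samples from the model's posterior weight distribution. The sketch below is a hedged illustration, not the thesis's actual architecture or data: it uses a toy linear classifier with randomly drawn "posterior" weight samples standing in for draws from a trained Bayesian neural network.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Toy "posterior": weight samples for a linear classifier W x + b.
# In a real Bayesian NN these would come from variational inference or MCMC.
n_features, n_classes, n_samples = 5, 2, 100
W_samples = rng.normal(size=(n_samples, n_classes, n_features))
b_samples = rng.normal(size=(n_samples, n_classes))

def saliency(W, b, x, c):
    """Gradient of class-c probability w.r.t. input x (analytic for this linear model)."""
    p = softmax(W @ x + b)
    # d p_c / d x = p_c * (W_c - sum_k p_k W_k)
    return p[c] * (W[c] - p @ W)

x = rng.normal(size=n_features)       # one network-traffic feature vector (illustrative)
target_class = 1                      # e.g., "block" vs. "allow"

# Class expected saliency: average the per-draw saliency over posterior samples,
# so the attribution reflects the whole posterior rather than one point estimate.
expected_saliency = np.mean(
    [saliency(W, b, x, target_class) for W, b in zip(W_samples, b_samples)],
    axis=0,
)
print(expected_saliency)  # one attribution value per input feature
```

In a deep model the per-sample gradient would be computed by automatic differentiation rather than the closed form used here, but the expectation over posterior draws is the same.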
Type
Thesis
Series/Report No
NPS Outstanding Theses and Dissertations
Department
Computer Science (CS)
Distribution Statement
Approved for public release. Distribution is unlimited.
Rights
This publication is a work of the U.S. Government as defined in Title 17, United States Code, Section 101. Copyright protection is not available for this work in the United States.
