Sample entropy and random forests: a methodology for anomaly-based intrusion detection and classification of low-bandwidth malware attacks
Hyla, Bret M.
Sample entropy examines changes in the distribution of network traffic to identify anomalies. Normalized information measures the entropy of a data set relative to its maximum possible entropy. Random Forests is a supervised learning algorithm that is efficient at classifying highly imbalanced data, and anomalies are exceedingly rare compared to the overall volume of network traffic. The combination of these methods enables low-bandwidth anomalies to be identified easily in high-bandwidth network traffic, and using only low-dimensional network information allows for near real-time identification. The data was drawn from the 1999 DARPA intrusion detection evaluation data set. The experiments compare a baseline f-score to one obtained from the observed entropy and normalized information of the network, and detected anomalies that are disguised in network flow analysis. Random Forests proved capable of classifying anomalies using the sample entropy and normalized information. Our experiment divided the data set into five-minute time slices and found that the sample entropy and normalized information metrics were successful in classifying bad traffic with a recall of .99 and an f-score of .50, which was 185% better than our baseline.
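As a rough illustration of the per-slice metrics the abstract describes, the sketch below computes Shannon (sample) entropy and normalized information for one hypothetical five-minute slice of destination ports; the port values and slice contents are invented for illustration, not taken from the thesis or the DARPA data.

```python
import math
from collections import Counter

def sample_entropy(values):
    """Shannon entropy (bits) of the empirical distribution of `values`."""
    counts = Counter(values)
    total = len(values)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def normalized_information(values):
    """Entropy divided by its maximum (log2 of the number of distinct
    symbols), so the result lies in [0, 1] regardless of alphabet size."""
    k = len(set(values))
    if k <= 1:
        return 0.0
    return sample_entropy(values) / math.log2(k)

# Hypothetical slice: traffic dominated by port 80, plus a handful of
# low-bandwidth probes to unusual ports.
ports = [80] * 95 + [21, 22, 23, 25, 6667]
print(round(sample_entropy(ports), 3))         # 0.402
print(round(normalized_information(ports), 3))  # 0.156
```

In the methodology described above, feature vectors like these, computed per five-minute slice, would then be fed to a Random Forests classifier (e.g., scikit-learn's `RandomForestClassifier`) to label slices as normal or anomalous.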
Showing items related by title, author, creator and subject.
Laws, Michael J.; Bunder, Greg T. (Monterey, CA; Naval Postgraduate School, 2020-06);Navy watchstanders are ill-equipped to monitor network status in real-time, to include an inability to identify network anomalies and potential risks on the fly. This leads to a lack of situational awareness and ultimately ...
Wang, Beng Wei (Monterey, California. Naval Postgraduate School, 2007-03);Wireless sensor networks have been widely researched for use in both military and commercial applications. They are especially of interest to the military planners as they can be deployed in hostile environments to collect ...
Bollmann, Chad A.; Tummala, Murali; McEachen, John C. (Elsevier, 2021);This work describes a novel application of robust estimation to the detection of volumetric anomalies in computer network traffic. The proposed tests are based on sample location and dispersion and derived from relatively ...