Robustness: a better measure of algorithm performance

Authors
Musselman, Roger D.
Advisors
Sanchez, Paul J.
Date of Issue
2007-09
Publisher
Monterey, California. Naval Postgraduate School
Abstract
Algorithms are an essential part of Operations Research (OR) methodology, so the efficiency of those algorithms must be a consideration. However, traditional approaches to assessing algorithm efficiency do not always capture the real-world trade-offs involved. This thesis explored the use of a new measure of algorithm efficiency, robustness, and contrasted it with traditional "big-O" analysis, using sorting algorithms to illustrate the trade-offs. Dr. Genichi Taguchi's robust design techniques allowed us to account for the impact of factors that would be uncontrollable in the real world by measuring how those factors affect the consistency of the results. These factors, which big-O analysis treats separately, are incorporated as an integral part of robust analysis. The hypothesis was that robustness is potentially a more useful description of algorithm performance than the more traditional big-O analysis, and the experimental results supported this hypothesis. Where big-O analysis considers only average performance, robustness integrates average performance with consistency of performance. Most importantly, the robust analysis we performed yielded results consistent with actual usage: practitioners prefer quicksort over heap sort, despite the fact that heap sort dominates quicksort under big-O analysis.
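
The thesis's own experimental design is described in its later chapters; the following Python sketch only illustrates the general idea, under assumptions of our own. It times a simple quicksort and a heap-based sort across several "noise" input orderings (conditions an algorithm cannot control in practice) and scores each algorithm with Taguchi's standard smaller-the-better signal-to-noise ratio, S/N = -10 log10(mean(y^2)), which rewards runtimes that are both small and consistent. The sort implementations, problem size, and input conditions here are illustrative choices, not the thesis's actual experiment.

    import heapq
    import math
    import random
    import statistics
    import time

    def quicksort(a):
        """Out-of-place quicksort with a middle-element pivot (illustrative)."""
        if len(a) <= 1:
            return list(a)
        pivot = a[len(a) // 2]
        less = [x for x in a if x < pivot]
        equal = [x for x in a if x == pivot]
        greater = [x for x in a if x > pivot]
        return quicksort(less) + equal + quicksort(greater)

    def heapsort(a):
        """Heapsort via Python's binary heap: heapify, then repeated extraction."""
        heap = list(a)
        heapq.heapify(heap)
        return [heapq.heappop(heap) for _ in range(len(heap))]

    def sn_smaller_is_better(times):
        """Taguchi's smaller-the-better S/N ratio: -10 * log10(mean(y^2)).
        Higher values indicate runtimes that are both small and consistent."""
        return -10.0 * math.log10(sum(t * t for t in times) / len(times))

    def time_sort(sort_fn, data, repeats=5):
        """Wall-clock each call; repeats smooth out timer jitter."""
        times = []
        for _ in range(repeats):
            start = time.perf_counter()
            sort_fn(data)
            times.append(time.perf_counter() - start)
        return times

    # "Noise" conditions: input orderings outside the algorithm's control.
    n = 2000
    random.seed(42)
    conditions = {
        "random": [random.random() for _ in range(n)],
        "sorted": list(range(n)),
        "reversed": list(range(n, 0, -1)),
        "duplicates": [random.randrange(10) for _ in range(n)],
    }

    for name, fn in [("quicksort", quicksort), ("heap sort", heapsort)]:
        runs = []
        for data in conditions.values():
            runs.extend(time_sort(fn, data))
        print(f"{name:9s}  mean={statistics.mean(runs):.6f}s  "
              f"stdev={statistics.stdev(runs):.6f}s  "
              f"S/N={sn_smaller_is_better(runs):.1f} dB")

In this framing, the input orderings play the role of Taguchi's noise factors: an algorithm whose runtime varies little across them earns a higher S/N ratio even if its average runtime is similar, which is the sense in which robust analysis integrates average performance with consistency.
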
Type
Thesis
Organization
Naval Postgraduate School (U.S.)
Format
xvi, 57 p.
Distribution Statement
Approved for public release; distribution is unlimited.