Automatic target recognition statistical feature selection of non-Gaussian distributed target classes

Authors
Wilder, Matthew J.
Advisors
Clark, Grace A.
Date of Issue
2011-06
Publisher
Monterey, CA; Naval Postgraduate School
Abstract
Target and pattern recognition systems are in widespread use, and efforts have been made in all areas of pattern recognition to improve their performance. Feature extraction, feature selection, and classification are the major components of a target recognition system. This research proposes algorithms for selecting useful statistical features in pattern/target classification problems in which the features are non-Gaussian distributed. In engineering practice, it is common either to perform no feature selection at all or to use a feature selection algorithm that assumes the features are Gaussian distributed. The results can be far from optimal when the features are non-Gaussian distributed, as they often are. This research aims to mitigate that problem by creating algorithms that are useful in practice. This work focuses on the performance of three common feature selection algorithms: Branch and Bound, Sequential Forward Selection, and Exhaustive Search. Ordinarily, the performance index used to measure class separation in feature space is derived by assuming the data are Gaussian, yielding tractable indices that can be calculated without estimating the probability density functions of the class data. The advantage of this approach is that it produces feature selection algorithms that have low computational complexity and do not require knowledge of the data densities. The disadvantage is that these algorithms may perform poorly when the data are non-Gaussian. This research examines the use of information-theoretic class separability measures that can handle the non-Gaussian case. In particular, this work shows that the Hellinger distance (a type of divergence) has very desirable mathematical properties and can be useful for feature selection when paired with a suitable density estimator; for this research, that estimator is the multivariate kernel density estimator.
In selecting the best subset of non-Gaussian distributed features, results show that the Hellinger distance outperformed the other class separability measures in several instances highlighted in this report.
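The criterion the abstract describes, scoring class separation with the Hellinger distance between kernel-density-estimated class densities, can be sketched in a few lines. The following Python example is illustrative only: the exponential sample distributions, the bandwidth, and the grid are assumptions for demonstration, not settings from the thesis, and it uses a simple one-dimensional Gaussian KDE rather than the multivariate estimator used in the research.

```python
import numpy as np

def gaussian_kde_1d(samples, grid, bandwidth):
    """Evaluate a simple Gaussian kernel density estimate on a grid of points."""
    diffs = (grid[:, None] - samples[None, :]) / bandwidth
    kernels = np.exp(-0.5 * diffs**2) / np.sqrt(2.0 * np.pi)
    return kernels.sum(axis=1) / (len(samples) * bandwidth)

def hellinger_distance(p, q, dx):
    """Hellinger distance H = sqrt(0.5 * integral (sqrt(p) - sqrt(q))^2 dx),
    approximated with a Riemann sum on a uniform grid."""
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q))**2) * dx)

rng = np.random.default_rng(0)
# Two hypothetical non-Gaussian (exponential) class-conditional feature samples
class_a = rng.exponential(scale=1.0, size=500)
class_b = rng.exponential(scale=1.0, size=500) + 2.0  # shifted copy of class A

grid = np.linspace(-1.0, 12.0, 2000)
dx = grid[1] - grid[0]
p = gaussian_kde_1d(class_a, grid, bandwidth=0.3)
q = gaussian_kde_1d(class_b, grid, bandwidth=0.3)

h = hellinger_distance(p, q, dx)
print(round(h, 3))  # lies in [0, 1]; larger values indicate better class separation
```

A search procedure such as Sequential Forward Selection would evaluate this distance for each candidate feature subset and keep the subset that maximizes it; because the Hellinger distance is bounded in [0, 1] and requires only density estimates, it applies unchanged to non-Gaussian features.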
Type
Thesis
Organization
Naval Postgraduate School (U.S.)
Format
xxii, 125 p. : ill. ;
Distribution Statement
Approved for public release; distribution is unlimited.
Rights
This publication is a work of the U.S. Government as defined in Title 17, United States Code, Section 101. Copyright protection is not available for this work in the United States.