On hybrid classification using model assisted posterior estimates
Permanent link
https://hdl.handle.net/10037/4948
This is the accepted manuscript version. The published version is available at http://dx.doi.org/10.1016/j.patcog.2011.12.002
Date
2012
Type
Journal article
Peer reviewed
Abstract
Traditional parametric and nonparametric classifiers used in statistical pattern recognition each have their own strengths and limitations. Parametric methods assume specific parametric models for the density functions or posterior probabilities of the competing classes, whereas nonparametric methods are free of such assumptions. When these model assumptions are correct, parametric methods therefore outperform nonparametric classifiers, especially when the training sample is small; but when the assumptions are violated, parametric classifiers often perform poorly in situations where nonparametric methods work well. In this article, we attempt to overcome these limitations and combine the strengths of the parametric and nonparametric approaches. The resulting classifiers, called hybrid classifiers, perform like parametric classifiers when the model assumptions are valid, but, unlike parametric classifiers, they also provide safeguards against possible deviations from those assumptions. We propose several multiscale methods for hybrid classification and evaluate their performance on a number of simulated and benchmark data sets.
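The abstract describes the idea only at a high level. As a rough illustration (not the authors' multiscale method), one simple way to blend a model-assisted posterior estimate with a nonparametric one is to take a convex combination of the two and let the data choose the mixing weight. In the Python sketch below, QDA stands in for the parametric model and k-NN for the nonparametric estimate; the weight grid, the five-fold cross-validation, and n_neighbors=15 are all illustrative assumptions rather than choices made in the paper.

import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import StratifiedKFold

def hybrid_posterior(X_train, y_train, X_test, lam, n_neighbors=15):
    # Convex combination of a parametric (QDA) and a nonparametric (k-NN)
    # estimate of the class posterior probabilities:
    #   p_hybrid = lam * p_parametric + (1 - lam) * p_nonparametric
    qda = QuadraticDiscriminantAnalysis().fit(X_train, y_train)
    knn = KNeighborsClassifier(n_neighbors=n_neighbors).fit(X_train, y_train)
    return lam * qda.predict_proba(X_test) + (1.0 - lam) * knn.predict_proba(X_test)

def choose_weight(X, y, grid=np.linspace(0.0, 1.0, 11), n_splits=5, seed=0):
    # Pick the mixing weight by cross-validated accuracy, so the hybrid
    # relies on the parametric model only when its assumptions appear to hold.
    cv = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=seed)
    classes = np.unique(y)
    best_lam, best_acc = grid[0], -np.inf
    for lam in grid:
        fold_acc = []
        for tr, te in cv.split(X, y):
            post = hybrid_posterior(X[tr], y[tr], X[te], lam)
            pred = classes[np.argmax(post, axis=1)]
            fold_acc.append(np.mean(pred == y[te]))
        if np.mean(fold_acc) > best_acc:
            best_lam, best_acc = lam, np.mean(fold_acc)
    return best_lam

Under this scheme the cross-validated weight tends toward 1 when the parametric model fits, so the hybrid behaves like the parametric classifier, and it shrinks toward 0 under model misspecification, which is the kind of safeguard the abstract refers to.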
Publisher
Elsevier Science
Citation
Pattern Recognition 45 (2012), no. 6, pp. 2288-2298