Kernel Maximum a Posteriori Classification with Error Bound Analysis

Abstract

Kernel methods have been widely used in data classification. Many kernel-based classifiers, such as Kernel Support Vector Machines (KSVM), assume that data can be separated by a hyperplane in the feature space, but they do not take the data distribution into account. This paper proposes a novel Kernel Maximum A Posteriori (KMAP) classification method, which imposes a Gaussian density assumption on the data in the feature space and can be regarded as a more general classification method than other kernel-based classifiers such as Kernel Fisher Discriminant Analysis (KFDA). We also adopt robust methods for parameter estimation. In addition, an error bound analysis for KMAP indicates the effectiveness of the Gaussian density assumption in the feature space. Furthermore, KMAP achieves very promising results on eight UCI benchmark data sets compared with competitive methods.
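The core idea the abstract describes, classifying by maximum a posteriori probability under class-conditional Gaussian densities, can be illustrated with a minimal sketch. The code below is not the paper's KMAP algorithm (which fits the Gaussians in the kernel-induced feature space with robust parameter estimation); it is a simplified input-space MAP Gaussian classifier, with a basic diagonal regularizer standing in for the paper's robust estimation. All class and variable names here are illustrative.

```python
import numpy as np

class GaussianMAPClassifier:
    """Sketch of MAP classification with per-class Gaussian densities.

    Decision rule: predict argmax_c [ log N(x | mu_c, Sigma_c) + log P(c) ].
    KMAP applies this modeling assumption in the kernel feature space;
    here we stay in input space for simplicity.
    """

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.params_ = {}
        n = len(y)
        for c in self.classes_:
            Xc = X[y == c]
            mean = Xc.mean(axis=0)
            # Small diagonal regularization keeps the covariance invertible;
            # the paper uses its own robust parameter estimation instead.
            cov = np.cov(Xc, rowvar=False) + 1e-6 * np.eye(X.shape[1])
            prior = len(Xc) / n
            self.params_[c] = (mean, cov, prior)
        return self

    def _log_posterior(self, x, c):
        mean, cov, prior = self.params_[c]
        diff = x - mean
        # log-density of N(x | mean, cov) up to an additive constant,
        # plus the log prior (Bayes' rule, dropping the shared evidence term)
        _, logdet = np.linalg.slogdet(cov)
        maha = diff @ np.linalg.solve(cov, diff)
        return -0.5 * (logdet + maha) + np.log(prior)

    def predict(self, X):
        return np.array([
            max(self.classes_, key=lambda c: self._log_posterior(x, c))
            for x in X
        ])

# Two well-separated synthetic Gaussian blobs
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)), rng.normal(5.0, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
clf = GaussianMAPClassifier().fit(X, y)
acc = (clf.predict(X) == y).mean()
```

Because each class gets its own covariance, the decision boundary is quadratic rather than a hyperplane, which is exactly the extra flexibility the distribution-based assumption buys over hyperplane-based kernel classifiers.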

DOI: 10.1007/978-3-540-69158-7_87

Cite this paper

@inproceedings{Xu2007KernelMA,
  title     = {Kernel Maximum a Posteriori Classification with Error Bound Analysis},
  author    = {Zenglin Xu and Kaizhu Huang and Jianke Zhu and Irwin King and Michael R. Lyu},
  booktitle = {ICONIP},
  year      = {2007}
}