A support vector machine with a hybrid kernel and minimal Vapnik-Chervonenkis dimension

  • Ying Tan, Jun Wang
  • Published 2004 in IEEE Transactions on Knowledge and Data Engineering

Abstract

We present a mechanism to train support vector machines (SVMs) with a hybrid kernel and minimal Vapnik-Chervonenkis (VC) dimension. After describing the VC dimension of sets of separating hyperplanes in a high-dimensional feature space produced by a kernel-induced mapping from the input space, we propose an optimization criterion that designs SVMs by minimizing the upper bound of the VC dimension. This method realizes structural risk minimization and utilizes a flexible kernel function so that superior generalization on test data can be obtained. To obtain a flexible kernel function, we develop a hybrid kernel function and a sufficient condition for it to be an admissible Mercer kernel, based on common Mercer kernels (polynomial, radial basis function, two-layer neural network, etc.). The nonnegative combination coefficients and parameters of the hybrid kernel are determined subject to the minimal upper bound of the VC dimension of the learning machine. The hybrid kernel yields better performance than any single common kernel. Experimental results illustrate the proposed method and show that the SVM with the hybrid kernel outperforms SVMs with a single common kernel in terms of generalization power.
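The abstract does not reproduce the VC-bound optimization itself (for margin hyperplanes, Vapnik's classic bound ties the VC dimension to the radius-margin ratio), but the kernel construction is straightforward to illustrate. Below is a minimal sketch in Python using scikit-learn (an assumption; the paper does not use it), showing a hybrid kernel built as a nonnegative combination of a polynomial and an RBF kernel. Since a nonnegative sum of Mercer kernels is again a Mercer kernel, such a combination is admissible. The weights LAM_POLY and LAM_RBF are hypothetical fixed values for illustration only; in the paper, the combination coefficients and kernel parameters are chosen by minimizing the upper bound of the VC dimension.

import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import polynomial_kernel, rbf_kernel

# Hypothetical nonnegative mixing weights. In the paper these are found
# by minimizing an upper bound on the VC dimension; fixed values are
# used here purely for illustration.
LAM_POLY, LAM_RBF = 0.4, 0.6

def hybrid_kernel(X, Y):
    """Nonnegative combination of two common Mercer kernels.

    A nonnegative sum of Mercer kernels is itself a Mercer kernel,
    so this hybrid kernel is admissible for an SVM.
    """
    return (LAM_POLY * polynomial_kernel(X, Y, degree=3)
            + LAM_RBF * rbf_kernel(X, Y, gamma=0.5))

# Toy data: two Gaussian blobs in 2-D.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# scikit-learn accepts a callable kernel that returns the Gram matrix.
clf = SVC(kernel=hybrid_kernel).fit(X, y)
print("training accuracy:", clf.score(X, y))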

Statistics

[Chart: citations per year, 2005–2017]

203 Citations

Semantic Scholar estimates that this publication has 203 citations based on the available data.

Cite this paper

@article{Tan2004ASV,
  title={A support vector machine with a hybrid kernel and minimal Vapnik-Chervonenkis dimension},
  author={Ying Tan and Jun Wang},
  journal={IEEE Transactions on Knowledge and Data Engineering},
  year={2004},
  volume={16},
  pages={385-395}
}