Robust Kernel Approximation for Classification

Fanghui Liu, X. Huang, Cheng Peng, Jie Yang, Nikola K. Kasabov
This paper investigates a robust kernel approximation scheme for support vector machine classification with indefinite kernels. It tackles the case where the indefinite kernel is contaminated by noise and outliers, i.e., it is a noisy observation of the true positive definite (PD) kernel. Traditional algorithms recover the PD kernel from the observation under a small-Gaussian-noise assumption; however, this approach is not robust to noise and outliers that do not follow a Gaussian distribution. In…
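The Gaussian-noise recovery that the paper contrasts with corresponds to a least-squares (Frobenius-norm) projection of the noisy kernel onto the PSD cone. A minimal numpy sketch of that baseline (spectrum clip), not the paper's robust scheme; the toy matrices are illustrative:

```python
import numpy as np

def clip_to_psd(K):
    """Project a symmetric indefinite kernel matrix onto the PSD cone
    by zeroing its negative eigenvalues. This is the Frobenius-norm
    (least-squares) projection, i.e. the recovery that implicitly
    assumes Gaussian noise."""
    K = (K + K.T) / 2.0                # symmetrize first
    w, V = np.linalg.eigh(K)
    w_clipped = np.clip(w, 0.0, None)  # drop the negative spectrum
    return (V * w_clipped) @ V.T

# toy "indefinite kernel": a PD matrix contaminated by symmetric noise
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
K_true = A @ A.T                          # PD by construction
noise = rng.standard_normal((6, 6))
K_noisy = K_true + 0.5 * (noise + noise.T) / 2.0

K_psd = clip_to_psd(K_noisy)
min_eig = np.linalg.eigvalsh(K_psd).min()  # nonnegative after clipping
```

Because the projection minimizes a squared (Frobenius) distance, a single outlier entry can shift the whole recovered spectrum, which is the failure mode the robust scheme targets.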



A Generalized Kernel Approach to Dissimilarity-based Classification

It is shown that other, more global classification techniques are preferable to the nearest neighbor rule when the dissimilarities used in practice are far from ideal, since the nearest neighbor rule's performance suffers from its sensitivity to noisy examples.

Learning with non-positive kernels

A general representer theorem for constrained stabilization is shown and generalization bounds are proved by computing the Rademacher averages of the kernel class.

Classification With Truncated $\ell _{1}$ Distance Kernel

In numerical experiments, the TL1 kernel with a pregiven parameter achieves similar or better performance than the radial basis function kernel with the parameter tuned by cross validation, implying that the TL1 kernel is a promising nonlinear kernel for classification tasks.
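For reference, the truncated l1 distance kernel is k(x, y) = max(rho - ||x - y||_1, 0) for a pregiven truncation parameter rho. A small numpy sketch (the toy data and rho value are illustrative):

```python
import numpy as np

def tl1_kernel(X, Y, rho):
    """Truncated l1 distance kernel: k(x, y) = max(rho - ||x - y||_1, 0).
    For rho below the data diameter the Gram matrix is generally
    indefinite, which is why this kernel pairs with indefinite-kernel
    SVM formulations."""
    # pairwise l1 distances via broadcasting: (n, 1, d) - (1, m, d)
    D1 = np.abs(X[:, None, :] - Y[None, :, :]).sum(axis=-1)
    return np.maximum(rho - D1, 0.0)

X = np.array([[0.0, 0.0],
              [1.0, 0.0],
              [0.0, 2.0]])
K = tl1_kernel(X, X, rho=1.5)
# diagonal entries equal rho; points farther than rho in l1 distance
# contribute exactly zero, giving the kernel compact support
```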

Analysis of SVM with Indefinite Kernels

It is shown that the objective function is continuously differentiable and its gradient can be explicitly computed, and that its gradient is Lipschitz continuous, which greatly facilitates the application of gradient-based algorithms.

Support vector machine classification with indefinite kernels

This work proposes a method for support vector machine classification using indefinite kernels that simultaneously computes support vectors and a proxy kernel matrix used in forming the loss.

Solving Indefinite Kernel Support Vector Machine with Difference of Convex Functions Programming

A novel algorithm, termed IKSVM-DC, reformulates the primal problem as a difference of convex functions (DC) program that can be optimized by the DC algorithm (DCA) with an accelerated convergence rate.
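To illustrate the DC machinery on a toy problem (this is not IKSVM-DC itself, just an assumed one-dimensional example): write the objective as a difference of two convex functions, linearize the subtracted one at the current iterate, and minimize the resulting convex surrogate.

```python
import numpy as np

# Toy DC decomposition: f(x) = (x^2 - 1)^2 = g(x) - h(x), with
#   g(x) = x^4 + 1   (convex)
#   h(x) = 2 x^2     (convex)
# DCA replaces h by its linearization at x_k and solves
#   min_x  g(x) - h'(x_k) * x  =  min_x  x^4 + 1 - 4 x_k x,
# whose stationarity condition 4 x^3 = 4 x_k gives the closed form
#   x_{k+1} = x_k^(1/3).
def dca_step(x):
    return np.cbrt(x)

x = 0.2                        # any positive start converges to x* = 1
for _ in range(60):
    x = dca_step(x)
f = (x * x - 1.0) ** 2         # objective value at the final iterate
```

Each DCA step solves a convex subproblem, and the objective is nonincreasing along the iterates; here the iteration converges to the global minimizer x* = 1.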

Learning SVM in Kreĭn Spaces

This paper justifies and evaluates a solution that uses the original (indefinite) similarity measure in the original Kreĭn space, and establishes the correspondence between the stabilization problem and a classical SVM based on minimization (which is easy to solve).

Incremental Robust Nonnegative Matrix Factorization for Object Tracking

Robust NMF, i.e., NMF with an L2,1-norm loss function, is introduced into appearance modeling for visual tracking, and multiplicative update rules for incremental learning of robust NMF are proposed for model update to strengthen its practicality in visual tracking.
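The L2,1 loss sums column-wise residual norms rather than squared entries, so outlier columns (e.g. occluded frames) are down-weighted instead of dominating the fit. A batch sketch in the style of the common L2,1-NMF multiplicative updates; the paper's incremental tracking variant is not reproduced here, and the update rules below are the standard batch formulation:

```python
import numpy as np

def l21_objective(X, F, G):
    # ||X - FG||_{2,1}: sum of column-wise residual l2 norms
    return np.linalg.norm(X - F @ G, axis=0).sum()

def robust_nmf_l21(X, k, n_iter=150, eps=1e-9, seed=0):
    """Batch multiplicative updates for  min_{F,G >= 0} ||X - FG||_{2,1}.
    Columns with large residuals receive small weights 1/||x_j - F g_j||,
    which damps the influence of outlier columns."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    F = rng.random((m, k)) + 0.1
    G = rng.random((k, n)) + 0.1
    hist = []
    for _ in range(n_iter):
        hist.append(l21_objective(X, F, G))
        # diagonal reweighting matrix D with D_jj = 1 / ||x_j - F g_j||
        d = 1.0 / (np.linalg.norm(X - F @ G, axis=0) + eps)
        XD, GD = X * d, G * d              # right-multiplication by diag(d)
        F *= (XD @ G.T) / (F @ (GD @ G.T) + eps)
        G *= (F.T @ XD) / (F.T @ F @ GD + eps)
    hist.append(l21_objective(X, F, G))
    return F, G, hist

rng = np.random.default_rng(1)
X = rng.random((20, 30))
F, G, hist = robust_nmf_l21(X, k=3)   # hist tracks the L2,1 objective
```

The multiplicative form keeps F and G entrywise nonnegative without any projection step, which is why it is convenient for online appearance-model updates.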

A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems

A new fast iterative shrinkage-thresholding algorithm (FISTA) which preserves the computational simplicity of ISTA but with a global rate of convergence which is proven to be significantly better, both theoretically and practically.
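A sketch of FISTA applied to l1-regularized least squares (the lasso), the standard showcase for the algorithm; the problem sizes and regularization weight are illustrative:

```python
import numpy as np

def soft_threshold(v, t):
    # proximal operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista_lasso(A, b, lam, n_iter=300):
    """FISTA for  min_x 0.5 ||Ax - b||^2 + lam * ||x||_1.
    Same per-iteration cost as ISTA (one gradient + one shrinkage),
    but the momentum extrapolation improves the worst-case rate
    from O(1/k) to O(1/k^2)."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)
        x_new = soft_threshold(y - grad / L, lam / L)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # momentum step
        x, t = x_new, t_new
    return x

# recover a sparse vector from noiseless measurements
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10))
x_true = np.zeros(10)
x_true[:3] = [2.0, -1.0, 0.5]
b = A @ x_true
x_hat = fista_lasso(A, b, lam=0.1)
```

Dropping the momentum step (setting y = x_new) recovers plain ISTA, so the two are easy to compare on the same problem.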