Robust Kernel Approximation for Classification

@inproceedings{Liu2017RobustKA,
  title={Robust Kernel Approximation for Classification},
  author={Fanghui Liu and X. Huang and Cheng Peng and Jie Yang and Nikola K. Kasabov},
  booktitle={ICONIP},
  year={2017}
}
• Published in ICONIP 14 November 2017
• Computer Science
This paper investigates a robust kernel approximation scheme for support vector machine classification with indefinite kernels. It aims to tackle the issue that the indefinite kernel is contaminated by noises and outliers, i.e., it is a noisy observation of the true positive definite (PD) kernel. Traditional algorithms recover the PD kernel from the observation under a small-Gaussian-noise assumption; however, such approaches are not robust to noises and outliers that do not follow a Gaussian distribution. In…
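As background for the setting above: the "traditional" recovery the abstract refers to is commonly realized by projecting the observed indefinite matrix onto the PSD cone in Frobenius norm, i.e. clipping its negative eigenvalues. A minimal NumPy sketch of that baseline (not the paper's robust scheme; the toy data and the injected corruption are illustrative assumptions):

```python
import numpy as np

def clip_to_psd(K):
    """Project a symmetric indefinite kernel matrix onto the PSD cone
    by clipping negative eigenvalues to zero (the nearest PSD matrix
    in Frobenius norm)."""
    w, V = np.linalg.eigh(K)          # eigendecomposition of symmetric K
    w_clipped = np.maximum(w, 0.0)    # drop the negative spectrum
    return (V * w_clipped) @ V.T

# Toy "observed" kernel: a PD (RBF) kernel plus one sparse corruption.
rng = np.random.default_rng(0)
X = rng.standard_normal((6, 2))
K_true = np.exp(-np.sum((X[:, None] - X[None]) ** 2, axis=-1))
K_obs = K_true.copy()
K_obs[0, 3] += 2.0
K_obs[3, 0] += 2.0                    # symmetric outlier entry

K_hat = clip_to_psd(K_obs)
print(np.linalg.eigvalsh(K_hat).min() >= -1e-8)  # True: K_hat is PSD
```

Because the Frobenius projection treats every entry as equally noisy, a single large outlier entry perturbs the whole recovered matrix — which is exactly the non-robustness the abstract points out.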

References


A Generalized Kernel Approach to Dissimilarity-based Classification

• Computer Science
J. Mach. Learn. Res.
• 2001
It is shown that more global classification techniques are preferable to the nearest neighbor rule when the dissimilarities used in practice are far from ideal, since the performance of the nearest neighbor rule suffers from its sensitivity to noisy examples.

Learning with non-positive kernels

• Computer Science, Mathematics
ICML
• 2004
A general representer theorem for constrained stabilization is shown and generalization bounds are proved by computing the Rademacher averages of the kernel class.

Classification With Truncated $\ell _{1}$ Distance Kernel

• Computer Science
IEEE Transactions on Neural Networks and Learning Systems
• 2018
In numerical experiments, the TL1 kernel with a pre-given parameter achieves similar or better performance than the radial basis function kernel with its parameter tuned by cross validation, implying the TL1 kernel is a promising nonlinear kernel for classification tasks.
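For context, the TL1 kernel is defined as K(x, y) = max(ρ − ‖x − y‖₁, 0), where ρ is the pre-given truncation parameter; it is indefinite in general, which is why it appears in this reference list. A small NumPy sketch (the toy inputs are illustrative):

```python
import numpy as np

def tl1_kernel(X, Y, rho):
    """Truncated l1 distance kernel: K(x, y) = max(rho - ||x - y||_1, 0)."""
    d = np.abs(X[:, None, :] - Y[None, :, :]).sum(axis=-1)  # pairwise l1 distances
    return np.maximum(rho - d, 0.0)

X = np.array([[0.0, 0.0],
              [1.0, 0.5]])
print(tl1_kernel(X, X, rho=2.0))
# diagonal entries equal rho; off-diagonal = max(2.0 - 1.5, 0) = 0.5
```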

Analysis of SVM with Indefinite Kernels

• Computer Science
NIPS
• 2009
It is shown that the objective function is continuously differentiable with an explicitly computable, Lipschitz-continuous gradient, which greatly facilitates the application of gradient-based algorithms.

Support vector machine classification with indefinite kernels

• Computer Science
Math. Program. Comput.
• 2007
This work proposes a method for support vector machine classification using indefinite kernels that simultaneously computes support vectors and a proxy kernel matrix used in forming the loss.

Solving Indefinite Kernel Support Vector Machine with Difference of Convex Functions Programming

• Computer Science
AAAI
• 2017
A novel algorithm termed IKSVM-DC reformulates the primal problem as a difference of convex functions (DC) program that can be optimized by the DC algorithm (DCA), and can accelerate the convergence rate.
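To illustrate the generic DCA template the summary relies on: write the objective as f = g − h with g, h convex, linearize h at the current iterate, and minimize the resulting convex surrogate. A toy one-dimensional sketch (this is the textbook DCA recipe on a made-up function, not the IKSVM-DC objective):

```python
def dca_step(x):
    """One DCA iteration for f(x) = (x^2 - 1)^2, split as a difference of
    convex functions g(x) = x^4 + 1 and h(x) = 2x^2.
    Linearizing h at x_k and minimizing the convex surrogate gives
        x_{k+1} = argmin_x  x^4 - h'(x_k) * x  =  cbrt(x_k).
    """
    return x ** (1.0 / 3.0) if x >= 0 else -((-x) ** (1.0 / 3.0))

x = 0.5
for _ in range(50):
    x = dca_step(x)
print(round(x, 6))  # converges toward x = 1, a global minimizer of f
```

Each surrogate minimization decreases f, so the iterates converge to a critical point — here the global minimizer x = 1.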

Learning SVM in Kreĭn Spaces

• Computer Science
IEEE Transactions on Pattern Analysis and Machine Intelligence
• 2016
This paper justifies and evaluates a solution that uses the original (indefinite) similarity measure, in the original Kreĭn space, and establishes the correspondence between the stabilization problem and a classical SVM based on minimization (easy to solve).
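A common concrete device for keeping the Kreĭn-space geometry of an indefinite Gram matrix is the spectrum "flip", which replaces each eigenvalue by its absolute value. A sketch of that generic preprocessing step (a simplification for illustration, not the paper's full stabilization procedure):

```python
import numpy as np

def flip_spectrum(K):
    """'Flip' an indefinite Gram matrix: replace each eigenvalue by its
    absolute value, yielding a PSD matrix that keeps the magnitude of
    the negative part of the spectrum instead of discarding it."""
    w, V = np.linalg.eigh(K)
    return (V * np.abs(w)) @ V.T

K = np.array([[0.0, 1.0],
              [1.0, 0.0]])      # eigenvalues +1 and -1
print(flip_spectrum(K))         # flipping both to +1 gives the identity
```

Unlike the clipping projection, flipping preserves the information carried by the negative eigenvalues, which is the motivation for working in the Kreĭn space rather than forcing positive definiteness.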

Incremental Robust Nonnegative Matrix Factorization for Object Tracking

• Computer Science
ICONIP
• 2016
Robust NMF with an L2,1-norm loss function is introduced into appearance modelling in visual tracking, and multiplicative update rules in incremental learning for robust NMF are proposed for model update to strengthen its practicality in visual tracking.

A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems

• Computer Science, Mathematics
SIAM J. Imaging Sci.
• 2009
A new fast iterative shrinkage-thresholding algorithm (FISTA) is presented that preserves the computational simplicity of ISTA but attains a global rate of convergence that is proven to be significantly better, both theoretically and practically.
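FISTA augments ISTA's proximal-gradient step with a Nesterov-style momentum sequence t_k, improving the worst-case rate from O(1/k) to O(1/k²). A compact sketch for the lasso problem min 0.5‖Ax − b‖² + λ‖x‖₁ (the toy problem, λ, and iteration count are illustrative assumptions):

```python
import numpy as np

def fista_lasso(A, b, lam, L, n_iter=200):
    """FISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    L is a Lipschitz constant of the smooth part's gradient, e.g. ||A||_2^2."""
    x = y = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)
        z = y - grad / L
        x_new = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)              # momentum step
        x, t = x_new, t_new
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((20, 5))
x_true = np.array([1.0, 0.0, -2.0, 0.0, 0.0])
b = A @ x_true
L = np.linalg.norm(A, 2) ** 2          # spectral norm squared
x_hat = fista_lasso(A, b, lam=0.01, L=L)
print(np.allclose(x_hat, x_true, atol=0.05))
```

The momentum point y (not the last iterate x) is where the gradient is evaluated; that extrapolation is the entire difference between ISTA and FISTA.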