# Robust Kernel Approximation for Classification

@inproceedings{Liu2017RobustKA,
title={Robust Kernel Approximation for Classification},
author={Fanghui Liu and X. Huang and Cheng Peng and Jie Yang and Nikola K. Kasabov},
booktitle={International Conference on Neural Information Processing},
year={2017}
}
• Published in International Conference on Neural Information Processing, 14 November 2017 · Computer Science
This paper investigates a robust kernel approximation scheme for support vector machine classification with indefinite kernels. It aims to tackle the issue that the indefinite kernel is contaminated by noise and outliers, i.e., it is a noisy observation of the true positive definite (PD) kernel. Traditional algorithms recover the PD kernel from the observation under an assumption of small Gaussian noise; however, this approach is not robust to noise and outliers that do not follow a Gaussian distribution. In…
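A common baseline for recovering a PD kernel from a noisy indefinite one is projecting the observed matrix onto the PSD cone by clipping negative eigenvalues. The sketch below illustrates that baseline only; it is not the robust scheme proposed in the paper, and the toy data and tolerances are illustrative assumptions.

```python
import numpy as np

def psd_projection(K):
    """Project a symmetric (possibly indefinite) kernel matrix onto the
    PSD cone by clipping negative eigenvalues to zero (spectrum clip)."""
    K_sym = (K + K.T) / 2.0           # enforce symmetry first
    w, V = np.linalg.eigh(K_sym)      # real eigendecomposition
    w_clipped = np.clip(w, 0.0, None) # drop negative spectrum
    return (V * w_clipped) @ V.T

# Toy example: a PD kernel corrupted by symmetric noise
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 5))
K_true = X @ X.T                          # PSD by construction
noise = rng.standard_normal((20, 20))
K_noisy = K_true + 0.5 * (noise + noise.T)  # indefinite observation

K_hat = psd_projection(K_noisy)
```

Because the clip minimizes Frobenius distance to the PSD cone, it is optimal under small Gaussian noise but, as the abstract notes, not robust to heavy-tailed noise or outliers.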

## References

Showing 1–10 of 18 references.

• J. Mach. Learn. Res., 2001 (Computer Science)
It is shown that other, more global classification techniques are preferable to the nearest neighbor rule in cases where the dissimilarities used in practice are far from ideal, since the performance of the nearest neighbor rule suffers from its sensitivity to noisy examples.
• ICML, 2004 (Computer Science, Mathematics)
A general representer theorem for constrained stabilization is shown, and generalization bounds are proved by computing the Rademacher averages of the kernel class.
• IEEE Transactions on Neural Networks and Learning Systems, 2018 (Computer Science)
In numerical experiments, the TL1 kernel with a pre-given parameter achieves similar or better performance than the radial basis function kernel with the parameter tuned by cross-validation, implying that the TL1 kernel is a promising nonlinear kernel for classification tasks.
• NIPS, 2009 (Computer Science)
It is shown that the objective function is continuously differentiable and its gradient can be explicitly computed, and that its gradient is Lipschitz continuous, which greatly facilitates the application of gradient-based algorithms.
• IEEE Transactions on Pattern Analysis and Machine Intelligence, 2016 (Computer Science)
This paper justifies and evaluates a solution that uses the original (indefinite) similarity measure, in the original Kreĭn space, and establishes the correspondence between the stabilization problem and a classical SVM based on minimization (easy to solve).
• ICONIP, 2016 (Computer Science)
Robust NMF with an L2,1-norm loss function is introduced into appearance modelling for visual tracking, and multiplicative update rules for incremental learning of robust NMF are proposed for model updating, strengthening its practicality in visual tracking.
• SIAM J. Imaging Sci., 2009 (Computer Science, Mathematics)
A new fast iterative shrinkage-thresholding algorithm (FISTA) is proposed which preserves the computational simplicity of ISTA but has a global rate of convergence proven to be significantly better, both theoretically and practically.
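The FISTA scheme referenced above can be sketched in a few lines for the standard l1-regularized least-squares problem. This is a minimal illustrative implementation, not code from the cited paper; the step size uses the Lipschitz constant of the smooth gradient, and the problem sizes are arbitrary assumptions.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1 (component-wise shrinkage)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fista_lasso(A, b, lam, n_iter=200):
    """FISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    y = x.copy()                          # extrapolated point
    t = 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)
        x_new = soft_threshold(y - grad / L, lam / L)   # proximal gradient step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # momentum extrapolation
        x, t = x_new, t_new
    return x

# Usage: recover a sparse vector from linear measurements
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
x_hat = fista_lasso(A, b, lam=0.1)
```

The momentum extrapolation on `y` is what lifts the convergence rate from ISTA's O(1/k) to FISTA's O(1/k²).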
• NIPS, 1998 (Computer Science)
We investigate the problem of learning a classification task on data represented in terms of their pairwise proximities. This representation does not refer to an explicit feature representation of…
• IEEE Trans. Pattern Anal. Mach. Intell., 2003 (Computer Science)
This paper shows that all clustering methods which are invariant under additive shifts of the pairwise proximities can be reformulated as grouping problems in Euclidean spaces, with complete preservation of the cluster structure in the embedding space.