Robust Kernel Approximation for Classification
@inproceedings{Liu2017RobustKA,
  title     = {Robust Kernel Approximation for Classification},
  author    = {Fanghui Liu and X. Huang and Cheng Peng and Jie Yang and Nikola K. Kasabov},
  booktitle = {International Conference on Neural Information Processing},
  year      = {2017}
}
This paper investigates a robust kernel approximation scheme for support vector machine classification with indefinite kernels. It aims to tackle the setting in which the indefinite kernel is contaminated by noise and outliers, i.e., it is a noisy observation of the true positive definite (PD) kernel. Traditional algorithms recover the PD kernel from the observation under a small-Gaussian-noise assumption; however, this is not robust to noise and outliers that do not follow a Gaussian distribution. In…
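The decomposition view in the abstract (observed indefinite kernel = true PD kernel + non-Gaussian noise/outliers) suggests a robust-PCA-style recovery. Below is a minimal Python sketch of that idea, not the paper's actual algorithm: it alternates a soft-thresholding step for a sparse noise term E with a PSD projection for the kernel term K, under the assumed model K0 ≈ K + E with an ℓ1 penalty on E. The function names and the penalty parameter tau are illustrative.

```python
import numpy as np

def psd_project(M):
    """Project a symmetric matrix onto the PSD cone (clip negative eigenvalues)."""
    w, V = np.linalg.eigh((M + M.T) / 2)
    return (V * np.clip(w, 0.0, None)) @ V.T

def soft_threshold(M, tau):
    """Elementwise soft-thresholding: the proximal operator of tau * ||.||_1."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def robust_kernel_recovery(K0, tau=0.1, n_iter=100):
    """Alternating minimization of
        min_{K psd, E}  0.5 * ||K0 - K - E||_F^2 + tau * ||E||_1,
    splitting the observed indefinite kernel K0 into a PSD part K and a
    sparse outlier part E. Each step is an exact minimizer given the
    other variable; tau controls how much noise is absorbed by E."""
    K = psd_project(K0)
    E = np.zeros_like(K0)
    for _ in range(n_iter):
        E = soft_threshold(K0 - K, tau)   # sparse noise / outlier estimate
        K = psd_project(K0 - E)           # PSD kernel estimate
    return K, E
```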
References
A Generalized Kernel Approach to Dissimilarity-based Classification
- Computer Science, J. Mach. Learn. Res.
- 2001
It is shown that more global classification techniques are preferable to the nearest-neighbor rule when the dissimilarities used in practice are far from ideal, since the nearest-neighbor rule suffers from its sensitivity to noisy examples.
Learning with non-positive kernels
- Computer Science, Mathematics, ICML
- 2004
A general representer theorem for constrained stabilization is shown and generalization bounds are proved by computing the Rademacher averages of the kernel class.
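The flavor of the representer theorem referenced above can be sketched in one display (a paraphrase under assumptions, not the paper's exact statement): in a reproducing kernel Kreĭn space $\mathcal{K}$, stabilizers of the regularized empirical risk (saddle points rather than minimizers, since $\langle f, f\rangle_{\mathcal{K}}$ may be negative) still admit a finite kernel expansion over the training sample.

```latex
% Sketch (paraphrase) of a Krein-space representer theorem.
f^\star \in \operatorname*{stab}_{f \in \mathcal{K}}
  \left[ \sum_{i=1}^{n} \ell\bigl(y_i, f(x_i)\bigr)
         + \lambda \langle f, f \rangle_{\mathcal{K}} \right]
\;\Longrightarrow\;
f^\star(\cdot) = \sum_{i=1}^{n} c_i\, k(x_i, \cdot),
\qquad c_i \in \mathbb{R}.
```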
Indefinite kernels in least squares support vector machines and principal component analysis
- Computer Science
- 2017
Classification With Truncated $\ell_1$ Distance Kernel
- Computer Science, IEEE Transactions on Neural Networks and Learning Systems
- 2018
In numerical experiments, the TL1 kernel with a pre-given parameter achieves similar or better performance than the radial basis function kernel with its parameter tuned by cross-validation, implying that the TL1 kernel is a promising nonlinear kernel for classification tasks.
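For context, the TL1 kernel named above has a simple closed form, k(x, y) = max(ρ − ‖x − y‖₁, 0), and the sketch below is a direct NumPy implementation of that formula. The pre-given choice ρ = 0.7·d (d the input dimension) follows the recommendation reported for this kernel, but treat the exact value as illustrative. Note the TL1 kernel is in general indefinite, which is exactly why it is relevant to this paper.

```python
import numpy as np

def tl1_kernel(X, Y, rho):
    """Truncated l1 distance (TL1) kernel: k(x, y) = max(rho - ||x - y||_1, 0).
    X: (n, d), Y: (m, d); returns the (n, m) Gram matrix."""
    dists = np.abs(X[:, None, :] - Y[None, :, :]).sum(axis=2)  # pairwise l1 distances
    return np.maximum(rho - dists, 0.0)

# Usage sketch with a precomputed-kernel SVM (illustrative):
# from sklearn.svm import SVC
# K = tl1_kernel(X_train, X_train, rho=0.7 * X_train.shape[1])
# clf = SVC(kernel="precomputed").fit(K, y_train)
```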
Analysis of SVM with Indefinite Kernels
- Computer Science, NIPS
- 2009
It is shown that the objective function is continuously differentiable, with a gradient that can be explicitly computed and is Lipschitz continuous, which greatly facilitates the application of gradient-based algorithms.
Learning SVM in Kreĭn Spaces
- Computer Science, IEEE Transactions on Pattern Analysis and Machine Intelligence
- 2016
This paper justifies and evaluates a solution that uses the original (indefinite) similarity measure in the original Kreĭn space, and establishes a correspondence between the stabilization problem and a classical SVM based on minimization, which is easy to solve.
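One standard way to make such an indefinite Gram matrix digestible by a conventional SVM solver is an eigenspectrum modification. The sketch below shows only the "flip" variant (replace eigenvalues with their absolute values); it is a simplification of, not a substitute for, the full Kreĭn-space stabilization of the cited paper, which additionally transports the learned solution back to the original space.

```python
import numpy as np

def flip_spectrum(K):
    """Replace the eigenvalues of an indefinite Gram matrix with their
    absolute values ("flip"), yielding a PSD matrix that a standard
    SVM solver can consume."""
    w, V = np.linalg.eigh((K + K.T) / 2)
    return (V * np.abs(w)) @ V.T          # V diag(|w|) V^T

# Usage sketch (illustrative):
# from sklearn.svm import SVC
# clf = SVC(kernel="precomputed").fit(flip_spectrum(K_train), y_train)
```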
Incremental Robust Nonnegative Matrix Factorization for Object Tracking
- Computer Science, ICONIP
- 2016
Robust NMF with an L2,1-norm loss function is introduced into appearance modelling for visual tracking, and multiplicative update rules for incremental learning of robust NMF are proposed for model updating, strengthening its practicality in visual tracking.
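As a hedged illustration of the L2,1-norm loss mentioned above: replacing the Frobenius loss with a column-wise L2,1 norm downweights outlier columns through a diagonal reweighting inside the multiplicative updates. The sketch below follows the generic L2,1 robust-NMF update style and is not the cited paper's incremental algorithm; initialization and iteration counts are illustrative.

```python
import numpy as np

def robust_nmf_l21(X, k, n_iter=200, eps=1e-10):
    """Multiplicative updates for NMF with a column-wise L2,1-norm loss,
        min_{F, G >= 0}  sum_j ||x_j - F g_j||_2,
    which is less sensitive to outlier columns than the Frobenius loss.
    X: (p, n) nonnegative data; returns F (p, k) and G (k, n)."""
    rng = np.random.default_rng(0)
    F = rng.random((X.shape[0], k)) + eps
    G = rng.random((k, X.shape[1])) + eps
    for _ in range(n_iter):
        # diagonal reweighting: d_j = 1 / ||x_j - F g_j||_2
        d = 1.0 / np.maximum(np.linalg.norm(X - F @ G, axis=0), eps)
        XD, GD = X * d, G * d                         # right-multiply by diag(d)
        F *= (XD @ G.T) / np.maximum(F @ (GD @ G.T), eps)
        G *= (F.T @ XD) / np.maximum((F.T @ F) @ GD, eps)
    return F, G
```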
A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Computer Science, Mathematics, SIAM J. Imaging Sci.
- 2009
A new fast iterative shrinkage-thresholding algorithm (FISTA) is presented that preserves the computational simplicity of ISTA but attains a global rate of convergence proven to be significantly better, both theoretically and practically.
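FISTA itself is compact enough to sketch. The example below solves the LASSO instance min_x 0.5‖Ax − b‖² + λ‖x‖₁: an ISTA proximal-gradient step with step size 1/L plus Nesterov-style momentum, which is what yields the improved O(1/k²) rate. The problem choice and parameters here are illustrative.

```python
import numpy as np

def fista_lasso(A, b, lam, n_iter=500):
    """FISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(n_iter):
        g = A.T @ (A @ y - b)              # gradient of the smooth part at y
        z = y - g / L                      # gradient step
        x_new = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # prox: soft-threshold
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x)  # momentum extrapolation
        x, t = x_new, t_new
    return x
```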
Classification on Pairwise Proximity Data
- Computer Science, NIPS
- 1998
We investigate the problem of learning a classification task on data represented in terms of their pairwise proximities. This representation does not refer to an explicit feature representation of…
Optimal Cluster Preserving Embedding of Nonmetric Proximity Data
- Computer Science, IEEE Trans. Pattern Anal. Mach. Intell.
- 2003
This paper shows that all clustering methods that are invariant under additive shifts of the pairwise proximities can be reformulated as grouping problems in Euclidean spaces, with complete preservation of the cluster structure in the embedding space.
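The shift-invariance result lends itself to a short sketch: center the (squared) dissimilarity matrix, add the smallest constant that makes the resulting similarity matrix PSD, then read Euclidean coordinates off its eigendecomposition. This is a minimal rendition of the constant-shift idea, not the paper's full construction.

```python
import numpy as np

def constant_shift_embedding(D):
    """Embed symmetric, zero-diagonal pairwise dissimilarities D (n, n)
    into Euclidean coordinates: center, shift the spectrum to PSD, factor.
    Off-diagonal dissimilarities change only by a constant, so
    shift-invariant clustering results are preserved."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    Sc = -0.5 * J @ D @ J                        # centered similarity matrix
    w, V = np.linalg.eigh((Sc + Sc.T) / 2)
    Sc_psd = Sc - min(w.min(), 0.0) * np.eye(n)  # minimal constant shift to PSD
    w2, V2 = np.linalg.eigh(Sc_psd)
    return V2 * np.sqrt(np.clip(w2, 0.0, None))  # rows are embedded points
```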