Robust Kernel Approximation for Classification

@inproceedings{Liu2017RobustKA,
  title={Robust Kernel Approximation for Classification},
  author={Fanghui Liu and X. Huang and Cheng Peng and Jie Yang and Nikola K. Kasabov},
  booktitle={International Conference on Neural Information Processing},
  year={2017}
}
This paper investigates a robust kernel approximation scheme for support vector machine classification with indefinite kernels. It aims to tackle the issue that the indefinite kernel is contaminated by noise and outliers, i.e., it is a noisy observation of the true positive definite (PD) kernel. Traditional algorithms recover the PD kernel from the observation under a small-Gaussian-noise assumption; however, such recovery is not robust to noise and outliers that do not follow a Gaussian distribution. In…
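As a rough illustration of the recovery idea (a sketch only, not the authors' exact scheme): the observed indefinite kernel can be split, robust-PCA style, into a positive semidefinite part and a sparse residual that absorbs outliers. The function names and the alternating soft-thresholding/PSD-projection scheme below are assumptions for illustration.

import numpy as np

def psd_project(K):
    # Project a symmetric matrix onto the PSD cone by zeroing negative eigenvalues.
    w, V = np.linalg.eigh((K + K.T) / 2)
    return (V * np.clip(w, 0.0, None)) @ V.T

def robust_kernel_recovery(K0, lam=0.1, n_iter=100):
    # Sketch only: alternate between a PSD estimate of the clean kernel
    # and a soft-thresholded sparse residual E that absorbs outliers.
    E = np.zeros_like(K0)
    for _ in range(n_iter):
        K = psd_project(K0 - E)                           # clean PSD part
        R = K0 - K                                        # residual attributed to noise
        E = np.sign(R) * np.maximum(np.abs(R) - lam, 0)   # sparse outlier estimate
    return K, E

Under the small-Gaussian-noise model criticized above, the sparse residual term would be dropped and the PSD projection alone would recover the kernel.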

References


A Generalized Kernel Approach to Dissimilarity-based Classification

It is shown that other, more global classification techniques are preferable to the nearest neighbor rule in cases where the dissimilarities used in practice are far from ideal, since the performance of the nearest neighbor rule suffers from its sensitivity to noisy examples.

Learning with non-positive kernels

A general representer theorem for constrained stabilization is shown and generalization bounds are proved by computing the Rademacher averages of the kernel class.

Classification With Truncated $\ell_1$ Distance Kernel

In numerical experiments, the TL1 kernel with a pre-given parameter achieves similar or better performance than the radial basis function kernel with its parameter tuned by cross-validation, suggesting that the TL1 kernel is a promising nonlinear kernel for classification tasks.
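For reference, the TL1 kernel is the truncated l1 distance kernel K(x, y) = max(rho - ||x - y||_1, 0); a minimal NumPy sketch (function name hypothetical) follows.

import numpy as np

def tl1_kernel(X, Y, rho):
    # TL1 kernel: K(x, y) = max(rho - ||x - y||_1, 0); X is (n, d), Y is (m, d).
    d1 = np.abs(X[:, None, :] - Y[None, :, :]).sum(axis=-1)
    return np.maximum(rho - d1, 0.0)

Note that the TL1 kernel is in general indefinite, which is exactly the kind of kernel this paper's approximation scheme targets.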

Analysis of SVM with Indefinite Kernels

It is shown that the objective function is continuously differentiable, that its gradient can be computed explicitly, and that this gradient is Lipschitz continuous, which greatly facilitates the application of gradient-based algorithms.

Learning SVM in Kreĭn Spaces

This paper justifies and evaluates a solution that uses the original (indefinite) similarity measure, in the original Kreĭn space, and establishes the correspondence between this stabilization problem and a classical SVM based on minimization, which is easy to solve.
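A common practical treatment in this Kreĭn-space line of work is the spectrum "flip": replace the negative eigenvalues of the indefinite Gram matrix by their absolute values so a standard SVM solver applies, keeping the sign information needed to map the solution back. The sketch below assumes a full eigendecomposition of the Gram matrix; names are hypothetical.

import numpy as np

def flip_spectrum(K):
    # Replace eigenvalues by their absolute values to obtain a PSD surrogate,
    # and return the sign matrix that records the flipped directions.
    w, V = np.linalg.eigh((K + K.T) / 2)
    K_flip = (V * np.abs(w)) @ V.T    # PSD surrogate Gram matrix for the SVM solver
    S = (V * np.sign(w)) @ V.T        # maps the solution back to the Krein space
    return K_flip, S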

Incremental Robust Nonnegative Matrix Factorization for Object Tracking

Robust NMF with an L2,1-norm loss function is introduced into appearance modelling for visual tracking, and multiplicative update rules for incremental learning of robust NMF are proposed for model updating, strengthening its practicality in visual tracking.
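For context, the batch (non-incremental) L2,1-norm robust NMF multiplicative updates reweight each sample column by the inverse of its residual norm, which down-weights outliers. The following is a sketch under that standard formulation, not the paper's incremental variant; names are hypothetical.

import numpy as np

def robust_nmf(X, k, n_iter=200, eps=1e-10):
    # Sketch of L2,1-norm robust NMF: min_{F,G >= 0} ||X - F G||_{2,1}.
    # Each column's weight 1/||x_i - F g_i|| shrinks the influence of outliers.
    p, n = X.shape
    rng = np.random.default_rng(0)
    F = rng.random((p, k))
    G = rng.random((k, n))
    for _ in range(n_iter):
        R = X - F @ G
        d = 1.0 / (np.linalg.norm(R, axis=0) + eps)   # per-sample weights
        XD = X * d                                    # X @ diag(d)
        GD = G * d                                    # G @ diag(d)
        F *= (XD @ G.T) / (F @ (GD @ G.T) + eps)
        G *= (F.T @ XD) / ((F.T @ F) @ GD + eps)
    return F, G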

A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems

A new fast iterative shrinkage-thresholding algorithm (FISTA) is presented that preserves the computational simplicity of ISTA but has a global rate of convergence proven to be significantly better, both theoretically and practically.
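As a concrete instance, FISTA applied to the l1-regularized least-squares inverse problem min_x 0.5*||Ax - b||^2 + lam*||x||_1 combines the ISTA soft-thresholding step with Nesterov-style momentum; a minimal sketch (names hypothetical):

import numpy as np

def fista_lasso(A, b, lam, n_iter=200):
    # FISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = z = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(n_iter):
        g = A.T @ (A @ z - b)              # gradient of the smooth part at z
        u = z - g / L                      # gradient step
        x_new = np.sign(u) * np.maximum(np.abs(u) - lam / L, 0)  # soft threshold
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        z = x_new + ((t - 1) / t_new) * (x_new - x)   # momentum extrapolation
        x, t = x_new, t_new
    return x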

Classification on Pairwise Proximity Data

We investigate the problem of learning a classification task on data represented in terms of their pairwise proximities. This representation does not refer to an explicit feature representation of the data items.

Optimal Cluster Preserving Embedding of Nonmetric Proximity Data

This paper shows that all clustering methods which are invariant under additive shifts of the pairwise proximities can be reformulated as grouping problems in Euclidean spaces, with complete preservation of the cluster structure in the embedding space.
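The shift-invariance argument suggests a constant-shift embedding along these lines: center the dissimilarities in the classical MDS convention (treating D as squared dissimilarities, an assumption here), then apply the smallest diagonal shift that makes the centered similarity matrix positive semidefinite. A sketch with hypothetical names:

import numpy as np

def constant_shift_embedding(D):
    # Center the pairwise dissimilarities, then shift the spectrum so the
    # centered similarity matrix becomes PSD; rows of the result are
    # Euclidean coordinates compatible with the shifted proximities.
    n = D.shape[0]
    Q = np.eye(n) - np.ones((n, n)) / n    # centering matrix
    S = -0.5 * Q @ D @ Q                   # centered, possibly indefinite similarities
    w, V = np.linalg.eigh(S)
    shift = max(0.0, -w.min())             # minimal diagonal shift making S PSD
    return V * np.sqrt(np.maximum(w + shift, 0.0))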