Publications
Support-Vector Networks
TLDR
The high generalization ability of support-vector networks using polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to that of various classical learning algorithms that took part in a benchmark study of Optical Character Recognition.
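A minimal sketch of the idea, assuming scikit-learn and its small 8x8 digits set rather than the OCR databases used in the paper: a support-vector classifier with a degree-3 polynomial kernel, which plays the role of the polynomial input transformation.

```python
# Minimal sketch: support-vector classifier with a polynomial kernel on
# scikit-learn's small digits set (an assumption; the paper's benchmark
# used larger OCR databases).
from sklearn import datasets, svm
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = datasets.load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Degree-3 polynomial kernel: the polynomial input transformation.
clf = svm.SVC(kernel="poly", degree=3, C=1.0)
clf.fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```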
The MNIST database of handwritten digits
TLDR
The MNIST database is described: a standard benchmark of 28x28 grayscale handwritten digit images, with 60,000 training and 10,000 test examples derived from NIST data, widely used to evaluate classification algorithms.
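A short sketch of loading the dataset, assuming scikit-learn's OpenML mirror of MNIST (dataset name "mnist_784"); the canonical distribution is the original MNIST files.

```python
# Minimal sketch: loading MNIST via scikit-learn's OpenML mirror
# (the "mnist_784" name refers to the OpenML copy of the dataset).
from sklearn.datasets import fetch_openml

mnist = fetch_openml("mnist_784", version=1, as_frame=False)
X = mnist.data.reshape(-1, 28, 28)   # 70,000 grayscale images, 28x28
y = mnist.target.astype(int)         # digit labels 0-9

# Conventional split: first 60,000 for training, last 10,000 for test.
X_train, X_test = X[:60000], X[60000:]
y_train, y_test = y[:60000], y[60000:]
print(X_train.shape, X_test.shape)
```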
Algorithms for Learning Kernels Based on Centered Alignment
TLDR
These algorithms consistently outperform the so-called uniform combination solution that has proven to be difficult to improve upon in the past, as well as other algorithms for learning kernels based on convex combinations of base kernels in both classification and regression.
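A small NumPy sketch of centered alignment between two kernel matrices, the quantity these algorithms are built around; the toy data at the end is purely illustrative.

```python
import numpy as np

def center_kernel(K):
    """Center a kernel matrix: Kc = H K H with H = I - (1/n) 11^T."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def centered_alignment(K1, K2):
    """Centered alignment <K1c, K2c>_F / (||K1c||_F ||K2c||_F)."""
    K1c, K2c = center_kernel(K1), center_kernel(K2)
    return np.sum(K1c * K2c) / (np.linalg.norm(K1c) * np.linalg.norm(K2c))

# Toy usage: alignment of a linear kernel with the ideal target
# kernel y y^T built from the labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
y = np.sign(X[:, 0])
print(centered_alignment(X @ X.T, np.outer(y, y)))
```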
AUC Optimization vs. Error Rate Minimization
TLDR
The results show that the average AUC is monotonically increasing as a function of the classification accuracy, but that the standard deviation for uneven distributions and higher error rates is noticeable, so algorithms designed to minimize the error rate may not lead to the best possible AUC values.
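An illustrative sketch, assuming scikit-learn and a synthetic imbalanced dataset: two classifiers can have similar error rates yet noticeably different AUC values.

```python
# Classifiers with similar error rates can differ in AUC, especially on
# imbalanced data (illustrative; not the paper's experiments).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, roc_auc_score

X, y = make_classification(n_samples=4000, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for clf in (LogisticRegression(max_iter=1000), DecisionTreeClassifier(max_depth=3)):
    clf.fit(X_tr, y_tr)
    scores = clf.predict_proba(X_te)[:, 1]
    print(type(clf).__name__,
          "error rate:", 1 - accuracy_score(y_te, clf.predict(X_te)),
          "AUC:", roc_auc_score(y_te, scores))
```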
Learning Bounds for Importance Weighting
TLDR
This paper gives learning guarantees for the more common case of unbounded importance weights under the weak assumption that the second moment is bounded, a condition related to the Rényi divergence of the training and test distributions.
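A toy sketch of the setting, with hypothetical Gaussian training and test densities: the test loss is estimated by importance weighting, and the second moment of the weights is the quantity the bounds depend on.

```python
# Importance-weighted estimate of the test loss, plus the second moment
# of the weights. The Gaussian train/test densities are illustrative;
# in practice they are rarely known in closed form.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
x_train = rng.normal(loc=0.0, scale=1.0, size=5000)   # training distribution
loss = (x_train - 1.0) ** 2                            # some per-sample loss

# Importance weights w(x) = p_test(x) / p_train(x).
w = norm.pdf(x_train, loc=0.5, scale=1.0) / norm.pdf(x_train, loc=0.0, scale=1.0)

print("weighted loss estimate:  ", np.mean(w * loss))
print("second moment of weights:", np.mean(w ** 2))
```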
Two-Stage Learning Kernel Algorithms
TLDR
A novel and simple concentration bound for alignment between kernel matrices is given and the existence of good predictors for kernels with high alignment is shown, both for classification and for regression.
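A sketch of a simple two-stage scheme in this spirit: stage one weights each base kernel by its centered alignment with the target kernel y y^T, stage two trains an SVM on the resulting combination. The base kernels and data are illustrative, and the paper also studies a more refined joint optimization of the weights.

```python
# Two-stage sketch: (1) alignment-based kernel weights, (2) SVM on the
# weighted combination. Illustrative only.
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import polynomial_kernel, rbf_kernel

def centered_alignment(K1, K2):
    n = K1.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    K1c, K2c = H @ K1 @ H, H @ K2 @ H
    return np.sum(K1c * K2c) / (np.linalg.norm(K1c) * np.linalg.norm(K2c))

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = np.sign(X[:, 0] + 0.5 * X[:, 1])

base_kernels = [X @ X.T, polynomial_kernel(X, degree=2), rbf_kernel(X, gamma=0.1)]
K_target = np.outer(y, y)

# Stage 1: nonnegative, normalized alignment-based weights.
mu = np.array([centered_alignment(K, K_target) for K in base_kernels])
mu = np.clip(mu, 0, None)
mu /= mu.sum()

# Stage 2: SVM with the learned precomputed kernel combination.
K_comb = sum(m * K for m, K in zip(mu, base_kernels))
clf = SVC(kernel="precomputed").fit(K_comb, y)
print("weights:", mu, "train accuracy:", clf.score(K_comb, y))
```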
Comparison of classifier methods: a case study in handwritten digit recognition
This paper compares the performance of several classifier algorithms on a standard database of handwritten digits. We consider not only raw accuracy, but also training time, recognition time, and memory requirements.
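An illustrative benchmark in the same spirit, assuming scikit-learn's small digits set and a few off-the-shelf classifiers (not the original study's methods or data), recording accuracy, training time, and recognition time.

```python
# Illustrative comparison: accuracy, training time, and recognition time
# for a few classifiers on scikit-learn's digits set.
import time
from sklearn import datasets
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

X, y = datasets.load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for clf in (KNeighborsClassifier(), SVC(kernel="poly", degree=3),
            MLPClassifier(max_iter=1000)):
    t0 = time.perf_counter(); clf.fit(X_tr, y_tr); train_t = time.perf_counter() - t0
    t0 = time.perf_counter(); acc = clf.score(X_te, y_te); test_t = time.perf_counter() - t0
    print(f"{type(clf).__name__:22s} acc={acc:.3f} train={train_t:.2f}s test={test_t:.2f}s")
```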
Sample Selection Bias Correction Theory
TLDR
A theoretical analysis of sample selection bias correction is presented, based on the novel concept of distributional stability, which generalizes the existing concept of point-based stability and can be used to analyze other importance weighting techniques and their effect on accuracy when a distributionally stable algorithm is used.
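A generic importance-weighting sketch of the underlying idea, not the paper's specific estimator or analysis: the selection probability P(s = 1 | x) is estimated with a hypothetical logistic model and selected points are reweighted by its inverse.

```python
# Generic sample-selection-bias correction by inverse-probability weighting
# (illustrative; the paper analyzes how errors in such estimated weights
# affect distributionally stable learning algorithms).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
x = rng.normal(size=10000)
y = 2.0 * x + rng.normal(scale=0.5, size=10000)

# Biased selection: points with larger x are more likely to be observed.
p_select = 1 / (1 + np.exp(-2 * x))
s = rng.random(10000) < p_select

# Estimate selection probabilities and form inverse-probability weights.
sel_model = LogisticRegression().fit(x.reshape(-1, 1), s.astype(int))
w = 1.0 / sel_model.predict_proba(x[s].reshape(-1, 1))[:, 1]

print("true mean of y:         ", y.mean())
print("biased sample mean:     ", y[s].mean())
print("bias-corrected estimate:", np.average(y[s], weights=w))
```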
Learning Non-Linear Combinations of Kernels
TLDR
A projection-based gradient descent algorithm is given for solving the optimization problem of learning kernels based on a polynomial combination of base kernels, and it is proved that the global solution of this problem always lies on the boundary.
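An illustrative sketch of projection-based gradient ascent for a degree-2 combination of base kernels with nonnegative weights; the simplified objective (unnormalized centered alignment with y y^T) and the constraint set {mu >= 0, ||mu||_2 <= 1} are stand-ins, not the paper's exact formulation.

```python
# Projected gradient ascent over nonnegative weights of a quadratic
# (Hadamard-product) combination of base kernels. Illustrative only.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel

rng = np.random.default_rng(0)
X = rng.normal(size=(150, 8))
y = np.sign(X[:, 0])
n = len(y)
H = np.eye(n) - np.ones((n, n)) / n
Yc = H @ np.outer(y, y) @ H                # centered target kernel

base = [X @ X.T, polynomial_kernel(X, degree=2), rbf_kernel(X, gamma=0.1)]
p = len(base)

# Precompute M[k, l] = <K_k o K_l, centered(y y^T)>_F, so the objective is
# f(mu) = mu^T M mu and its gradient is 2 M mu.
M = np.array([[np.sum((base[k] * base[l]) * Yc) for l in range(p)] for k in range(p)])

mu = np.full(p, 1 / np.sqrt(p))
for _ in range(200):
    mu = mu + 1e-3 * 2 * M @ mu / np.linalg.norm(M)   # gradient step
    mu = np.clip(mu, 0, None)                          # project onto mu >= 0
    norm = np.linalg.norm(mu)
    if norm > 1:                                       # ...and onto ||mu||_2 <= 1
        mu /= norm
print("learned weights:", mu)
```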
Generalization Bounds for Learning Kernels
TLDR
This paper presents several novel generalization bounds for the problem of learning kernels based on a combinatorial analysis of the Rademacher complexity of the corresponding hypothesis sets, and gives a novel bound for learning with a non-negative combination of p base kernels with an L2 regularization whose dependency on p is also tight and only in p^{1/4}.
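For context, the generic Rademacher-complexity bound that such kernel-specific results refine, in its standard form for a loss bounded in [0, 1] (this is background, not the paper's bound itself):

```latex
% With probability at least 1 - \delta over an i.i.d. sample S of size m,
% for every hypothesis h in H:
R(h) \;\le\; \widehat{R}_S(h) \;+\; 2\,\mathfrak{R}_m(H) \;+\; \sqrt{\frac{\log(1/\delta)}{2m}}
```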
...