Corpus ID: 1283353

Kernel Design Using Boosting

@inproceedings{Crammer2002KernelDU,
  title={Kernel Design Using Boosting},
  author={Koby Crammer and Joseph Keshet and Yoram Singer},
  booktitle={NIPS},
  year={2002}
}
The focus of the paper is the problem of learning kernel operators from empirical data. We cast the kernel design problem as the construction of an accurate kernel from simple (and less accurate) base kernels. We use the boosting paradigm to perform the kernel construction process. To do so, we modify the booster so as to accommodate kernel operators. We also devise an efficient weak-learner for simple kernels that is based on generalized eigenvector decomposition. We demonstrate the…
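The abstract's core idea, combining weak base kernels into a stronger one, can be illustrated with a greedy, boosting-style sketch. This is not the paper's actual algorithm (which modifies the booster itself and uses a generalized-eigenvector weak learner); it is a simplified stand-in that repeatedly adds whichever base RBF kernel most improves kernel-target alignment. The function names (`rbf_kernel`, `greedy_kernel_boost`) and the choice of RBF base kernels are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, gamma):
    # Gram matrix of the Gaussian (RBF) kernel on the rows of X
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

def alignment(K, y):
    # kernel-target alignment <K, yy^T>_F / (||K||_F * ||yy^T||_F)
    yy = np.outer(y, y)
    return (K * yy).sum() / (np.linalg.norm(K) * np.linalg.norm(yy))

def greedy_kernel_boost(X, y, gammas, rounds=3):
    # Greedily add the base kernel that most improves alignment with
    # the labels; a sum of PSD kernels is itself a valid kernel.
    bases = [rbf_kernel(X, g) for g in gammas]
    K = np.zeros((len(y), len(y)))
    for _ in range(rounds):
        scores = [alignment(K + B, y) for B in bases]
        K = K + bases[int(np.argmax(scores))]
    return K / rounds
```

Because each base Gram matrix is positive semidefinite, the combined matrix remains a legitimate kernel that can be handed directly to any kernel classifier.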
Learning with Idealized Kernels
This paper casts the problem of adapting the kernel to make it more similar to the so-called ideal kernel as a distance metric learning problem, searching for a suitable linear transform (feature weighting) in the kernel-induced feature space.
Linear kernel combination using boosting
This paper proposes a novel algorithm for designing multi-class kernels through an iterative combination of weak kernels, in a scheme inspired by the boosting framework, and evaluates the method on a toy classification example and against a reference iterative kernel design method.
Semi-supervised mixture of kernels via LPBoost methods
By modifying the column-generation boosting algorithm LPBoost into a more general linear programming formulation, this work efficiently solves mixture-of-kernels problems and automatically selects kernel basis functions centered at labeled as well as unlabeled data.
Column-generation boosting methods for mixture of kernels
A boosting approach to classification and regression based on column generation with a mixture of kernels, which produces sparser solutions, significantly reduces testing time, and scales CG boosting to large datasets.
A survey of the state of the art in learning the kernels
An overview of algorithms for learning the kernel is presented, and a comparison of various approaches to finding an optimal kernel is provided, to help identify the pivotal issues behind efficient design of such algorithms.
Kernels: Regularization and Optimization
It is shown that for several machine learning tasks, such as binary classification, regression, and novelty detection, the resulting optimization problem is a semidefinite program, and theoretical and experimental evidence is provided to support regularization by early stopping of conjugate-gradient-type algorithms.
Learning the kernel matrix by maximizing a KFD-based class separability criterion
This paper proposes a novel method for learning the kernel matrix based on maximizing a class separability criterion that is similar to those used by linear discriminant analysis (LDA) and kernel Fisher discriminant (KFD).
Learning the Kernel with Hyperkernels
The equivalent representer theorem for the choice of kernels is stated and a semidefinite programming formulation of the resulting optimization problem is presented, which leads to a statistical estimation problem similar to the problem of minimizing a regularized risk functional.
Improving efficiency of multi-kernel learning for support vector machines
This study reformulates the SDP problem to reduce its time and space requirements, and introduces strategies for reducing the search space when solving the SDP problem.
Low-Dimensional Feature Learning with Kernel Construction
We propose a practical method of semi-supervised feature learning with kernels constructed from combinations of non-linear weak rankers for classification applications. While in kernel methods one…

References

Showing 1-10 of 21 references.
On the Complexity of Learning the Kernel Matrix
A suitable way of constraining the class is proposed, an efficient algorithm is used to solve the resulting optimization problem, preliminary experimental results are presented, and a comparison with an alignment-based approach is given.
Input space versus feature space in kernel-based methods
The geometry of feature space is reviewed, and the connection between feature space and input space is discussed by dealing with the question of how one can, given some vector in feature space, find a preimage in input space.
On Kernel-Target Alignment
The notion of kernel alignment, a measure of similarity between two kernel functions or between a kernel and a target function, is introduced, with experimental results showing that adapting the kernel to improve alignment on the labelled data significantly increases alignment on a test set, giving improved classification accuracy.
Learning the Kernel Matrix with Semidefinite Programming
This paper shows how the kernel matrix can be learned from data via semidefinite programming (SDP) techniques and leads directly to a convex method for learning the 2-norm soft margin parameter in support vector machines, solving an important open problem.
Discussion of the Paper "Additive Logistic Regression: A Statistical View of Boosting"
The main and important contribution of this paper is in establishing a connection between boosting, a newcomer to the statistics scene, and additive models. One of the main properties of boosting…
The Spectrum Kernel: A String Kernel for SVM Protein Classification
A new sequence-similarity kernel, the spectrum kernel, is introduced for use with support vector machines (SVMs) in a discriminative approach to the protein classification problem and performs well in comparison with state-of-the-art methods for homology detection.
An Introduction to Support Vector Machines and Other Kernel-based Learning Methods
This is the first comprehensive introduction to Support Vector Machines (SVMs), a new generation learning system based on recent advances in statistical learning theory, and will guide practitioners to updated literature, new applications, and on-line software.
Text Classification using String Kernels
A novel kernel for comparing two text documents is introduced: an inner product in the feature space of all subsequences of length k, which can be efficiently evaluated by a dynamic programming technique.
Logistic Regression, AdaBoost and Bregman Distances
A unified account of boosting and logistic regression is given in which each learning problem is cast as optimization of Bregman distances, and a parameterized family of algorithms that includes both a sequential-update and a parallel-update algorithm as special cases is described, showing how the sequential and parallel approaches can themselves be unified.
An introduction to Support Vector Machines
This book is the first comprehensive introduction to Support Vector Machines (SVMs), a new generation learning system based on recent advances in statistical learning theory. The book also introduces…