- Ingo Steinwart
- Journal of Machine Learning Research
- 2001

In this article we study the generalization abilities of several classifiers of support vector machine (SVM) type using a certain class of kernels that we call universal. It is shown that the soft margin algorithms with universal kernels are consistent for a large class of classification problems, including certain noisy tasks, provided that the… (More)

- Ingo Steinwart
- IEEE Transactions on Information Theory
- 2005

It is shown that various classifiers that are based on minimization of a regularized risk are universally consistent, i.e., they can asymptotically learn in every classification task. The role of the loss functions used in these algorithms is considered in detail. As an application of our general framework, several types of support vector machines (SVMs) as… (More)

- Ingo Steinwart
- Journal of Machine Learning Research
- 2003

Support vector machines (SVMs) construct decision functions that are linear combinations of kernel evaluations on the training set. The samples with non-vanishing coefficients are called support vectors. In this work we establish lower (asymptotic) bounds on the number of support vectors. Along the way we prove several results which are of great importance… (More)
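As a concrete illustration of the decision function and the support-vector count described in this snippet (the kernel, coefficients, and data below are illustrative, not from the paper):

```python
def decision_function(x, train_x, alphas, bias, kernel):
    # SVM decision function: a linear combination of kernel
    # evaluations on the training set, plus a bias term.
    return sum(a * kernel(xi, x) for a, xi in zip(alphas, train_x)) + bias

def count_support_vectors(alphas, tol=1e-12):
    # Support vectors are exactly the training samples whose
    # expansion coefficient does not vanish (up to tolerance).
    return sum(1 for a in alphas if abs(a) > tol)

# Illustrative linear kernel and tiny training set.
linear = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
train_x = [(1.0, 0.0), (0.0, 1.0), (-1.0, 0.0)]
alphas = [0.5, 0.0, -0.5]  # the middle sample is not a support vector

print(count_support_vectors(alphas))                                # 2
print(decision_function((2.0, 0.0), train_x, alphas, 0.0, linear))  # 2.0
```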

- Ingo Steinwart
- 2004

We establish learning rates up to the order of n^{-1} for support vector machines with hinge loss (L1-SVMs) and nontrivial distributions. For the stochastic analysis of these algorithms we use recently developed concepts such as Tsybakov's noise assumption and local Rademacher averages. Furthermore we introduce a new geometric noise condition for… (More)
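The hinge loss underlying the L1-SVMs analyzed here has a short definition worth writing out; a minimal sketch with an illustrative decision function and sample (none of which come from the paper):

```python
def hinge_loss(y, fx):
    # Hinge loss L(y, f(x)) = max(0, 1 - y * f(x)), with y in {-1, +1}.
    return max(0.0, 1.0 - y * fx)

def empirical_hinge_risk(samples, f):
    # Average hinge loss of a decision function f over a sample
    # of (x, y) pairs.
    return sum(hinge_loss(y, f(x)) for x, y in samples) / len(samples)

f = lambda x: 2.0 * x - 1.0            # illustrative decision function
samples = [(1.0, 1), (0.0, -1), (0.2, -1)]
print(empirical_hinge_risk(samples, f))
```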

- Ingo Steinwart, Don R. Hush, Clint Scovel
- Journal of Machine Learning Research
- 2005

One way to describe anomalies is by saying that anomalies are not concentrated. This leads to the problem of finding level sets for the data generating density. We interpret this learning problem as a binary classification problem and compare the corresponding classification risk with the standard performance measure for the density level problem. In… (More)
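The reduction sketched in this snippet, treating the density level problem as binary classification, can be made concrete: threshold a density at a level and label points accordingly. The density and level below are illustrative, not from the paper:

```python
def level_set_labels(density, points, rho):
    # Interpret the density level problem as binary classification:
    # label +1 where the density exceeds the level rho (concentrated,
    # "normal"), -1 where it does not (candidate anomalies).
    return [1 if density(x) > rho else -1 for x in points]

# Illustrative density: a triangular bump centered at 1 on [0, 2].
density = lambda x: max(0.0, 1.0 - abs(x - 1.0))
points = [0.1, 1.0, 1.9, 3.0]
print(level_set_labels(density, points, rho=0.5))  # [-1, 1, -1, -1]
```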

- Ingo Steinwart
- 2006

Many learning problems are described by a risk functional which in turn is defined by a loss function, and a straightforward and widely-known approach to learn such problems is to minimize a (modified) empirical version of this risk functional. However, in many cases this approach suffers from substantial problems such as computational requirements in… (More)

- Ingo Steinwart, Clint Scovel
- COLT
- 2005

- Ingo Steinwart, Don R. Hush, Clint Scovel
- IEEE Transactions on Information Theory
- 2006

Although Gaussian radial basis function (RBF) kernels are one of the most often used kernels in modern machine learning methods such as support vector machines (SVMs), little is known about the structure of their reproducing kernel Hilbert spaces (RKHSs). In this work, two distinct explicit descriptions of the RKHSs corresponding to Gaussian RBF kernels are… (More)
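For reference, the Gaussian RBF kernel whose RKHSs this paper describes is k(x, y) = exp(-γ‖x − y‖²); a minimal evaluation sketch (the inputs and γ are illustrative):

```python
import math

def gaussian_rbf(x, y, gamma=1.0):
    # Gaussian RBF kernel k(x, y) = exp(-gamma * ||x - y||^2).
    sq_dist = sum((xi - yi) ** 2 for xi, yi in zip(x, y))
    return math.exp(-gamma * sq_dist)

print(gaussian_rbf((0.0, 0.0), (0.0, 0.0)))      # 1.0
print(gaussian_rbf((0.0,), (1.0,), gamma=1.0))   # exp(-1) ≈ 0.3679
```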

- Andreas Christmann, Ingo Steinwart
- NIPS
- 2010

During the last years support vector machines (SVMs) have been successfully applied in situations where the input space X is not necessarily a subset of R d. Examples include SVMs for the analysis of histograms or colored images, SVMs for text classification and web mining, and SVMs for applications from computational biology using, e.g., kernels for trees… (More)
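A standard example of a kernel on a non-vectorial input space of the kind this snippet mentions (e.g., text) is the k-spectrum string kernel, the inner product of k-mer count vectors; a sketch (chosen as a common illustration, not taken from the paper):

```python
from collections import Counter

def spectrum_kernel(s, t, k=2):
    # k-spectrum kernel for strings: the inner product of the
    # vectors counting each length-k substring (k-mer).
    cs = Counter(s[i:i + k] for i in range(len(s) - k + 1))
    ct = Counter(t[i:i + k] for i in range(len(t) - k + 1))
    return sum(cs[g] * ct[g] for g in cs)

print(spectrum_kernel("abab", "abba", k=2))  # 3
```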

- Ingo Steinwart
- J. Complexity
- 2002

We show that support vector machines of the 1-norm soft margin type are universally consistent provided that the regularization parameter is chosen in a distinct manner and the kernel belongs to a specific class, the so-called universal kernels, which has recently been considered by the author. In particular it is shown that the 1-norm soft margin classifier… (More)