In this article we study the generalization abilities of several classifiers of support vector machine (SVM) type using a certain class of kernels that we call universal. It is shown that the soft margin algorithms with universal kernels are consistent for a large class of classification problems, including certain noisy tasks, provided that the …
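As a minimal illustration of this setting (our own sketch, not an experiment from the paper), the snippet below fits a soft margin SVM with a Gaussian RBF kernel, a standard example of a universal kernel, on a noisy toy task; the library, data set, and parameter values are illustrative assumptions.

```python
# Minimal sketch (assumptions: scikit-learn is available; data and parameters are illustrative).
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# A noisy, non-linearly separable toy classification task.
X, y = make_moons(n_samples=500, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Soft margin SVM with a Gaussian RBF kernel (a standard example of a universal kernel).
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X_train, y_train)

print("test accuracy:", clf.score(X_test, y_test))
```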
It is shown that various classifiers that are based on minimization of a regularized risk are universally consistent, i.e., they can asymptotically learn in every classification task. The role of the loss functions used in these algorithms is considered in detail. As an application of our general framework, several types of support vector machines (SVMs) as …
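For orientation, the regularized risk minimized by such classifiers over a reproducing kernel Hilbert space H can be written as follows; the notation is ours, and the loss L is a placeholder for choices such as the hinge, squared hinge, or least squares loss.

```latex
% Regularized empirical risk minimization over an RKHS H;
% \lambda > 0 is the regularization parameter and L a generic loss function.
f_{n,\lambda} \;=\; \arg\min_{f \in H}
  \Bigl[ \lambda \|f\|_H^2 + \frac{1}{n} \sum_{i=1}^{n} L\bigl(y_i, f(x_i)\bigr) \Bigr]
```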
Support vector machines (SVMs) construct decision functions that are linear combinations of kernel evaluations on the training set. The samples with non-vanishing coefficients are called support vectors. In this work we establish asymptotic lower bounds on the number of support vectors. Along the way we prove several results which are of great importance …
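Concretely, the fitted decision function has the kernel-expansion form f(x) = Σⱼ cⱼ k(xⱼ, x) + b, with the sum running over the support vectors only. The sketch below (using scikit-learn as an assumed, illustrative library) counts the support vectors of a fitted SVM and checks this expansion against the library's own decision function.

```python
# Sketch (assumptions: scikit-learn; the data is synthetic and purely illustrative).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=5, random_state=0)

gamma = 0.1
clf = SVC(kernel="rbf", C=1.0, gamma=gamma).fit(X, y)

# The decision function is f(x) = sum_j c_j * k(x_j, x) + b, where the sum runs
# only over the support vectors (training samples with non-vanishing coefficients c_j).
print("support vectors:", len(clf.support_vectors_), "of", len(X), "training samples")

# Verify the kernel-expansion form against sklearn's own decision_function.
K = rbf_kernel(X[:5], clf.support_vectors_, gamma=gamma)
manual = K @ clf.dual_coef_[0] + clf.intercept_[0]
print(np.allclose(manual, clf.decision_function(X[:5])))  # True
```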
One way to describe anomalies is to say that anomalies are not concentrated. This leads to the problem of finding level sets for the data-generating density. We interpret this learning problem as a binary classification problem and compare the corresponding classification risk with the standard performance measure for the density level problem. …
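One common way to make this reduction concrete, sketched below under our own assumptions rather than as the paper's algorithm, is to label the observed data as one class and samples drawn from a uniform reference measure as the other; the region the classifier assigns to the data class then approximates a density level set.

```python
# Sketch (assumptions: scikit-learn available; data, bounding box, and sample sizes are illustrative).
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Observed data from the unknown density (here: a 2-D Gaussian blob).
X_data = rng.normal(loc=0.0, scale=1.0, size=(500, 2))

# Artificial background sample from a uniform reference measure on a bounding box.
lo, hi = X_data.min(axis=0) - 1.0, X_data.max(axis=0) + 1.0
X_bg = rng.uniform(lo, hi, size=(500, 2))

# Binary classification: data vs. background. The region classified as "data"
# approximates a level set of the data-generating density.
X = np.vstack([X_data, X_bg])
y = np.hstack([np.ones(len(X_data)), np.zeros(len(X_bg))])
clf = SVC(kernel="rbf", gamma=1.0).fit(X, y)

# Points far from the bulk of the data are flagged as "not concentrated" (anomalous).
print(clf.predict([[0.0, 0.0], [4.0, 4.0]]))  # typically [1. 0.]
```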
Although Gaussian radial basis function (RBF) kernels are among the most frequently used kernels in modern machine learning methods such as support vector machines (SVMs), little is known about the structure of their reproducing kernel Hilbert spaces (RKHSs). In this work, two distinct explicit descriptions of the RKHSs corresponding to Gaussian RBF kernels are …
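For reference, one common parametrization of these kernels and the defining property of their RKHSs read as follows (the paper may use a different scaling convention):

```latex
% Gaussian RBF kernel on R^d with width parameter \sigma > 0 (one common convention).
k_\sigma(x, x') = \exp\Bigl( -\tfrac{\|x - x'\|_2^2}{2\sigma^2} \Bigr),
\qquad x, x' \in \mathbb{R}^d

% Defining (reproducing) property of the associated RKHS H_\sigma:
f(x) = \langle f,\, k_\sigma(\cdot, x) \rangle_{H_\sigma}
\qquad \text{for all } f \in H_\sigma,\ x \in \mathbb{R}^d
```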
We show that support vector machines of the 1-norm soft margin type are universally consistent provided that the regularization parameter is chosen in a distinct manner and the kernel belongs to a specific class, the so-called universal kernels, which has recently been considered by the author. In particular, it is shown that the 1-norm soft margin classifier …
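Written out in our notation (the abstract itself gives no formula), the 1-norm soft margin SVM minimizes the hinge-loss regularized risk over the RKHS H of the chosen kernel:

```latex
% 1-norm soft margin SVM: regularized empirical hinge-loss risk over the RKHS H of the kernel.
f_{n,\lambda} \;=\; \arg\min_{f \in H}
  \Bigl[ \lambda \|f\|_H^2 + \frac{1}{n} \sum_{i=1}^{n} \max\bigl(0,\, 1 - y_i f(x_i)\bigr) \Bigr],
\qquad y_i \in \{-1, +1\}
```

In consistency results of this type, the regularization parameter λ = λₙ is typically required to tend to zero at a controlled rate as the sample size n grows, which is the kind of condition on the parameter choice the abstract alludes to.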