
#### Statistical Learning Theory

- Vladimir Vapnik
- 1998

#### Support-Vector Networks

- Corinna Cortes, Vladimir Vapnik
- Machine Learning
- 1995

The support-vector network is a new learning machine for two-group classification problems. The machine conceptually implements the following idea: input vectors are non-linearly mapped to a very high-dimensional feature space. In this feature space a linear decision surface is constructed. Special properties of the decision surface ensure high generalization…
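
The construction described here is what modern kernel-SVM libraries implement directly. A minimal sketch, assuming scikit-learn; the `make_moons` data and the RBF-kernel settings are illustrative choices, not from the paper:

```python
# Sketch: non-linear mapping plus a linear decision surface, via the
# kernel trick. Dataset and hyperparameters are illustrative only.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.15, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The RBF kernel implicitly maps inputs into a high-dimensional feature
# space; the SVM then fits a linear decision surface in that space.
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
print("support vectors per class:", clf.n_support_)
```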

#### Gene Selection for Cancer Classification using Support Vector Machines

- Isabelle Guyon, Jason Weston, Stephen Barnhill, Vladimir Vapnik
- Machine Learning
- 2002

DNA micro-arrays now permit scientists to screen thousands of genes simultaneously and determine whether those genes are active, hyperactive or silent in normal or cancerous tissue. Because these new micro-array devices generate bewildering amounts of raw data, new analytical methods must be developed to sort out whether cancer tissues have distinctive…
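
The selection method this paper introduces, recursive feature elimination with a linear SVM (SVM-RFE), is available as a wrapper in scikit-learn. A sketch, with synthetic data standing in for micro-array measurements (all sizes and settings below are assumptions):

```python
# Sketch of SVM-RFE: repeatedly train a linear SVM and drop the features
# (genes) with the smallest weights. Data is synthetic, not micro-array.
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.svm import SVC

# 200 "tissue samples", 500 "genes", only a handful informative.
X, y = make_classification(n_samples=200, n_features=500,
                           n_informative=10, random_state=0)

# step=0.1: eliminate the weakest 10% of remaining features per round.
selector = RFE(SVC(kernel="linear", C=1.0),
               n_features_to_select=10, step=0.1)
selector.fit(X, y)
print("selected feature indices:", selector.get_support(indices=True))
```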

#### A Training Algorithm for Optimal Margin Classifiers

- Bernhard E. Boser, Isabelle Guyon, Vladimir Vapnik
- COLT
- 1992

A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented. The technique is applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions. The effective number of parameters is adjusted automatically to match the complexity of the…
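
The maximal-margin solution can be approximated with a soft-margin SVM by making the penalty C very large; the geometric margin then follows from the learned weight vector. A sketch under that assumption (data and settings are illustrative):

```python
# Sketch: approximating the maximal-margin classifier with a large C.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Two linearly separable point clouds.
X = np.vstack([rng.normal(-2.0, 0.5, (50, 2)),
               rng.normal(2.0, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

clf = SVC(kernel="linear", C=1e6).fit(X, y)  # large C ~ hard margin
w = clf.coef_[0]
print("geometric margin:", 2 / np.linalg.norm(w))
print("number of support vectors:", len(clf.support_))
```

Only the patterns on the margin survive as support vectors, which is one sense in which the effective number of parameters adapts to the problem.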

According to the classical Bernoulli theorem, the relative frequency of an event A in a sequence of independent trials converges (in probability) to the probability of that event. In many applications, however, the need arises to judge simultaneously the probabilities of events of an entire class S from one and the same sample. Moreover, it is required that…
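
In symbols: writing the relative frequency of A over ℓ independent trials as ν_ℓ(A), Bernoulli's theorem concerns a single fixed event, while the abstract asks for convergence uniformly over the class S:

```latex
% Bernoulli (single event): \nu_\ell(A) \to P(A) in probability.
% Uniform convergence over the class S strengthens this to:
\[
  \lim_{\ell \to \infty}
  P\Bigl\{ \sup_{A \in S} \bigl| \nu_\ell(A) - P(A) \bigr| > \varepsilon \Bigr\}
  = 0
  \qquad \text{for every } \varepsilon > 0 .
\]
```

Whether this uniform version holds depends on the richness of the class S, and characterizing that dependence is what this line of work is known for.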

A new regression technique based on Vapnik’s concept of support vectors is introduced. We compare support vector regression (SVR) with a committee regression technique (bagging) based on regression trees and ridge regression done in feature space. On the basis of these experiments, it is expected that SVR will have advantages in high dimensionality space…
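
The comparison described can be reproduced in outline with scikit-learn. A sketch; the synthetic regression problem and every hyperparameter below are placeholder assumptions, not the paper's experimental setup:

```python
# Sketch: support vector regression vs. bagged regression trees.
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (300, 5))               # 5-dimensional inputs
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=300)

models = {
    "SVR (RBF kernel)": SVR(kernel="rbf", C=10.0, epsilon=0.1),
    "bagged trees": BaggingRegressor(DecisionTreeRegressor(),
                                     n_estimators=50, random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean R^2 = {scores.mean():.3f}")
```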

#### An Overview of Statistical Learning Theory

- Vladimir Vapnik
- IEEE Trans. Neural Networks
- 1999

Statistical learning theory was introduced in the late 1960s. Until the 1990s it was a purely theoretical analysis of the problem of function estimation from a given collection of data. In the middle of the 1990s, new types of learning algorithms (called support vector machines) based on the developed theory were proposed. This made statistical learning…

#### Choosing Multiple Parameters for Support Vector Machines

- Olivier Chapelle, Vladimir Vapnik, Olivier Bousquet, Sayan Mukherjee
- Machine Learning
- 2002

The problem of automatically tuning multiple parameters for pattern recognition Support Vector Machines (SVMs) is considered. This is done by minimizing some estimates of the generalization error of SVMs using a gradient descent algorithm over the set of parameters. Usual methods for choosing parameters, based on exhaustive search, become intractable as soon…
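
The paper's point is that smooth estimates of generalization error (such as the span bound) admit gradients, whereas raw validation error is piecewise constant in the parameters. As a rough illustrative stand-in for the descent idea, here is a finite-difference descent over log-parameters on cross-validation error (every setting below is an assumption, not the paper's estimator):

```python
# Sketch: gradient descent over SVM hyperparameters (C, gamma).
# The paper differentiates smooth error estimates; here a crude
# finite-difference gradient of CV error stands in, and can stall on
# flat regions, which is exactly why smooth estimates are preferred.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=20, random_state=0)

def cv_error(log_params):
    C, gamma = np.exp(log_params)
    return 1 - cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=5).mean()

theta = np.log([1.0, 0.1])        # start at C = 1, gamma = 0.1
for _ in range(20):
    grad = np.zeros(2)
    for i in range(2):            # finite-difference gradient estimate
        e = np.zeros(2)
        e[i] = 0.1
        grad[i] = (cv_error(theta + e) - cv_error(theta - e)) / 0.2
    theta -= 0.5 * grad           # descend in log-parameter space
print("tuned (C, gamma):", np.exp(theta), " CV error:", cv_error(theta))
```

Unlike an exhaustive grid search, the cost of this approach grows linearly rather than exponentially in the number of tuned parameters.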

#### Support Vector Clustering

- Asa Ben-Hur, David Horn, Hava T. Siegelmann, Vladimir Vapnik
- Journal of Machine Learning Research
- 2001

We present a novel clustering method using the approach of support vector machines. Data points are mapped by means of a Gaussian kernel to a high dimensional feature space, where we search for the minimal enclosing sphere. This sphere, when mapped back to data space, can separate into several components, each enclosing a separate cluster of points. We…
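
A compact sketch of the procedure, assuming scikit-learn and SciPy: the sphere is fit here with `OneClassSVM` (for Gaussian kernels its formulation is equivalent to the minimal enclosing sphere problem), and two points are assigned to the same cluster when the segment between them stays inside the learned boundary; all parameter values are illustrative:

```python
# Sketch of support vector clustering: sphere in feature space, then
# cluster labels from segment connectivity in data space.
import numpy as np
from scipy.sparse.csgraph import connected_components
from sklearn.datasets import make_blobs
from sklearn.svm import OneClassSVM

X, _ = make_blobs(n_samples=120, centers=3, cluster_std=0.6, random_state=0)

# Gaussian-kernel one-class SVM: its boundary plays the role of the
# minimal enclosing sphere mapped back to data space.
boundary = OneClassSVM(kernel="rbf", gamma=0.5, nu=0.1).fit(X)

def same_cluster(a, b, n_checks=10):
    # Connected if the whole segment a -> b stays inside the boundary
    # (decision_function >= 0 means inside).
    ts = np.linspace(0.0, 1.0, n_checks)[1:-1]
    pts = np.outer(1 - ts, a) + np.outer(ts, b)
    return bool(np.all(boundary.decision_function(pts) >= 0))

n = len(X)
adj = np.zeros((n, n), dtype=bool)
for i in range(n):
    for j in range(i + 1, n):
        adj[i, j] = adj[j, i] = same_cluster(X[i], X[j])

n_clusters, labels = connected_components(adj, directed=False)
print("clusters found:", n_clusters, " sizes:", np.bincount(labels))
```

Outliers left outside the boundary end up as singleton components in this sketch; the paper treats such bounded support vectors separately.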

- Alex Smola

The Support Vector (SV) method was recently proposed for estimating regressions, constructing multidimensional splines, and solving linear operator equations [Vapnik, 1995]. In this presentation we report results of applying the SV method to these problems.