
- Vladimir Vapnik
- 1998

- Corinna Cortes, Vladimir Vapnik
- Machine Learning
- 1995

The support-vector network is a new learning machine for two-group classification problems. The machine conceptually implements the following idea: input vectors are non-linearly mapped to a very high-dimension feature space. In this feature space a linear decision surface is constructed. Special properties of the decision surface ensure high generalization… (More)
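The core idea in this abstract can be illustrated with a toy sketch (my own, not the paper's algorithm): 1-D points that no single threshold can separate become linearly separable after a non-linear map into a higher-dimensional feature space, where an ordinary linear classifier then finds a decision surface.

```python
# Toy illustration of the support-vector-network idea: a non-linear map to a
# higher-dimensional feature space makes the classes linearly separable there.
import numpy as np

# 1-D inputs: class +1 iff |x| > 1 -- no single threshold on x separates them.
X = np.array([-2.0, -1.5, -0.5, 0.0, 0.5, 1.5, 2.0])
y = np.array([1, 1, -1, -1, -1, 1, 1])

# Non-linear map phi(x) = (x, x^2): in feature space the classes are split
# by the linear surface x^2 = 1.
phi = np.column_stack([X, X**2])

# Plain perceptron in feature space (a stand-in for the paper's solver).
w, b = np.zeros(2), 0.0
for _ in range(100):
    for xi, yi in zip(phi, y):
        if yi * (w @ xi + b) <= 0:  # misclassified: nudge the surface
            w += yi * xi
            b += yi

pred = np.sign(phi @ w + b)
print((pred == y).all())  # the mapped data is linearly separable
```

A real support-vector network would choose the *maximum-margin* linear surface in feature space (via a kernel, without computing phi explicitly); the perceptron here only finds *some* separating surface.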

- Bernhard E. Boser, Isabelle Guyon, Vladimir Vapnik
- COLT
- 1992

A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented. The technique is applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions. The effective number of parameters is adjusted automatically to match the complexity of the… (More)
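The paper solves a quadratic program to find the optimal-margin boundary; as a hedged sketch of the same objective, the snippet below runs subgradient descent on the soft-margin SVM loss over toy 2-D data, so that only points inside the margin drive the update.

```python
# Sketch of margin maximization: subgradient descent on the soft-margin SVM
# objective (a stand-in for the paper's quadratic-programming algorithm).
import numpy as np

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 0.5, (20, 2)),   # class -1 cluster
               rng.normal(2.0, 0.5, (20, 2))])   # class +1 cluster
y = np.array([-1] * 20 + [1] * 20)

w, b = np.zeros(2), 0.0
lam, lr = 0.01, 0.1          # regularization strength, learning rate
for _ in range(200):
    margins = y * (X @ w + b)
    mask = margins < 1       # only margin violators contribute a hinge term
    grad_w = lam * w - (y[mask][:, None] * X[mask]).sum(axis=0) / len(X)
    grad_b = -y[mask].sum() / len(X)
    w -= lr * grad_w
    b -= lr * grad_b

print((np.sign(X @ w + b) == y).mean())  # training accuracy
```

The regularizer `lam * w` keeps `||w||` small, which is equivalent to keeping the margin `1/||w||` large; the hinge condition `margins < 1` is what makes the boundary depend only on the patterns nearest to it.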

- Isabelle Guyon, Jason Weston, Stephen Barnhill, Vladimir Vapnik
- Machine Learning
- 2002

DNA micro-arrays now permit scientists to screen thousands of genes simultaneously and determine whether those genes are active, hyperactive or silent in normal or cancerous tissue. Because these new micro-array devices generate bewildering amounts of raw data, new analytical methods must be developed to sort out whether cancer tissues have distinctive… (More)

- Vladimir Vapnik
- 2016

The aim of this book is to discuss the fundamental ideas which lie behind the statistical theory of learning and generalization. It considers learning as a general problem of function estimation based on empirical data. Omitting proofs and technical details, it concentrates on the main results of the theory, such as the relationship between minimizing the empirical risk functional and overfitting… (More)

- Vladimir Vapnik
- IEEE Trans. Neural Networks
- 1999

Statistical learning theory was introduced in the late 1960's. Until the 1990's it was a purely theoretical analysis of the problem of function estimation from a given collection of data. In the middle of the 1990's new types of learning algorithms (called support vector machines) based on the developed theory were proposed. This made statistical learning… (More)

A new regression technique based on Vapnik's concept of support vectors is introduced. We compare support vector regression (SVR) with a committee regression technique (bagging) based on regression trees and ridge regression done in feature space. On the basis of these experiments, it is expected that SVR will have advantages in high-dimensional spaces… (More)
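The distinguishing ingredient of SVR is the epsilon-insensitive loss, which ignores residuals smaller than a tolerance eps. A minimal sketch (my own, not the paper's implementation) fits a linear model with that loss by subgradient descent:

```python
# Sketch of linear support vector regression: subgradient descent on the
# epsilon-insensitive loss, which penalizes only residuals larger than eps.
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (100, 1))
y = 3.0 * X[:, 0] + 0.5          # underlying line: slope 3, intercept 0.5

w, b = np.zeros(1), 0.0
eps, lam, lr = 0.1, 0.001, 0.05  # tube width, regularization, learning rate
for _ in range(2000):
    r = X @ w + b - y            # residuals
    # subgradient of the eps-insensitive loss: 0 inside the tube, +-1 outside
    s = np.where(r > eps, 1.0, np.where(r < -eps, -1.0, 0.0))
    w -= lr * (lam * w + (s[:, None] * X).mean(axis=0))
    b -= lr * s.mean()

print(w[0], b)  # close to the true slope 3.0 and intercept 0.5
```

Because points inside the eps-tube contribute nothing to the gradient, the fitted model depends only on the points at or outside the tube boundary, the regression analogue of support vectors.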

- Olivier Chapelle, Vladimir Vapnik, Olivier Bousquet, Sayan Mukherjee
- Machine Learning
- 2002

The problem of automatically tuning multiple parameters for pattern recognition Support Vector Machines (SVMs) is considered. This is done by minimizing some estimates of the generalization error of SVMs using a gradient descent algorithm over the set of parameters. Usual methods for choosing parameters, based on exhaustive search, become intractable as soon… (More)
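The idea of replacing exhaustive grid search with gradient descent on an error estimate can be sketched with a much simpler model than an SVM. Below, as an illustrative stand-in, a ridge-regression regularization parameter is tuned by descending a numerically estimated gradient of the held-out validation error (the paper instead differentiates analytic bounds on the SVM generalization error):

```python
# Stand-in sketch: tune a hyper-parameter by gradient descent on a held-out
# error estimate, rather than by exhaustive grid search.
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(60, 5))
w_true = np.array([1.0, -2.0, 0.0, 0.5, 0.0])
y = X @ w_true + rng.normal(0, 0.1, 60)
Xtr, ytr, Xva, yva = X[:40], y[:40], X[40:], y[40:]

def val_error(log_lam):
    """Validation MSE of ridge regression at regularization exp(log_lam)."""
    lam = np.exp(log_lam)
    w = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(5), Xtr.T @ ytr)
    return ((Xva @ w - yva) ** 2).mean()

log_lam, lr, h = 2.0, 2.0, 1e-4
for _ in range(100):
    # central finite-difference gradient of the validation error
    g = (val_error(log_lam + h) - val_error(log_lam - h)) / (2 * h)
    log_lam -= lr * g

print(val_error(2.0), val_error(log_lam))  # held-out error before vs after
```

Working in log-space keeps the parameter positive, and the cost per step is independent of how many grid points an exhaustive search would have needed, which is the scalability argument the abstract makes.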

- Christopher J.C. Burges, Alexander J. Smola, +14 authors Kristin P. Bennett
- 1998