Support-vector networks

@article{Cortes1995SupportvectorN,
  title={Support-vector networks},
  author={Corinna Cortes and Vladimir Naumovich Vapnik},
  journal={Machine Learning},
  year={1995},
  volume={20},
  pages={273--297}
}
The support-vector network is a new learning machine for two-group classification problems. The machine conceptually implements the following idea: input vectors are non-linearly mapped to a very high-dimension feature space. In this feature space a linear decision surface is constructed. Special properties of the decision surface ensure high generalization ability of the learning machine. The idea behind the support-vector network was previously implemented for the restricted case where the…
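The abstract's central idea — map the inputs non-linearly into a high-dimensional feature space, then fit a linear decision surface there — can be sketched with a kernel perceptron, a simpler relative of the support-vector network that uses the same implicit feature map. This is an illustrative sketch only (toy data, an RBF kernel, and arbitrarily chosen hyperparameters are my assumptions), not the training procedure of the paper:

```python
import numpy as np

# Toy two-group problem: class +1 inside the unit circle, class -1 outside.
# Not linearly separable in the original 2-D input space.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(200, 2))
y = np.where(np.linalg.norm(X, axis=1) < 1.0, 1, -1)

def rbf_kernel(a, b, gamma=1.0):
    # Gaussian (RBF) kernel: inner products in an implicit
    # high-dimensional feature space, computed without ever
    # constructing the mapped vectors explicitly.
    diff = a[:, None, :] - b[None, :, :]
    return np.exp(-gamma * np.sum(diff * diff, axis=2))

K = rbf_kernel(X, X)

# Kernel perceptron: a linear decision surface in the feature space,
# parameterized by per-example coefficients alpha.
alpha = np.zeros(len(X))
for _ in range(20):
    for i in range(len(X)):
        if np.sign(K[i] @ (alpha * y)) != y[i]:
            alpha[i] += 1  # mistake-driven update

pred = np.sign(K @ (alpha * y))
acc = np.mean(pred == y)
```

Because the data are separable in the RBF feature space, the linear surface constructed there classifies the circular training set correctly, even though no linear surface in the input space could. The support-vector network goes further by choosing, among all separating surfaces, the one with maximal margin.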

On the proliferation of support vectors in high dimensions

This paper identifies new deterministic equivalences for this phenomenon of support vector proliferation, and uses them to substantially broaden the conditions under which the phenomenon occurs in high-dimensional settings, and proves a nearly matching converse result.

Self-optimizing neural network in the classification of real valued data

The article proposes a new technique based on the decision network called self-optimizing neural networks (SONN), which works on discretized data and shows that the information obtained from a training set is better generalized and the final accuracy of the classifier is higher.

References


A training algorithm for optimal margin classifiers

A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented. The technique is applicable to a wide variety of classification functions, …

Neural-Network and k-Nearest-neighbor Classifiers

The performance of a state-of-the-art neural network classifier for hand-written digits is compared to that of a k-nearest-neighbor classifier and to human performance. The neural network has a clear…

Handwritten Digit Recognition with a Back-Propagation Network

Minimal preprocessing of the data was required, but the architecture of the network was highly constrained and specifically designed for the task; it achieves a 1% error rate and about a 9% reject rate on zipcode digits provided by the U.S. Postal Service.

Classification into two Multivariate Normal Distributions with Different Covariance Matrices

Linear procedures for classifying an observation as coming from one of two multivariate normal distributions are studied in the case that the two distributions differ both in mean vectors and…

Comparison of classifier methods: a case study in handwritten digit recognition

  • L. Bottou, Corinna Cortes, V. Vapnik
  • Computer Science
    Proceedings of the 12th IAPR International Conference on Pattern Recognition, Vol. 3 - Conference C: Signal Processing (Cat. No.94CH3440-5)
  • 1994
This paper compares the performance of several classifier algorithms on a standard database of handwritten digits. We consider not only raw accuracy, but also training time, recognition time, and…

Estimation of Dependences Based on Empirical Data

Realism and Instrumentalism: Classical Statistics and VC Theory (1960-1980); Falsifiability and Parsimony: VC Dimension and the Number of Entities (1980-2000); Noninductive Methods of Inference: …

The Use of Multiple Measurements in Taxonomic Problems