# Training Invariant Support Vector Machines

```bibtex
@article{DeCoste2004TrainingIS,
  title   = {Training Invariant Support Vector Machines},
  author  = {Dennis DeCoste and Bernhard Sch{\"o}lkopf},
  journal = {Machine Learning},
  year    = {2002},
  volume  = {46},
  pages   = {161-190}
}
```

Practical experience has shown that in order to obtain the best possible performance, prior knowledge about invariances of a classification problem at hand ought to be incorporated into the training procedure. We describe and review all known methods for doing so in support vector machines, provide experimental results, and discuss their respective merits. One of the significant new results reported in this work is our recent achievement of the lowest reported test error on the well-known MNIST…
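One way the paper incorporates prior knowledge about invariances is the "virtual example" strategy: generate transformed copies of training patterns under the known invariance transformations. The sketch below illustrates the idea with NumPy, assuming one-pixel translations as the known invariance; the function name and toy data are illustrative, not from the paper.

```python
import numpy as np

def virtual_examples(images, labels, shifts=((0, 1), (0, -1), (1, 0), (-1, 0))):
    """Augment a digit set with translated copies (the 'virtual example' idea).

    images: array of shape (n, h, w); labels: array of shape (n,).
    Each image is shifted by one pixel in four directions; labels are copied.
    """
    out_x = [images]
    out_y = [labels]
    for dy, dx in shifts:
        out_x.append(np.roll(images, shift=(dy, dx), axis=(1, 2)))
        out_y.append(labels)
    return np.concatenate(out_x), np.concatenate(out_y)

# Toy 'digits': two 4x4 patterns, one per class.
x = np.zeros((2, 4, 4))
x[0, 1, 1] = 1.0   # class 0: bright pixel near top-left
x[1, 2, 2] = 1.0   # class 1: bright pixel near centre
y = np.array([0, 1])

xv, yv = virtual_examples(x, y)
print(xv.shape, yv.shape)   # (10, 4, 4) (10,)
```

In the paper's virtual support vector variant, this expansion is applied only to the support vectors of a first SVM run rather than to the whole training set, which keeps the enlarged problem tractable.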

## 608 Citations

### Integrating class-dependant tangent vectors into SVMs for handwritten digit recognition

- Computer Science
- 2009 3rd International Conference on Signals, Circuits and Systems (SCS)
- 2009

This work investigates the use of tangent vectors extracted a priori from the training data, which improves recognition accuracy and allows a large reduction in runtime.

### Handwritten digit classification

- Computer Science
- 2011

A comparison between a multivariate and a probabilistic approach is shown, concluding that both methods provide similar results in terms of test-error rate.

### Tangent distance kernels for support vector machines

- Computer Science
- Object recognition supported by user interaction for service robots
- 2002

This work introduces a new class of kernels for support vector machines which incorporate tangent distance and therefore are applicable in cases where such transformation invariances are known.
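The tangent distance such kernels build on can be sketched in a few lines: the one-sided variant is the minimum Euclidean distance from y to the linear approximation of x's transformation manifold, spanned by tangent vectors. The helper below is a hedged illustration, not the authors' implementation; a kernel could then substitute this distance into, e.g., an RBF form k(x, y) = exp(-TD(x, y)² / (2σ²)).

```python
import numpy as np

def one_sided_tangent_distance(x, y, tangents):
    """One-sided tangent distance: min over a of ||x + T a - y||.

    x, y: flattened patterns of shape (d,).
    tangents: matrix T of shape (d, k), one tangent vector per column,
    spanning a linear approximation of the transformation manifold at x.
    """
    a, *_ = np.linalg.lstsq(tangents, y - x, rcond=None)
    return float(np.linalg.norm(x + tangents @ a - y))

# Toy example: y is x moved along a known tangent direction t, so the
# tangent distance is ~0 while the Euclidean distance is not.
x = np.array([1.0, 0.0, 0.0])
t = np.array([[0.0], [1.0], [0.0]])   # tangent: move in second coordinate
y = x + 0.5 * t[:, 0]

print(np.linalg.norm(x - y))               # Euclidean distance: 0.5
print(one_sided_tangent_distance(x, y, t)) # tangent distance: ~0.0
```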

### Homogenized Virtual Support Vector Machines

- Computer Science
- Digital Image Computing: Techniques and Applications (DICTA'05)
- 2005

This paper presents a new generalisation of the Support Vector Machine that aims to better incorporate classification invariance with respect to image translations and rotations; it penalises an approximation of the variance of the decision function across each grouped set of "virtual examples".

### Invariances in Classification: an efficient SVM implementation

- Computer Science
- 2005

This work presents a unifying formulation of the problem of incorporating invariances into a pattern recognition classifier and extends the SimpleSVM algorithm to handle invariance efficiently.

### Extract candidates of support vector from training set

- Computer Science
- Proceedings of the 2003 International Conference on Machine Learning and Cybernetics (IEEE Cat. No.03EX693)
- 2003

A heuristic method is proposed to extract candidate support vectors from the training set; training a support vector machine on the extracted candidates achieves good generalization on the test set, showing that the candidates contain almost all the information necessary to solve a given classification task.

### Fast and Accurate Digit Classification

- Computer Science
- 2009

This work explores the use of blockwise histograms of local orientations, image features used in many current object recognition algorithms, for the task of handwritten digit recognition, and demonstrates that with improved features a low-complexity classifier, in particular an additive-kernel SVM, can achieve state-of-the-art performance.

### Relative Margin Machines

- Computer Science
- NIPS
- 2008

A novel formulation is proposed to overcome a scaling sensitivity of Support Vector Machines; it maximizes the margin relative to the spread of the data.

### Clustered class-dependant training method for digit recognition classifiers

- Computer Science
- 2016 International Symposium on Industrial Electronics (INDEL)
- 2016

This paper presents a convolutional neural network clustering approach for handwritten digits recognition, and examines various ways of combining such clusters and training their constituent networks.

## References

Showing 1-10 of 59 references.

### A training algorithm for optimal margin classifiers

- Computer Science
- COLT '92
- 1992

A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented. The technique is applicable to a wide variety of classification functions,…

### Extracting Support Data for a Given Task

- Computer Science
- KDD
- 1995

It is observed that three different types of handwritten digit classifiers construct their decision surfaces from strongly overlapping small subsets of the database, which opens up the possibility of compressing databases significantly by disposing of the data which is not important for the solution of a given task.

### Support-Vector Networks

- Computer Science
- Machine Learning
- 1995

High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.

### Handwritten digit recognition with a novel vision model that extracts linearly separable features

- Computer Science, Mathematics
- Proceedings IEEE Conference on Computer Vision and Pattern Recognition. CVPR 2000 (Cat. No.PR00662)
- 2000

It is shown empirically that the features extracted by the model are linearly separable over a large training set (MNIST).

### Support vector learning

- Computer Science
- 1997

This book provides a comprehensive analysis of what can be done using Support Vector Machines, achieving record results in real-life pattern recognition problems, and proposes a new form of nonlinear Principal Component Analysis using Support Vector kernel techniques, which is considered the most natural and elegant way to generalize classical Principal Component Analysis.

### Making large-scale support vector machine learning practical

- Computer Science
- 1998

This chapter presents algorithmic and computational results developed for SVMlight V2.0, which make large-scale SVM training more practical and give guidelines for the application of SVMs to large domains.

### Distortion-invariant recognition via jittered queries

- Computer Science
- Proceedings IEEE Conference on Computer Vision and Pattern Recognition. CVPR 2000 (Cat. No.PR00662)
- 2000

This paper presents a new approach for achieving distortion-invariant recognition and classification, where instead of querying with a single pattern, a more robust query is constructed, based on the family of patterns formed by distorting the test example.
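The query-jittering idea can be sketched as nearest-prototype classification in which every small translation of the test pattern competes for the best match, making the decision invariant to small shifts. The helper below is a hypothetical illustration using one-pixel translations, not the paper's actual system.

```python
import numpy as np

def jittered_query_predict(test_img, prototypes, labels):
    """Nearest-prototype classification with a jittered query.

    Instead of comparing the raw test image to each prototype, compare
    every one-pixel translation of it and keep the best match per class.
    (Hypothetical helper; a sketch of the query-jittering idea.)
    """
    shifts = [(0, 0), (0, 1), (0, -1), (1, 0), (-1, 0)]
    jittered = np.stack([np.roll(test_img, s, axis=(0, 1)) for s in shifts])
    # distance of each jittered query to each prototype; best jitter wins
    d = np.linalg.norm(jittered[:, None] - prototypes[None], axis=(2, 3)).min(axis=0)
    return labels[int(np.argmin(d))]

protos = np.zeros((2, 4, 4))
protos[0, 1, 1] = 1.0          # class-0 prototype
protos[1, 3, 3] = 1.0          # class-1 prototype
labels = np.array([0, 1])
test = np.zeros((4, 4))
test[1, 2] = 1.0               # class-0 pattern shifted one pixel right
print(jittered_query_predict(test, protos, labels))  # -> 0
```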

### Efficient Pattern Recognition Using a New Transformation Distance

- Computer Science
- NIPS
- 1992

A new distance measure which can be made locally invariant to any set of transformations of the input and can be computed efficiently is proposed.

### Boosting Performance in Neural Networks

- Computer Science
- Int. J. Pattern Recognit. Artif. Intell.
- 1993

The boosting algorithm is used to construct an ensemble of neural networks that improves performance significantly, and in some cases dramatically, compared to a single network on optical character recognition (OCR) problems.

### Comparison of learning algorithms for handwritten digit recognition

- Computer Science
- 1995

This comparison of several learning algorithms for handwritten digits considers not only raw accuracy, but also rejection, training time, recognition time, and memory requirements.