Training Invariant Support Vector Machines

@article{DeCoste2004TrainingIS,
  title={Training Invariant Support Vector Machines},
  author={Dennis DeCoste and Bernhard Sch{\"o}lkopf},
  journal={Machine Learning},
  year={2002},
  volume={46},
  pages={161--190}
}
Practical experience has shown that in order to obtain the best possible performance, prior knowledge about invariances of a classification problem at hand ought to be incorporated into the training procedure. We describe and review all known methods for doing so in support vector machines, provide experimental results, and discuss their respective merits. One of the significant new results reported in this work is our recent achievement of the lowest reported test error on the well-known MNIST…
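The paper's central technique is the virtual support vector (VSV) method. The following is a minimal, hedged sketch of that idea in Python: it uses scikit-learn's small 8×8 digits dataset as a stand-in for MNIST, and the kernel parameters and the choice of 1-pixel translations are illustrative assumptions rather than the settings reported in the paper.

```python
import numpy as np
from scipy.ndimage import shift
from sklearn import datasets
from sklearn.svm import SVC

# Illustrative stand-in for MNIST: scikit-learn's 8x8 digits.
digits = datasets.load_digits()
X_img, y = digits.images, digits.target
X = X_img.reshape(len(X_img), -1)

# Step 1: train an ordinary SVM and identify its support vectors.
base = SVC(kernel="rbf", gamma=0.001, C=10.0).fit(X, y)   # parameters are assumptions
sv_idx = base.support_

# Step 2: apply known invariance transformations (here: 1-pixel translations)
# to the support vectors only, producing "virtual" support vectors.
shifts = [(1, 0), (-1, 0), (0, 1), (0, -1)]
vX = [shift(X_img[i], s, order=0, cval=0).ravel() for i in sv_idx for s in shifts]
vy = [y[i] for i in sv_idx for _ in shifts]

# Step 3: retrain on the support vectors plus their virtual copies.
X_vsv = np.vstack([X[sv_idx], np.array(vX)])
y_vsv = np.concatenate([y[sv_idx], np.array(vy)])
vsv = SVC(kernel="rbf", gamma=0.001, C=10.0).fit(X_vsv, y_vsv)
```

Retraining only on the support vectors and their virtual copies keeps the augmented problem small; transforming the entire training set would also encode the invariance, but at roughly four times the training-set size in this sketch.

Citations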
Integrating class-dependant tangent vectors into SVMs for handwritten digit recognition
  • H. Nemmour, Y. Chibani
  • Computer Science
  • 2009 3rd International Conference on Signals, Circuits and Systems (SCS)
  • 2009
TLDR
This work investigates the use of tangent vectors extracted a priori from the training data, which improves recognition accuracy and allows a large reduction in runtime.
Handwritten digit classification
TLDR
A comparison between a multivariate and a probabilistic approach is shown, concluding that both methods provide similar results in terms of test-error rate.
DIGIT CLASSIFICATION
Pattern recognition is one of the major challenges in the statistics framework. Its goal is to extract features in order to classify patterns into categories. A well-known example in this field is the…
Tangent distance kernels for support vector machines
  • B. Haasdonk, Daniel Keysers
  • Mathematics, Computer Science
  • Object recognition supported by user interaction for service robots
  • 2002
TLDR
This work introduces a new class of kernels for support vector machines which incorporate tangent distance and therefore are applicable in cases where such transformation invariances are known.
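As a rough illustration of the tangent-distance-in-a-kernel idea (not Haasdonk and Keysers' exact construction), the sketch below substitutes a one-sided tangent distance, with translation tangents approximated by finite differences, into an RBF-shaped kernel and feeds the precomputed Gram matrix to an SVM; the parameter values and the digits dataset are assumptions for illustration only.

```python
import numpy as np
from scipy.ndimage import shift
from sklearn import datasets
from sklearn.svm import SVC

def tangent_vectors(img):
    """Finite-difference tangents of the image w.r.t. horizontal/vertical translation."""
    tx = shift(img, (0, 1), order=1, cval=0) - img
    ty = shift(img, (1, 0), order=1, cval=0) - img
    return np.stack([tx.ravel(), ty.ravel()], axis=1)      # shape (d, 2)

def one_sided_td_sq(x_img, z_img):
    """Squared one-sided tangent distance: min_a ||x + T a - z||^2 via least squares."""
    x, z = x_img.ravel(), z_img.ravel()
    T = tangent_vectors(x_img)
    a, *_ = np.linalg.lstsq(T, z - x, rcond=None)
    r = x + T @ a - z
    return float(r @ r)

def td_rbf_gram(A, B, gamma=0.01):
    """RBF-shaped kernel with tangent distance in place of the Euclidean distance."""
    return np.array([[np.exp(-gamma * one_sided_td_sq(a, b)) for b in B] for a in A])

digits = datasets.load_digits()
imgs, y = digits.images[:200], digits.target[:200]   # small subset: O(n^2) Gram matrix
K = td_rbf_gram(imgs, imgs)
clf = SVC(kernel="precomputed", C=10.0).fit(K, y)
# At prediction time the kernel between test and training images must be precomputed too:
# K_test = td_rbf_gram(test_imgs, imgs); clf.predict(K_test)
```

A kernel built this way is not guaranteed to be positive semi-definite, a caveat discussed by Haasdonk and Keysers; symmetrising the distance or regularising the Gram matrix are common workarounds.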
Homogenized Virtual Support Vector Machines
TLDR
This paper presents a new generalisation of the Support Vector Machine that aims to better incorporate classification invariance with respect to image translations and rotations; it penalises an approximation of the variance of the decision function across each grouped set of "virtual examples".
Invariances in Classification: an efficient SVM implementation
TLDR
This work presents a unifying formulation of the problem of incorporating invariances into a pattern recognition classifier and extends the SimpleSVM algorithm to handle invariance efficiently.
Extract candidates of support vector from training set
  • Yangguang Liu, Qi Chen, R. Yu
  • Computer Science
  • Proceedings of the 2003 International Conference on Machine Learning and Cybernetics (IEEE Cat. No.03EX693)
  • 2003
TLDR
A heuristic method is given to extract candidate support vectors from the training set; training a support vector machine on the extracted candidates achieves good generalization on the test set, showing that the candidates contain almost all of the information necessary to solve a given classification task.
Fast and Accurate Digit Classification
TLDR
This work explores the use of certain image features, blockwise histograms of local orientations, used in many current object recognition algorithms, for the task of handwritten digit recognition, and demonstrates that with improved features a low-complexity classifier, in particular an additive-kernel SVM, can achieve state-of-the-art performance.
Relative Margin Machines
TLDR
A novel formulation is proposed that overcomes a sensitivity of Support Vector Machines by maximizing the margin relative to the spread of the data.
Clustered class-dependant training method for digit recognition classifiers
TLDR
This paper presents a convolutional neural network clustering approach for handwritten digit recognition, and examines various ways of combining such clusters and training their constituent networks.

References

SHOWING 1-10 OF 59 REFERENCES
Incorporating Invariances in Support Vector Learning Machines
TLDR
This work presents a method of incorporating prior knowledge about transformation invariances by applying transformations to support vectors, the training examples most critical for determining the classification boundary.
Improving the Accuracy and Speed of Support Vector Machines
TLDR
This paper combines two techniques on a pattern recognition problem: improving generalization performance by incorporating known invariances of the problem, and improving speed by applying the reduced set method, which is applicable to any support vector machine.
A training algorithm for optimal margin classifiers
A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented. The technique is applicable to a wide variety of classification functions, …
Extracting Support Data for a Given Task
TLDR
It is observed that three different types of handwritten digit classifiers construct their decision surface from strongly overlapping small subsets of the database, which opens up the possibility of compressing databases significantly by disposing of the data that is not important for the solution of a given task.
Handwritten digit recognition with a novel vision model that extracts linearly separable features
  • Loo-Nin Teow, K. Loe
  • Computer Science
  • Proceedings IEEE Conference on Computer Vision and Pattern Recognition. CVPR 2000 (Cat. No.PR00662)
  • 2000
TLDR
It is shown empirically that the features extracted by the model are linearly separable over a large training set (MNIST).
Support-Vector Networks
TLDR
High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
Support vector learning
TLDR
This book provides a comprehensive analysis of what can be done using Support Vector Machines, achieving record results in real-life pattern recognition problems, and proposes a new form of nonlinear Principal Component Analysis using Support Vector kernel techniques, which is considered the most natural and elegant way to generalize classical Principal Component Analysis.
Making large-scale support vector machine learning practical
TLDR
This chapter presents algorithmic and computational results developed for SVMlight V2.0, which make large-scale SVM training more practical and give guidelines for the application of SVMs to large domains.
Distortion-invariant recognition via jittered queries
  • M. Burl
  • Computer Science
  • Proceedings IEEE Conference on Computer Vision and Pattern Recognition. CVPR 2000 (Cat. No.PR00662)
  • 2000
TLDR
This paper presents a new approach for achieving distortion-invariant recognition and classification, where instead of querying with a single pattern, a more robust query is constructed based on the family of patterns formed by distorting the test example.
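A hedged sketch of the jittered-query idea: instead of classifying the test image directly, classify a small family of translated copies and keep the most confident decision. The classifier `base`, the digits images, and the set of 1-pixel shifts are assumptions carried over from the virtual-SV sketch near the top of this page; this illustrates the idea rather than reproducing Burl's implementation.

```python
import numpy as np
from scipy.ndimage import shift

def jittered_predict(clf, img, shifts=((0, 0), (1, 0), (-1, 0), (0, 1), (0, -1))):
    """Query a fitted multiclass SVC with translated copies of `img` (an 8x8 array)
    and return the class of the single most confident one-vs-rest decision value."""
    jitters = np.stack([shift(img, s, order=0, cval=0).ravel() for s in shifts])
    scores = clf.decision_function(jitters)          # shape (n_jitters, n_classes)
    _, best_class = np.unravel_index(np.argmax(scores), scores.shape)
    return clf.classes_[best_class]

# Example usage with the `base` classifier and digits data from the earlier sketch:
# label = jittered_predict(base, digits.images[0])
```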
Efficient Pattern Recognition Using a New Transformation Distance
TLDR
A new distance measure which can be made locally invariant to any set of transformations of the input and can be computed efficiently is proposed.