The generalization error of the symmetric and scaled support vector machines

@article{Feng2001TheGE,
  title={The generalization error of the symmetric and scaled support vector machines},
  author={Jianfeng Feng and Peter Williams},
  journal={IEEE Transactions on Neural Networks},
  year={2001},
  volume={12},
  number={5},
  pages={1255--1260}
}
It is generally believed that the support vector machine (SVM) optimizes the generalization error and outperforms other learning machines. We show analytically, by concrete examples in the one-dimensional case, that the SVM does improve the mean and standard deviation of the generalization error by a constant factor, compared to the worst learning machine. Our approach is in terms of extreme value theory, and both the mean and variance of the generalization errors are calculated exactly for…
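The abstract's comparison can be illustrated with a Monte Carlo sketch (not the paper's exact analysis, which is analytic): for one-dimensional threshold learning, an SVM-like rule places the boundary at the midpoint of the margin between the closest opposite-class training points, while a "worst" consistent rule places it at one edge of the margin. The true boundary at 0.5 and the uniform input distribution below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_errors(n_train=20, n_trials=20000):
    """Estimate mean generalization error (distance of the learned
    boundary from the true boundary 0.5) for the midpoint (SVM-like)
    rule and the worst consistent rule, on 1-D uniform data."""
    svm_err, worst_err = [], []
    for _ in range(n_trials):
        x = rng.uniform(0.0, 1.0, n_train)
        left = x[x < 0.5]    # class -1 training points
        right = x[x >= 0.5]  # class +1 training points
        if len(left) == 0 or len(right) == 0:
            continue  # degenerate sample: one class absent
        lo, hi = left.max(), right.min()  # edges of the margin
        # SVM-like rule: boundary at the middle of the margin.
        svm_err.append(abs((lo + hi) / 2 - 0.5))
        # Worst consistent rule: boundary at the worse margin edge.
        worst_err.append(max(abs(lo - 0.5), abs(hi - 0.5)))
    return np.mean(svm_err), np.mean(worst_err)

svm_mean, worst_mean = sample_errors()
print(f"SVM-like mean error:   {svm_mean:.4f}")
print(f"worst-rule mean error: {worst_mean:.4f}")
```

Consistent with the abstract's claim, the midpoint rule improves the mean error only by a constant factor relative to the worst rule; both shrink at the same rate as the training set grows.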
Bounds on Error Expectation for Support Vector Machines
TLDR
It is proved that the value of the span is always smaller (and can be much smaller) than the diameter of the smallest sphere containing the support vectors, used in previous bounds.
General Scaled Support Vector Machines
TLDR
This paper proposes a simple and efficient method called General Scaled SVM (GS-SVM) to extend the existing approach to the multi-dimensional case; it translates the hyperplane according to the distribution of the data projected onto the normal vector of the hyperplane.
Improving the Performance of the Support Vector Machine: Two Geometrical Scaling Methods
In this chapter, we discuss two possible ways of improving the performance of the SVM, using geometric methods. The first adapts the kernel by magnifying the Riemannian metric in the neighborhood
Support Vector Machines Based on Weighted Scatter Degree
TLDR
A simple but efficient method based on weighted scatter degree (WSD-SVM) is proposed to embed global information into GS-SVM without any distribution assumptions, and the results on several data sets show the advantages of WSD-SVM.
Total margin based adaptive fuzzy support vector machines for multiview face recognition
TLDR
Experimental results indicate that the proposed TAF-SVM is superior to the traditional SVM for multiview face recognition and can achieve smaller error variances than SVM.
Robust relative margin support vector machines
TLDR
This article proposes replacing the maximization of the shortest distance used in the relative margin machine with maximization of the quantile distance; the pinball loss, which is related to quantiles, is used for classification.
Face Recognition Using Total Margin-Based Adaptive Fuzzy Support Vector Machines
TLDR
Experimental results show that the proposed TAF-SVM is superior to SVM in terms of face-recognition accuracy and achieves smaller error variances than SVM over a number of tests, so that better recognition stability is obtained.
Variants of Support Vector Machines
TLDR
This chapter discusses learning paradigms: incremental training, learning using privileged information, semi-supervised learning, multiple classifier systems, multiple kernel learning, and other topics: confidence level and visualization of support vector machines.

References

Showing 1-10 of 20 references
Bounds on Error Expectation for Support Vector Machines
TLDR
It is proved that the value of the span is always smaller (and can be much smaller) than the diameter of the smallest sphere containing the support vectors, used in previous bounds.
Statistical mechanics of Support Vector networks.
TLDR
This work investigates the generalization performance of support vector machines (SVMs), which have been recently introduced as a general alternative to neural networks, and finds that SVMs overfit only weakly.
Generalization errors of the simple perceptron
TLDR
A new approach tackles the problem of finding an exact form of the generalization error of the simple perceptron in the worst-case learning setting, with the help of extreme value theory from statistics.
Introduction to Support Vector Machines
TLDR
Support Vector Machines (SVMs) are intuitive, theoretically well-founded, and have been shown to be practically successful.
Artificial Neural Networks in Control and Optimization
TLDR
This thesis concerns the application of artificial neural networks to solve optimization and dynamical control problems and it is shown that the BPS algorithm can be used in this case to increase the fault tolerance of the neural controller in relation to loss of hidden units.
Knowledge-based analysis of microarray gene expression data by using support vector machines.
TLDR
A method of functionally classifying genes by using gene expression data from DNA microarray hybridization experiments, based on the theory of support vector machines (SVMs), to predict functional roles for uncharacterized yeast ORFs based on their expression data is introduced.
ECG analysis using nonlinear PCA neural networks for ischemia detection
TLDR
The NLPCA techniques are used to classify each segment into one of two classes, normal and abnormal (ST+, ST-, or artifact), and test results show that using only two nonlinear components and a training set of 1000 normal samples from each file produces a correct classification rate.
Behaviors of Spike Output Jitter in the Integrate-and-Fire Model
TLDR
The results suggest that the exponential distribution is the critical case: a faster rate of decrease in the distribution tail, compared to the exponential distribution tail, ensures the convergence of output jitter, whereas slower decay in the distribution tail causes divergence of output jitter.
A comparison of the noise sensitivity of nine QRS detection algorithms
The noise sensitivities of nine different QRS detection algorithms were measured for a normal, single-channel, lead-II, synthesized ECG corrupted with five different types of synthesized noise: