Multiclass SVMs are usually implemented by combining several two-class SVMs. The one-versus-all method with a winner-takes-all strategy and the one-versus-one method implemented by max-wins voting are widely used for this purpose. In this paper we give empirical evidence that these methods are inferior to another one-versus-one method: one that …
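The one-versus-one scheme with max-wins voting mentioned above can be sketched as follows; this is a minimal illustration using scikit-learn's binary `SVC` on the Iris data, not the paper's own implementation, and the kernel choice is an assumption.

```python
# Sketch of one-versus-one multiclass SVM with max-wins voting.
# One binary SVM is trained per pair of classes; at prediction time each
# pairwise classifier casts a vote and the class with the most votes wins.
from itertools import combinations
import numpy as np
from sklearn.datasets import load_iris
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
classes = np.unique(y)

# Train one binary SVM for every pair of classes (kernel is illustrative).
pairwise = {}
for a, b in combinations(classes, 2):
    mask = (y == a) | (y == b)
    pairwise[(a, b)] = SVC(kernel="linear").fit(X[mask], y[mask])

def max_wins_predict(x):
    votes = np.zeros(len(classes))
    for clf in pairwise.values():
        winner = clf.predict(x.reshape(1, -1))[0]
        votes[winner] += 1          # each pairwise classifier casts one vote
    return classes[np.argmax(votes)]

preds = np.array([max_wins_predict(x) for x in X])
accuracy = (preds == y).mean()      # training accuracy of the combined model
```

Note that scikit-learn's `SVC` already applies this one-versus-one decomposition internally; the explicit loop above is only to make the voting visible.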
Choosing optimal hyperparameters for support vector machines is an important step in SVM design. This is usually done by minimizing either an estimate of generalization error or some other related performance measure. In this paper, we empirically study the usefulness of several simple performance measures that are very inexpensive to compute. The results …
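The selection procedure described above can be sketched as a search over a hyperparameter grid, minimizing an estimate of generalization error. Here k-fold cross-validation error stands in for the inexpensive measures the paper studies; the grid, kernel, and dataset are illustrative assumptions.

```python
# Sketch: pick the SVM regularization parameter C by minimizing an estimate
# of generalization error (5-fold cross-validation error as a stand-in).
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

grid = [0.01, 0.1, 1, 10, 100]      # candidate values of C (illustrative)
best_C, best_err = None, np.inf
for C in grid:
    # Estimated error = 1 - mean cross-validated accuracy.
    err = 1 - cross_val_score(SVC(kernel="rbf", C=C), X, y, cv=5).mean()
    if err < best_err:
        best_C, best_err = C, err
```

In practice the interesting question, and the one the paper addresses empirically, is whether cheaper measures than full cross-validation select hyperparameters of comparable quality.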
This paper proposes a new feature selection method that uses a backward elimination procedure similar to that implemented in support vector machine recursive feature elimination (SVM-RFE). Unlike the SVM-RFE method, at each step the proposed approach computes the feature ranking score from a statistical analysis of the weight vectors of multiple linear SVMs …
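For reference, the baseline SVM-RFE procedure the method builds on can be sketched as below: train a linear SVM, rank features by squared weight, and eliminate the lowest-ranked one. This is standard SVM-RFE, not the paper's multi-SVM variant; the synthetic data and target subset size are assumptions.

```python
# Sketch of SVM-RFE backward elimination with a single linear SVM.
# At each step, the feature with the smallest squared weight is dropped.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=20, n_informative=5,
                           random_state=0)
remaining = list(range(X.shape[1]))  # indices of surviving features

while len(remaining) > 5:            # keep 5 features (illustrative target)
    clf = SVC(kernel="linear").fit(X[:, remaining], y)
    w = clf.coef_.ravel()            # weight vector of the linear SVM
    worst = int(np.argmin(w ** 2))   # ranking criterion: smallest w_i^2
    remaining.pop(worst)             # eliminate the lowest-ranked feature

selected = remaining                 # indices of the selected feature subset
```

The paper's variant replaces the single weight vector `w` with a ranking score derived from a statistical analysis of weight vectors from multiple linear SVMs, which the truncated abstract does not fully specify.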
In this paper, we propose a multi-category classification method that combines binary classifiers through a soft-max function. Posterior probabilities are also obtained. Both one-versus-all and one-versus-one classifiers can be used in the combination. Empirical comparison shows that the proposed method is competitive with other implementations of …
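A minimal sketch of the one-versus-all case of such a combination: pass the binary classifiers' decision values through a soft-max to obtain normalized class scores. This is a simplification of the paper's method (the exact combination is behind the truncation), and the untrained soft-max here is an assumption.

```python
# Sketch: combine one-versus-all linear SVM decision values via a soft-max
# to produce posterior-like class probabilities.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.svm import LinearSVC

X, y = load_iris(return_X_y=True)
clf = LinearSVC(max_iter=10000).fit(X, y)  # trains one-vs-rest internally

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # shift for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

scores = clf.decision_function(X)          # (n_samples, n_classes) margins
probs = softmax(scores)                    # each row sums to 1
row_sums = probs.sum(axis=1)
```

The appeal of this construction is that it yields calibratable probability estimates from margin-based binary classifiers without retraining them.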
We studied two cancer classification problems with mass spectrometry data and used SVM-RFE to select a small subset of peaks as input variables for the classification. Our study shows that SVM-RFE can select a good small subset of peaks with which the classifier achieves high prediction accuracy, and the performance is much better than with the feature …