Reducing Multiclass to Binary: A Unifying Approach for Margin Classifiers


We present a unifying framework for studying the solution of multiclass categorization problems by reducing them to multiple binary problems that are then solved using a margin-based binary learning algorithm. The proposed framework unifies some of the most popular approaches in which each class is compared against all others, or in which all pairs of classes are compared to each other, or in which output codes with error-correcting properties are used. We propose a general method for combining the classifiers generated on the binary problems, and we prove a general empirical multiclass loss bound given the empirical loss of the individual binary learning algorithms. The scheme and the corresponding bounds apply to many popular classification learning algorithms including support-vector machines, AdaBoost, regression, logistic regression and decision-tree algorithms. We also give a multiclass generalization error analysis for general output codes with AdaBoost as the binary learner. Experimental results with SVM and AdaBoost show that our scheme provides a viable alternative to the most commonly used multiclass algorithms.
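The coding-matrix reduction described in the abstract can be sketched in a few lines. The example below is a toy NumPy illustration, not the paper's implementation: the one-vs-all coding matrix, the margin values, and Hamming-style decoding are all assumptions chosen for concreteness (the paper's general scheme also covers all-pairs matrices, abstaining entries, and loss-based decoding).

```python
import numpy as np

# One-vs-all coding matrix for k = 4 classes: row r is the vector of
# binary labels that class r induces for the ell = 4 binary learners.
M = np.array([
    [+1, -1, -1, -1],
    [-1, +1, -1, -1],
    [-1, -1, +1, -1],
    [-1, -1, -1, +1],
])

def decode(margins, M):
    """Hamming-style decoding: predict the class whose row of M best
    agrees in sign with the binary learners' margin predictions."""
    # (1 - s * m) / 2 is 0 on agreement and 1 on disagreement.
    d = ((1 - np.sign(margins)[None, :] * M) / 2).sum(axis=1)
    return int(np.argmin(d))

# Hypothetical margins produced by the ell binary classifiers
# on a single test example:
margins = np.array([-0.3, 2.1, -0.7, -1.2])
print(decode(margins, M))  # predicts class 1
```

Loss-based decoding, the combination method proposed in the paper, replaces the sign-agreement count with the binary learner's own margin loss evaluated at each entry of the row, which the paper shows yields the stated empirical loss bounds.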


Cite this paper

@article{Allwein2000ReducingMT,
  title={Reducing Multiclass to Binary: A Unifying Approach for Margin Classifiers},
  author={Erin L. Allwein and Robert E. Schapire and Yoram Singer},
  journal={Journal of Machine Learning Research},
  year={2000},
  volume={1},
  pages={113--141}
}