# Using output codes to boost multiclass learning problems

@inproceedings{Schapire1997UsingOC, title={Using output codes to boost multiclass learning problems}, author={Robert E. Schapire}, booktitle={ICML}, year={1997} }

This paper describes a new technique for solving multiclass learning problems by combining Freund and Schapire's boosting algorithm with the main ideas of Dietterich and Bakiri's method of error-correcting output codes (ECOC). Boosting is a general method of improving the accuracy of a given base or "weak" learning algorithm. ECOC is a robust method of solving multiclass learning problems by reducing to a sequence of two-class problems. We show that our new hybrid method has advantages of…
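As background for the ECOC half of the hybrid, here is a minimal, hypothetical sketch (not the paper's algorithm, and the code matrix and helper names are illustrative): each class gets a codeword row in a code matrix, each column defines a two-class relabeling of the training data, and a vector of binary predictions is decoded to the class whose codeword is nearest in Hamming distance.

```python
# Illustrative ECOC reduction: one row per class, one column per binary
# subproblem. Rows should be well separated in Hamming distance so that a
# few erroneous binary predictions can still decode to the right class.
CODE = [
    [0, 0, 0, 1, 1, 1],
    [0, 1, 1, 0, 0, 1],
    [1, 0, 1, 0, 1, 0],
    [1, 1, 0, 1, 0, 0],
]

def relabel(y, column):
    """Relabel multiclass targets to {0, 1} for one code-matrix column,
    producing the two-class problem a base learner would be trained on."""
    return [CODE[label][column] for label in y]

def decode(bits):
    """Map a predicted bit vector to the class whose codeword is nearest
    in Hamming distance."""
    def hamming(row):
        return sum(b != r for b, r in zip(bits, row))
    return min(range(len(CODE)), key=lambda c: hamming(CODE[c]))
```

Because the codewords above are pairwise at least Hamming distance 3 apart, a single flipped bit in the binary predictions still decodes to the correct class, e.g. `decode([1, 0, 0, 1, 1, 1])` still returns class 0.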

## 319 Citations

Multiclass learning, boosting, and error-correcting codes

- Mathematics, Computer Science
- COLT '99
- 1999

This paper presents AdaBoost.ECC, which, by using a different weighting of the votes of the weak hypotheses, is able to improve on the performance of AdaBoost.OC, and is arguably a more direct reduction of multiclass learning to binary learning problems than previous multiclass boosting algorithms.

Multi-Class Learning by Smoothed Boosting

- Computer Science
- Machine Learning
- 2007

This paper proposes a new boosting algorithm, named “MSmoothBoost”, which introduces a smoothing mechanism into the boosting procedure to explicitly address the overfitting problem with AdaBoost.OC.

Multiclass boosting with repartitioning

- Computer Science
- ICML
- 2006

This paper proposes a new multiclass boosting algorithm that modifies the coding matrix according to the learning ability of the base learner, and shows experimentally that this algorithm is very efficient in optimizing the multiclass margin cost, and outperforms existing multiclass algorithms such as AdaBoost.

A smoothed boosting algorithm using probabilistic output codes

- Computer Science, Mathematics
- ICML
- 2005

A new boosting algorithm is proposed that improves the AdaBoost.OC algorithm for multi-class learning and introduces a probabilistic coding scheme to generate binary codes for multiple classes such that training errors can be efficiently reduced.

Improved Boosting Algorithms Using Confidence-rated Predictions

- Computer Science
- COLT '98
- 1998

We describe several improvements to Freund and Schapire's AdaBoost boosting algorithm, particularly in a setting in which hypotheses may assign confidences to each of their predictions. We give a…


Boosting Multiclass Learning with Repeating Codes

- 2006

A long-standing goal of machine learning is to build a system which can detect a large number of classes with accuracy and efficiency. Some relationships between classes would become a scale-free…

On the Consistency of Output Code Based Learning Algorithms for Multiclass Learning Problems

- Mathematics, Computer Science
- COLT
- 2014

This is the first work that comprehensively studies consistency properties of output code based methods for multiclass learning, and derives general conditions on the binary surrogate loss under which the one-vs-all and all-pairs code matrices yield consistent algorithms with respect to the multiclass 0-1 loss.
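The one-vs-all and all-pairs code matrices mentioned in that snippet can be written down concretely. A small sketch, using the common ±1/0 convention (the helper names here are hypothetical, not from the cited paper):

```python
from itertools import combinations

def one_vs_all(k):
    """k x k code matrix: column j separates class j (+1) from all
    other classes (-1)."""
    return [[1 if i == j else -1 for j in range(k)] for i in range(k)]

def all_pairs(k):
    """k x C(k, 2) code matrix: each column compares one pair of classes
    (+1 vs -1); classes outside the pair get 0, meaning their examples
    are ignored when training that binary problem."""
    pairs = list(combinations(range(k), 2))
    return [[1 if i == a else (-1 if i == b else 0) for (a, b) in pairs]
            for i in range(k)]
```

For k = 3, `one_vs_all` yields the 3×3 identity-like matrix with -1 off the diagonal, while `all_pairs` yields three columns, one per unordered pair of classes.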

Minimal classification method with error-correcting codes for multiclass recognition

- Mathematics, Computer Science
- Int. J. Pattern Recognit. Artif. Intell.
- 2005

In this work, we develop an efficient technique to transform a multiclass recognition problem into a minimal binary classification problem using the Minimal Classification Method (MCM). The MCM…

Multi-class AdaBoost

- Computer Science
- 2009

A new algorithm is proposed that naturally extends the original AdaBoost algorithm to the multiclass case without reducing it to multiple two-class problems; it is extremely easy to implement and is highly competitive with the best currently available multi-class classification methods.

## References

Showing 1–10 of 27 references

Solving Multiclass Learning Problems via Error-Correcting Output Codes

- Computer Science
- J. Artif. Intell. Res.
- 1995

It is demonstrated that error-correcting output codes provide a general-purpose method for improving the performance of inductive learning programs on multiclass problems.

Experiments with a New Boosting Algorithm

- Computer Science
- ICML
- 1996

This paper describes experiments carried out to assess how well AdaBoost, with and without pseudo-loss, performs on real learning problems, and compares boosting to Breiman's "bagging" method when used to aggregate various classifiers.

A decision-theoretic generalization of on-line learning and an application to boosting

- Computer Science
- EuroCOLT
- 1995

The model studied can be interpreted as a broad, abstract extension of the well-studied on-line prediction model to a general decision-theoretic setting, and the multiplicative weight-update Littlestone-Warmuth rule can be adapted to this model, yielding bounds that are slightly weaker in some cases, but applicable to a considerably more general class of learning problems.

A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting

- Computer Science, Mathematics
- COLT 1997
- 1997

The model studied can be interpreted as a broad, abstract extension of the well-studied on-line prediction model to a general decision-theoretic setting, and it is shown that the multiplicative weight-update Littlestone-Warmuth rule can be adapted to this model, yielding bounds that are slightly weaker in some cases, but applicable to a considerably more general class of learning problems.

The strength of weak learnability

- Computer Science
- Mach. Learn.
- 1990

In this paper, it is shown that the two notions of learnability are equivalent, and a method is described for converting a weak learning algorithm into one that achieves arbitrarily high accuracy.

Boosting Decision Trees

- Computer Science
- NIPS
- 1995

A constructive, incremental learning system for regression problems that models data by means of locally linear experts that do not compete for data during learning, and derives asymptotic results for this method.

Bias, Variance, and Arcing Classifiers

- Computer Science
- 1996

This work explores two arcing algorithms, compares them to each other and to bagging, and tries to understand why arcing is more successful than bagging in variance reduction.

Bagging, Boosting, and C4.5

- Computer Science
- AAAI/IAAI, Vol. 1
- 1996

Results of applying Breiman's bagging and Freund and Schapire's boosting to a system that learns decision trees, tested on a representative collection of datasets, show that boosting provides the greater benefit.

Boosting the margin: A new explanation for the effectiveness of voting methods

- Mathematics, Computer Science
- ICML
- 1997

It is shown that techniques used in the analysis of Vapnik's support vector classifiers and of neural networks with small weights can be applied to voting methods to relate the margin distribution to the test error.

C4.5: Programs for Machine Learning

- Computer Science
- 1992

A complete guide to the C4.5 system as implemented in C for the UNIX environment, which starts from simple core learning methods and shows how they can be elaborated and extended to deal with typical problems such as missing data and overfitting.