Corpus ID: 1836349

Experiments with a New Boosting Algorithm

@inproceedings{Freund1996ExperimentsWA,
  title={Experiments with a New Boosting Algorithm},
  author={Yoav Freund and Robert E. Schapire},
  booktitle={International Conference on Machine Learning},
  year={1996}
}
In an earlier paper, we introduced a new "boosting" algorithm called AdaBoost which, theoretically, can be used to significantly reduce the error of any learning algorithm that consistently generates classifiers whose performance is a little better than random guessing. [...] In the second set of experiments, we studied in more detail the performance of boosting using a nearest-neighbor classifier on an OCR problem.
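
For illustration, below is a minimal sketch of the AdaBoost.M1 reweighting loop that the abstract alludes to, written in Python with NumPy. The function name adaboost_m1, the weak_learner callable interface, and the n_rounds default are assumptions made for this sketch, not the authors' implementation.

import numpy as np

def adaboost_m1(X, y, weak_learner, n_rounds=50):
    """Illustrative AdaBoost.M1 loop (sketch, not the authors' code):
    reweight training examples so that later weak hypotheses focus on
    the cases earlier ones got wrong, then combine by weighted vote."""
    n = len(y)
    D = np.full(n, 1.0 / n)              # distribution over training examples
    hypotheses, alphas = [], []
    for _ in range(n_rounds):
        h = weak_learner(X, y, D)        # assumed interface: returns a predictor h(X) -> labels
        pred = h(X)
        eps = np.sum(D[pred != y])       # weighted training error
        if eps >= 0.5:                   # weak-learning assumption violated; stop
            break
        eps = max(eps, 1e-10)            # avoid division by zero for a perfect hypothesis
        beta = eps / (1.0 - eps)
        alpha = np.log(1.0 / beta)       # weight of this hypothesis in the final vote
        D = np.where(pred == y, D * beta, D)   # shrink weight of correctly classified examples
        D /= D.sum()                     # renormalize to a distribution
        hypotheses.append(h)
        alphas.append(alpha)

    labels = np.unique(y)

    def combined(Xq):
        """Weighted majority vote over the stored weak hypotheses."""
        votes = np.zeros((len(Xq), len(labels)))
        for h, a in zip(hypotheses, alphas):
            p = h(Xq)
            for j, lbl in enumerate(labels):
                votes[:, j] += a * (p == lbl)
        return labels[np.argmax(votes, axis=1)]

    return combined

Any base learner that accepts a weighted sample can be plugged into this loop; the experiments summarized above use, among others, a nearest-neighbor classifier on an OCR task.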

A New Boosting Algorithm Using Input-Dependent Regularizer

Empirical studies on eight different UCI data sets and one text categorization data set show that WeightBoost almost always achieves considerably better classification accuracy than AdaBoost, and experiments on data with artificially controlled noise indicate that WeightBoost is more robust to noise than AdaBoost.

Improved Boosting Algorithms Using Confidence-rated Predictions

We describe several improvements to Freund and Schapire's AdaBoost boosting algorithm, particularly in a setting in which hypotheses may assign confidences to each of their predictions.

Boosting Neural Networks

It is suggested that random resampling of the training data is not the main explanation of the success of the improvements brought by AdaBoost, and training methods based on sampling the training set and weighting the cost function are compared.

Stopping Criterion for Boosting-Based Data Reduction Techniques: from Binary to Multiclass Problem

The aim of the present paper is to relax the class constraint, and extend the contribution to multiclass problems, showing the benefits that the boosting-derived weighting rule brings to weighted nearest neighbor classifiers.

Supervised projection approach for boosting classifiers

Quadratic boosting

An Empirical Boosting Scheme for ROC-Based Genetic Programming Classifiers

A geometrical interpretation of the ROC curve is proposed to attribute an error measure to every training case, and boosted Genetic Programming performance is compared with published results on ROC-based Evolution Strategies and Support Vector Machines.

Training Methods for Adaptive Boosting of Neural Networks

This paper uses AdaBoost to improve the performance of neural networks and compares training methods based on sampling the training set and weighting the cost function.

An Empirical Comparison of Voting Classification Algorithms: Bagging, Boosting, and Variants

It is found that Bagging improves when probabilistic estimates are used in conjunction with no-pruning, as well as when the data is backfit, and that Arc-x4 behaves differently from AdaBoost if reweighting is used instead of resampling, indicating a fundamental difference.
...

References

Showing 1-10 of 21 references

Boosting a weak learning algorithm by majority

An algorithm for improving the accuracy of algorithms for learning binary concepts by combining a large number of hypotheses, each of which is generated by training the given learning algorithm on a different set of examples, is presented.

Improving Performance in Neural Networks Using a Boosting Algorithm

The effect of boosting is reported on four databases consisting of 12,000 digits from segmented ZIP codes from the United States Postal Service and the following from the National Institute of Standards and Technology (NIST).

A decision-theoretic generalization of on-line learning and an application to boosting

The model studied can be interpreted as a broad, abstract extension of the well-studied on-line prediction model to a general decision-theoretic setting, and it is shown that the multiplicative weight-update Littlestone-Warmuth rule can be adapted to this model, yielding bounds that are slightly weaker in some cases, but applicable to a considerably more general class of learning problems.

Applying the Weak Learning Framework to Understand and Improve C4.5

This paper performs experiments suggested by the formal results for AdaBoost and C4.5 within the weak learning framework, and argues through experimental results that the theory must be understood in terms of a measure of a boosting algorithm's behavior called its advantage sequence.

Boosting and Other Ensemble Methods

A surprising result is shown for the original boosting algorithm: namely, that as the training set size increases, the training error decreases until it asymptotes to the test error rate.

Bias, Variance, and Arcing Classifiers

This work explores two arcing algorithms, compares them to each other and to bagging, and tries to understand how arcing works and why it is more successful than bagging in variance reduction.

Boosting Decision Trees

A constructive, incremental learning system for regression problems that models data by means of locally linear experts that do not compete for data during learning is presented, and asymptotic results for this method are derived.

C4.5: Programs for Machine Learning

A complete guide to the C4.5 system as implemented in C for the UNIX environment, which starts from simple core learning methods and shows how they can be elaborated and extended to deal with typical problems such as missing data and overfitting.

On the boosting ability of top-down decision tree learning algorithms

This work analyzes the performance of top-down algorithms for decision tree learning and proves that some popular and empirically successful heuristics that are based on first principles meet the criteria of an independently motivated theoretical model.

Boosting Performance in Neural Networks

The boosting algorithm is used to construct an ensemble of neural networks that significantly improves performance (compared to a single network) on optical character recognition (OCR) problems, in some cases dramatically.