# Boosting methodology for regression problems

@inproceedings{Ridgeway1999BoostingMF, title={Boosting methodology for regression problems}, author={G. Ridgeway and D. Madigan and T. Richardson}, booktitle={AISTATS}, year={1999} }

Classification problems have dominated research on boosting to date. The application of boosting to regression problems, on the other hand, has received little investigation. In this paper we develop a new boosting method for regression problems. We cast the regression problem as a classification problem and apply an interpretable form of the boosted naive Bayes classifier. This induces a regression model that we show to be expressible as an additive model for which we derive estimators and…

#### 102 Citations

Boosting Methods for Regression

- Mathematics, Computer Science
- Machine Learning
- 2004

This paper examines ensemble methods for regression that leverage, or "boost," base regressors by iteratively calling them on modified samples, and bounds the complexity of the regression functions produced in order to derive PAC-style bounds on their generalization errors.
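The iterative scheme described in this snippet, calling a base regressor on modified samples at each round, can be sketched minimally in Python. The stump learner, the function names, and the parameters `rounds` and `lr` are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def fit_stump(x, r):
    """Least-squares one-split regression stump on a single feature."""
    best, best_sse = None, np.inf
    for s in np.unique(x)[:-1]:
        left, right = r[x <= s], r[x > s]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best, best_sse = (s, left.mean(), right.mean()), sse
    return best

def stump_predict(stump, x):
    s, lv, rv = stump
    return np.where(x <= s, lv, rv)

def boost(x, y, rounds=100, lr=0.1):
    """Iteratively fit stumps to the residuals of the current ensemble
    (the 'modified sample' at each round), shrinking each fit by lr."""
    pred, stumps = np.zeros_like(y, dtype=float), []
    for _ in range(rounds):
        st = fit_stump(x, y - pred)        # residuals are the modified sample
        pred = pred + lr * stump_predict(st, x)
        stumps.append(st)
    return stumps, pred
```

This is the residual-fitting flavor of boosting for squared error; the PAC-style complexity bounds discussed in the paper are not reflected in the sketch.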

New boosting methods of Gaussian processes for regression

- Computer Science
- Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.
- 2005

Two boosting methods for Gaussian process regression, tailored to the characteristics of GPs, are developed; these methods are found to be more stable and less prone to over-fitting than competing methods.

The Boosting Approach to Machine Learning An Overview

- Computer Science
- 2003

This chapter overviews some of the recent work on boosting, including analyses of AdaBoost's training error and generalization error; boosting's connection to game theory and linear programming; the relationship between boosting and logistic regression; extensions of AdaBoost for multiclass classification problems; methods of incorporating human knowledge into boosting; and experimental and applied work using boosting.

Experiments with AdaBoost.RT, an Improved Boosting Scheme for Regression

- Computer Science, Medicine
- Neural Computation
- 2006

A new boosting algorithm for regression problems, AdaBoost.RT, is described; it requires selecting a suboptimal value of the error threshold that demarcates examples as poorly or well predicted.
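A hedged sketch of the AdaBoost.RT idea summarized above: a threshold `phi` on relative error marks each example as poorly or well predicted, and example weights are updated accordingly. The weighted stump base learner and all parameter choices here are illustrative, and the scheme assumes nonzero targets because it uses relative error.

```python
import numpy as np

def weighted_stump(x, y, w):
    """One-split regression stump minimising weighted squared error."""
    best, best_sse = None, np.inf
    for s in np.unique(x)[:-1]:
        l = x <= s
        r = ~l
        lv = np.sum(w[l] * y[l]) / np.sum(w[l])
        rv = np.sum(w[r] * y[r]) / np.sum(w[r])
        sse = np.sum(w[l] * (y[l] - lv) ** 2) + np.sum(w[r] * (y[r] - rv) ** 2)
        if sse < best_sse:
            best, best_sse = (s, lv, rv), sse
    return best

def adaboost_rt(x, y, phi=0.05, rounds=30, power=2):
    """Sketch of AdaBoost.RT: phi is the error threshold that demarcates
    examples as poorly (relative error > phi) or well predicted.
    Assumes y != 0 everywhere, since relative error is used."""
    m = len(y)
    D = np.full(m, 1.0 / m)                   # example weights
    models, log_inv_betas = [], []
    for _ in range(rounds):
        s, lv, rv = weighted_stump(x, y, D)
        pred = np.where(x <= s, lv, rv)
        are = np.abs(pred - y) / np.abs(y)    # absolute relative error
        eps = D[are > phi].sum()              # total weight of poor examples
        beta = max(eps, 1e-10) ** power       # guard against eps == 0
        D = np.where(are <= phi, D * beta, D) # shrink well-predicted weights
        D = D / D.sum()
        models.append((s, lv, rv))
        log_inv_betas.append(np.log(1.0 / beta))
    return models, np.array(log_inv_betas)

def rt_predict(models, weights, x):
    """Combine the stumps, weighting round t by log(1/beta_t)."""
    preds = np.array([np.where(x <= s, lv, rv) for s, lv, rv in models])
    return weights @ preds / weights.sum()
```

The paper's point is that `phi` (and the power coefficient) must be tuned; too small and almost everything counts as poorly predicted, too large and reweighting has no effect.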

Boosting and instability for regression trees

- Computer Science, Mathematics
- Comput. Stat. Data Anal.
- 2006

An AdaBoost-like algorithm for boosting CART regression trees is considered, and boosting's ability to track outliers and concentrate on hard observations is used to explore a non-standard regression context.

The State of Boosting

- 1999

In many problem domains, combining the predictions of several models often results in a model with improved predictive performance. Boosting is one such method that has shown great promise. On the…

Improving nonparametric regression methods by bagging and boosting

- Mathematics
- 2002

Recently, many authors have proposed new algorithms that improve the accuracy of certain classifiers by assembling a collection of individual classifiers obtained by resampling the training sample.…

Boosting the PLS Algorithm for Regressive Modelling

- Mathematics
- 2006 6th World Congress on Intelligent Control and Automation
- 2006

Boosting algorithms are a class of general methods used to improve the generalization performance of regression analysis. The main idea is to maintain a distribution over the training set. In order to…

A Comparison of Model Aggregation Methods for Regression

- Computer Science
- ICANN
- 2003

Experiments reveal that different types of AdaBoost algorithms require different complexities of base models, and that they outperform Bagging at their best; Bagging, however, achieves a consistent level of success with all base models, providing a robust alternative.

Boosting regression methods based on a geometric conversion approach: Using SVMs base learners

- Mathematics, Computer Science
- Neurocomputing
- 2013

A new approach to extending boosting to regression is proposed that converts a regression sample to a binary classification sample from a geometric point of view, and performs AdaBoost with support vector machine base learners on the converted classification sample.

#### References

SHOWING 1-10 OF 26 REFERENCES

Improving Regressors using Boosting Techniques

- Computer Science
- ICML
- 1997

This work uses regression trees as fundamental building blocks in bagging and boosting committee machines, building a committee of regressors that may be superior to a single regressor.

Special Invited Paper-Additive logistic regression: A statistical view of boosting

- Mathematics
- 2000

Boosting is one of the most important recent developments in classification methodology. Boosting works by sequentially applying a classification algorithm to reweighted versions of the training data…
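The statistical view summarized here admits a compact statement (a sketch of the well-known result, not text from this page): AdaBoost fits an additive model by stagewise minimization of the exponential loss,

```latex
F_M(x) = \sum_{m=1}^{M} c_m f_m(x), \qquad
(c_m, f_m) = \operatorname*{arg\,min}_{c,\,f}\;
\mathbb{E}\!\left[ e^{-y\,\left(F_{m-1}(x) + c\,f(x)\right)} \right],
```

whose population minimizer $F^{*}(x) = \tfrac{1}{2}\log\frac{P(y=1\mid x)}{P(y=-1\mid x)}$ is half the logistic log-odds, which is the connection between boosting and logistic regression.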

Interpretable Boosted Naïve Bayes Classification

- Computer Science
- KDD
- 1998

A variant of the boosted naive Bayes classifier that facilitates explanations while retaining predictive performance is proposed.

A decision-theoretic generalization of on-line learning and an application to boosting

- Computer Science
- EuroCOLT
- 1995

The model studied can be interpreted as a broad, abstract extension of the well-studied on-line prediction model to a general decision-theoretic setting. The multiplicative weight-update rule of Littlestone and Warmuth can be adapted to this model, yielding bounds that are slightly weaker in some cases but applicable to a considerably more general class of learning problems.

Variable selection via Gibbs sampling

- Mathematics
- 1993

Abstract A crucial problem in building a multiple regression model is the selection of predictors to include. The main thrust of this article is to propose and develop a procedure that uses…

Arcing Classifiers

- 1998

Recent work has shown that combining multiple versions of unstable classifiers such as trees or neural nets results in reduced test set error. One of the more effective methods is bagging (Breiman [1996a])…

An Empirical Comparison of Voting Classification Algorithms

- Mathematics
- 1999

Methods for voting classification algorithms, such as Bagging and AdaBoost, have been shown to be very successful in improving the accuracy of certain classifiers for artificial and real-world data...

Projection Pursuit Regression

- Mathematics
- 1981

Abstract A new method for nonparametric multiple regression is presented. The procedure models the regression surface as a sum of general smooth functions of linear combinations of the predictor…
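The model this paper introduces has the additive ridge-function form (standard notation, not taken from this page):

```latex
\hat{f}(x) \;=\; \bar{y} \;+\; \sum_{m=1}^{M} g_m\!\left(\alpha_m^{\top} x\right),
```

where each $g_m$ is a smooth univariate function estimated from the data and each $\alpha_m$ is a unit direction in predictor space.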

Prediction Games and Arcing Algorithms

- Mathematics, Computer Science
- Neural Computation
- 1999

The theory behind the success of adaptive reweighting and combining (arcing) algorithms such as AdaBoost has not been well understood; an explanation of why AdaBoost works is offered in terms of its ability to produce generally high margins.

Slicing Regression: A Link-free Regression Method

- Mathematics, Computer Science
- 1991

Slicing Regression: A Link-Free Regression Method. Naihua Duan and Ker-Chau Li. The Annals of Statistics, Vol. 19, No. 2 (Jun., 1991), pp. 505–53…