Corpus ID: 10978542

Boosting methodology for regression problems

@inproceedings{Ridgeway1999BoostingMF,
  title={Boosting methodology for regression problems},
  author={G. Ridgeway and D. Madigan and T. Richardson},
  booktitle={AISTATS},
  year={1999}
}
Classification problems have dominated research on boosting to date. The application of boosting to regression problems, on the other hand, has received little investigation. In this paper we develop a new boosting method for regression problems. We cast the regression problem as a classification problem and apply an interpretable form of the boosted naive Bayes classifier. This induces a regression model that we show to be expressible as an additive model for which we derive estimators and…
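
The authors' exact reduction is not shown in this snippet, but the flavor of casting regression as classification can be sketched. The Python fragment below is a hypothetical illustration only, not the paper's construction: the quantile cut points, the GaussianNB base classifiers, and the predict_mean helper are all assumptions of this sketch. It thresholds the response at several cut points, fits one classifier per threshold to estimate P(y > c | x), and integrates the resulting survival curve into a conditional-mean estimate.

import numpy as np
from sklearn.naive_bayes import GaussianNB

def fit_threshold_classifiers(X, y, n_cuts=9):
    # Hypothetical helper (not from the paper): cut points at response
    # quantiles; classifier k estimates P(y > cuts[k] | x).
    cuts = np.quantile(y, np.linspace(0.1, 0.9, n_cuts))
    models = [GaussianNB().fit(X, (y > c).astype(int)) for c in cuts]
    return cuts, models

def predict_mean(X, cuts, models, y_min, y_max):
    # Approximate E[y | x] = y_min + integral over t of P(y > t | x),
    # holding the survival curve constant on each inter-cut interval.
    # (No monotonicity is enforced across thresholds; this is only a sketch.)
    surv = np.column_stack([m.predict_proba(X)[:, 1] for m in models])
    edges = np.concatenate(([y_min], cuts, [y_max]))
    probs = np.column_stack([np.ones(len(X)), surv])  # P(y > y_min | x) = 1
    return y_min + probs @ np.diff(edges)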
Citations

Boosting Methods for Regression
TLDR
This paper examines ensemble methods for regression that leverage or “boost” base regressors by iteratively calling them on modified samples, and it bounds the complexity of the regression functions produced in order to derive PAC-style bounds on their generalization error.
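
To make the "iteratively calling base regressors on modified samples" pattern concrete, here is a minimal residual-fitting loop in Python. It is a generic squared-error boosting sketch with assumed shallow-tree base learners and an assumed shrinkage rate, not the specific algorithms or PAC-style analysis of the cited paper.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def boost_regressor(X, y, n_rounds=100, lr=0.1):
    # Start from the constant model, then repeatedly fit a small tree to
    # the current residuals (the "modified sample"), shrinking each step.
    f0 = float(np.mean(y))
    residual = y - f0
    stages = []
    for _ in range(n_rounds):
        h = DecisionTreeRegressor(max_depth=2).fit(X, residual)
        residual = residual - lr * h.predict(X)
        stages.append(h)
    def predict(X_new):
        return f0 + lr * sum(h.predict(X_new) for h in stages)
    return predict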
New boosting methods of Gaussian processes for regression
  • Y. Song, Changshui Zhang
  • Computer Science
  • Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.
  • 2005
TLDR
Two boosting methods for Gaussian process (GP) regression are developed according to the characteristics of GPs, and these methods are found to be more stable and to suffer substantially less from over-fitting than the other methods.
The Boosting Approach to Machine Learning: An Overview
TLDR
This chapter overviews some of the recent work on boosting, including analyses of AdaBoost's training error and generalization error; boosting's connection to game theory and linear programming; the relationship between boosting and logistic regression; extensions of AdaBoost for multiclass classification problems; methods of incorporating human knowledge into boosting; and experimental and applied work using boosting.
Experiments with AdaBoost.RT, an Improved Boosting Scheme for Regression
TLDR
A new boosting algorithm for regression problems, AdaBoost.RT, is described; it requires selecting the value of the error threshold used to demarcate examples as poorly or well predicted.
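
For concreteness, the thresholded reweighting scheme usually stated for AdaBoost.RT can be sketched as below; phi (the error threshold) and n (the power applied to the error rate) are the user-chosen values the summary refers to, while the tree base learner and the round count are assumptions of this sketch.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def adaboost_rt(X, y, phi=0.05, n=2, rounds=20):
    # Assumes nonzero targets, since the error measure is relative.
    m = len(y)
    D = np.full(m, 1.0 / m)                       # distribution over examples
    models, betas = [], []
    for _ in range(rounds):
        h = DecisionTreeRegressor(max_depth=3).fit(X, y, sample_weight=D)
        are = np.abs(h.predict(X) - y) / np.abs(y)  # absolute relative error
        miss = are > phi                          # "poorly predicted" examples
        eps = D[miss].sum()                       # weighted error rate
        if eps <= 0.0 or eps >= 1.0:
            break                                 # degenerate round; stop early
        beta = eps ** n
        D[~miss] = D[~miss] * beta                # downweight well-predicted points
        D = D / D.sum()
        models.append(h)
        betas.append(beta)
    assert betas, "no valid boosting round completed"
    w = np.log(1.0 / np.array(betas))             # per-round confidence
    def predict(X_new):
        return sum(wi * h.predict(X_new) for wi, h in zip(w, models)) / w.sum()
    return predict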
Boosting and instability for regression trees
TLDR
An AdaBoost-like algorithm for boosting CART regression trees is considered, and the ability of boosting to track outliers and to concentrate on hard observations is used to explore a non-standard regression context.
The State of Boosting
In many problem domains, combining the predictions of several models often results in a model with improved predictive performance. Boosting is one such method that has shown great promise. On the …
Improving nonparametric regression methods by bagging and boosting
Recently, many authors have proposed new algorithms to improve the accuracy of certain classifiers by assembling a collection of individual classifiers obtained by resampling the training sample.
Boosting the PLS Algorithm for Regressive Modelling
  • Ling Yu, Tiejun Wu
  • Mathematics
  • 2006 6th World Congress on Intelligent Control and Automation
  • 2006
Boosting algorithms are a class of general methods used to improve the generalization performance of regression analysis. The main idea is to maintain a distribution over the training set. In order to …
A Comparison of Model Aggregation Methods for Regression
TLDR
Experiments reveal that different types of AdaBoost algorithms require different complexities of base models, and they outperform Bagging at their best, but Bagging achieves a consistent level of success with all base models, providing a robust alternative.
Boosting regression methods based on a geometric conversion approach: Using SVMs base learners
TLDR
A new approach to extending boosting to regression is proposed that converts a regression sample to a binary classification sample from a geometric point of view, and performs AdaBoost with a support vector machine base learner on the converted classification sample.

References

Showing 1-10 of 26 references
Improving Regressors using Boosting Techniques
TLDR
This work uses regression trees as fundamental building blocks in bagging committee machines and boosting committee machines to build a committee of regressors that may be superior to a single regressor.
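
scikit-learn's AdaBoostRegressor implements the AdaBoost.R2 algorithm from this paper, so a boosted committee of regression trees can be assembled off the shelf; a minimal sketch follows (the estimator keyword assumes scikit-learn 1.2 or later, where earlier releases named it base_estimator).

from sklearn.ensemble import AdaBoostRegressor
from sklearn.tree import DecisionTreeRegressor

# Committee of boosted regression trees via AdaBoost.R2.
committee = AdaBoostRegressor(
    estimator=DecisionTreeRegressor(max_depth=4),  # the tree building block
    n_estimators=100,
    loss="linear",  # linear loss; "square" and "exponential" also exist
)
# Usage: committee.fit(X_train, y_train); y_hat = committee.predict(X_test)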
Additive logistic regression: A statistical view of boosting (Special Invited Paper)
Boosting is one of the most important recent developments in classification methodology. Boosting works by sequentially applying a classification algorithm to reweighted versions of the training data …
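
The paper's central identity can be stated compactly. With labels y in {-1, +1}, boosting's stagewise additive fit F(x) = sum_m c_m f_m(x) minimizes an exponential criterion whose population minimizer is half the log-odds, which is what ties boosting to logistic regression:

\[
  J(F) \;=\; \mathbb{E}\!\left[e^{-y F(x)}\right],
  \qquad
  F^{*}(x) \;=\; \arg\min_{F} J(F)
  \;=\; \tfrac{1}{2}\,\log\frac{P(y = 1 \mid x)}{P(y = -1 \mid x)}.
\]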
Interpretable Boosted Naïve Bayes Classification
TLDR
A variant of the boosted naive Bayes classifier that facilitates explanations while retaining predictive performance is proposed.
A decision-theoretic generalization of on-line learning and an application to boosting
TLDR
The model studied can be interpreted as a broad, abstract extension of the well-studied on-line prediction model to a general decision-theoretic setting, and the multiplicative weight-update Littlestone-Warmuth rule can be adapted to this model, yielding bounds that are slightly weaker in some cases, but applicable to a considerably more general class of learning problems.
Variable selection via Gibbs sampling
A crucial problem in building a multiple regression model is the selection of predictors to include. The main thrust of this article is to propose and develop a procedure that uses …
Arcing Classifiers
Recent work has shown that combining multiple versions of unstable classifiers such as trees or neural nets results in reduced test set error. One of the more effective methods is bagging (Breiman [1996a]) …
An Empirical Comparison of Voting Classification Algorithms
Methods for voting classification algorithms, such as Bagging and AdaBoost, have been shown to be very successful in improving the accuracy of certain classifiers for artificial and real-world data...
Projection Pursuit Regression
A new method for nonparametric multiple regression is presented. The procedure models the regression surface as a sum of general smooth functions of linear combinations of the predictor …
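
In the usual notation, the model class the snippet describes is a sum of smooth ridge functions of one-dimensional projections of the predictors, fit one term at a time:

\[
  \hat{f}(x) \;=\; \sum_{m=1}^{M} g_m\!\left(a_m^{\top} x\right),
\]

where each \(g_m\) is a general smooth function and each \(a_m\) is a projection direction estimated from the data.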
Prediction Games and Arcing Algorithms
  • L. Breiman
  • Mathematics, Computer Science
  • Neural Computation
  • 1999
TLDR
The theory behind the success of adaptive reweighting and combining algorithms (arcing) such as AdaBoost and others in reducing generalization error has not been well understood; an explanation of why AdaBoost works in terms of its ability to produce generally high margins is offered.
Slicing Regression: A Link-free Regression Method
Naihua Duan and Ker-Chau Li, The Annals of Statistics, Vol. 19, No. 2 (Jun., 1991), pp. 505-530.