Boosting Methods for Regression

@article{Duffy2004BoostingMF,
  title={Boosting Methods for Regression},
  author={Nigel P. Duffy and David P. Helmbold},
  journal={Machine Learning},
  year={2004},
  volume={47},
  pages={153-200}
}
In this paper we examine ensemble methods for regression that leverage or “boost” base regressors by iteratively calling them on modified samples. The most successful leveraging algorithm for classification is AdaBoost, an algorithm that requires only modest assumptions on the base learning method for its strong theoretical guarantees. We present several gradient descent leveraging algorithms for regression and prove AdaBoost-style bounds on their sample errors using intuitive assumptions on…
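To make the general idea concrete, the following is a minimal sketch of gradient descent leveraging for regression under squared loss, not the paper's specific algorithms: each round fits the base regressor to the current residuals (the negative gradient of the squared loss) and adds it to the ensemble with a small step size. The function names, the decision-tree base learner, and the step-size scheme are illustrative assumptions.

```python
# Hypothetical sketch of gradient-descent leveraging for squared loss.
# The base learner and parameter choices are illustrative, not from the paper.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def leverage_regressors(X, y, n_rounds=50, step=0.1, max_depth=2):
    ensemble = []                      # list of (step size, fitted base regressor)
    residual = y.astype(float).copy()  # negative gradient of squared loss
    for _ in range(n_rounds):
        base = DecisionTreeRegressor(max_depth=max_depth).fit(X, residual)
        ensemble.append((step, base))
        residual -= step * base.predict(X)   # move along the descent direction
    return ensemble

def predict(ensemble, X):
    # Combined prediction of the leveraged ensemble.
    return sum(s * b.predict(X) for s, b in ensemble)
```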
Combining Bagging and Additive Regression
An ensemble that averages the predictions of a bagging ensemble and a boosting ensemble, each with 10 sublearners, is built; bagging is found to be much more robust than boosting in noisy settings.
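A minimal sketch of that averaging scheme, assuming scikit-learn's BaggingRegressor and GradientBoostingRegressor as stand-ins for the bagging and additive-regression (boosting) sub-ensembles; the 10-sublearner setting follows the summary above, everything else is an assumption.

```python
# Average the predictions of a bagging ensemble and a boosting ensemble,
# each with 10 sublearners (stand-ins for the cited paper's sub-ensembles).
from sklearn.ensemble import BaggingRegressor, GradientBoostingRegressor

def fit_averaged_ensemble(X, y):
    bagging = BaggingRegressor(n_estimators=10).fit(X, y)
    boosting = GradientBoostingRegressor(n_estimators=10).fit(X, y)
    return bagging, boosting

def predict_averaged(models, X):
    bagging, boosting = models
    return 0.5 * (bagging.predict(X) + boosting.predict(X))
```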
Local Additive Regression of Decision Stumps
A technique of local boosting of decision stumps is proposed; in a comparison with other well-known methods and ensembles on standard benchmark datasets, the proposed technique performed better in most cases.
Ensemble approaches for regression: A survey
Different approaches to each phase of the ensemble learning process that can handle the regression problem are discussed, categorized in terms of their relevant characteristics and linked to contributions from different fields.
Re-scale boosting for regression and classification
A new boosting strategy, re-scale boosting (RBoosting), is developed to accelerate the numerical convergence rate and improve the learning performance of boosting; RBoosting is shown to outperform boosting in terms of generalization.
Using Boosting to prune Double-Bagging ensembles
In this paper, Boosting is used to determine the order in which base predictors are aggregated into a Double-Bagging ensemble, and a subensemble is constructed by early stopping the aggregation…
BooST: Boosting Smooth Trees for Partial Effect Estimation in Nonlinear Regressions
A new machine learning model for nonlinear regression, Boosted Smooth Transition Regression Trees (BooST), combines boosting algorithms with smooth transition regression trees and can provide more interpretation of the mapping between the covariates and the dependent variable than other tree-based models.
Combining Bagging, Boosting and Random Subspace Ensembles for Regression Problems
Bagging, boosting and random subspace methods are well-known re-sampling ensemble methods that generate and combine a diversity of learners using the same learning algorithm for the base regressor…
Rescaled Boosting in Classification
This paper develops a new boosting strategy, rescaled boosting (RBoosting), to accelerate the numerical convergence rate and, consequently, improve the learning performance of the original boosting.
The Boosting Approach to Machine Learning: An Overview
This chapter overviews some of the recent work on boosting, including analyses of AdaBoost's training error and generalization error; boosting's connection to game theory and linear programming; the relationship between boosting and logistic regression; extensions of AdaBoost for multiclass classification problems; methods of incorporating human knowledge into boosting; and experimental and applied work using boosting.
Bagged Averaging of Regression Models
The presented ensemble is compared with other ensembles that use either linear regression or regression trees as the base learner; the performance of the proposed method was better in most cases.

References

Showing 1-10 of 43 references
Boosting methodology for regression problems
This paper develops a new boosting method for regression problems that casts the regression problem as a classification problem and applies an interpretable form of the boosted naive Bayes classifier, which induces a regression model that is shown to be expressible as an additive model. Expand
Improving Regressors using Boosting Techniques
This work uses regression trees as fundamental building blocks in bagging committee machines and boosting committee machines to build a committee of regressors that may be superior to a single regressor. Expand
A Geometric Approach to Leveraging Weak Learners
A new leveraging algorithm based on a natural potential function is introduced; its bounds are incomparable to AdaBoost's, and its empirical performance is similar to AdaBoost's.
Boosting Algorithms as Gradient Descent
Following previous theoretical results bounding the generalization performance of convex combinations of classifiers in terms of general cost functions of the margin, a new algorithm (DOOM II) is presented for performing a gradient descent optimization of such cost functions. Expand
Stochastic gradient boosting
Gradient boosting constructs additive regression models by sequentially fitting a simple parameterized function (base learner) to the current "pseudo"-residuals by least squares at each iteration…
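A sketch of the stochastic variant described above, assuming squared loss so the pseudo-residuals are ordinary residuals: each iteration draws a random subsample, fits a regression tree to the pseudo-residuals on that subsample by least squares, and adds a shrunken copy of it to the model. Parameter names and values are illustrative.

```python
# Stochastic gradient boosting sketch: fit each tree to pseudo-residuals
# computed on a random subsample of the training data.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def stochastic_gradient_boost(X, y, n_iter=100, subsample=0.5, lr=0.1, max_depth=3):
    rng = np.random.default_rng(0)
    init = y.mean()
    f = np.full(len(y), init)              # current model values on the training set
    trees = []
    for _ in range(n_iter):
        residual = y - f                   # pseudo-residuals for squared loss
        idx = rng.choice(len(y), size=int(subsample * len(y)), replace=False)
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X[idx], residual[idx])
        f += lr * tree.predict(X)          # shrunken update on the full sample
        trees.append(tree)
    return init, trees

def sgb_predict(init, trees, X, lr=0.1):
    return init + lr * sum(t.predict(X) for t in trees)
```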
Improved Boosting Algorithms Using Confidence-rated Predictions
We describe several improvements to Freund and Schapire's AdaBoost boosting algorithm, particularly in a setting in which hypotheses may assign confidences to each of their predictions…
Bagging predictors
Tests on real and simulated data sets using classification and regression trees and subset selection in linear regression show that bagging can give substantial gains in accuracy. Expand
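For contrast with the boosting sketches above, a minimal bagging sketch matching this description: fit the same base learner on bootstrap resamples of the data and average the predictions. The regression-tree base learner and the ensemble size are illustrative assumptions.

```python
# Bagging sketch: average regression trees fit on bootstrap resamples.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def bagged_trees(X, y, n_estimators=25, seed=0):
    rng = np.random.default_rng(seed)
    models = []
    for _ in range(n_estimators):
        idx = rng.integers(0, len(y), size=len(y))   # bootstrap sample (with replacement)
        models.append(DecisionTreeRegressor().fit(X[idx], y[idx]))
    return models

def bagged_predict(models, X):
    return np.mean([m.predict(X) for m in models], axis=0)
```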
An Empirical Comparison of Voting Classification Algorithms: Bagging, Boosting, and Variants
It is found that Bagging improves when probabilistic estimates are used in conjunction with no pruning, as well as when the data are backfit, and that Arc-x4 behaves differently from AdaBoost when reweighting is used instead of resampling, indicating a fundamental difference.
Bagging, Boosting, and C4.5
Results of applying Breiman's bagging and Freund and Schapire's boosting to a system that learns decision trees, tested on a representative collection of datasets, show that boosting provides the greater benefit.
Special Invited Paper-Additive logistic regression: A statistical view of boosting
Boosting is one of the most important recent developments in classification methodology. Boosting works by sequentially applying a classification algorithm to reweighted versions of the training data…