Stacked regressions

@article{Breiman1996StackedR,
  title={Stacked regressions},
  author={Leo Breiman},
  journal={Machine Learning},
  year={1996},
  volume={24},
  pages={49--64}
}
Stacking regressions is a method for forming linear combinations of different predictors to give improved prediction accuracy. The idea is to use cross-validation data and least squares under non-negativity constraints to determine the coefficients in the combination. Its effectiveness is demonstrated in stacking regression trees of different sizes and in a simulation stacking linear subset and ridge regressions. Reasons why this method works are explored. The idea of stacking originated with…
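
The procedure the abstract describes is compact enough to sketch. Below is a minimal Python illustration, assuming scikit-learn base learners and SciPy's non-negative least squares solver; the dataset, the particular base models, and the fold count are illustrative choices, not taken from the paper.

    # A minimal sketch of stacked regression: cross-validated predictions
    # from several base regressors are combined with coefficients found by
    # least squares under non-negativity constraints. Dataset, base models,
    # and fold count are illustrative assumptions, not the paper's.
    import numpy as np
    from scipy.optimize import nnls
    from sklearn.datasets import make_regression
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import cross_val_predict
    from sklearn.tree import DecisionTreeRegressor

    X, y = make_regression(n_samples=300, n_features=10, noise=10.0,
                           random_state=0)

    # Base predictors of varying complexity (the paper stacks regression
    # trees of different sizes, and linear subset and ridge regressions).
    base_models = [
        DecisionTreeRegressor(max_depth=2, random_state=0),
        DecisionTreeRegressor(max_depth=5, random_state=0),
        Ridge(alpha=1.0),
    ]

    # Column k holds the 10-fold cross-validated predictions of model k,
    # so the combination weights are not chosen on in-sample fits.
    Z = np.column_stack([cross_val_predict(m, X, y, cv=10)
                         for m in base_models])

    # Least squares under non-negativity constraints:
    # minimize ||y - Z @ alpha||^2 subject to alpha >= 0.
    alpha, _ = nnls(Z, y)

    # Refit every base model on all the data; the stacked predictor is
    # the non-negative linear combination of their predictions.
    for m in base_models:
        m.fit(X, y)
    stacked = sum(a * m.predict(X) for a, m in zip(alpha, base_models))
    print("stacking coefficients:", alpha)

The cross-validation step is essential: fitting the weights on resubstitution predictions would concentrate the weight on the most overfit base model, which is the failure mode the construction is meant to avoid.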

Citations

Publications citing this paper.
Showing 1–10 of 529 citations

  • Scalable Ensemble Learning and Computationally Efficient Variance Estimation (17 excerpts; cites methods & background; highly influenced)

  • Aggregating Density Estimators: An Empirical Study (5 excerpts; cites background; highly influenced)

  • Experiments with AdaBoost.RT, an Improved Boosting Scheme for Regression (12 excerpts; cites methods & background; highly influenced)

  • Learning Sentence-internal Temporal Relations (J. Artif. Intell. Res., 2006; 6 excerpts; cites methods; highly influenced)

  • Model combination by decomposition and aggregation (9 excerpts; cites methods; highly influenced)

  • Stacking Bagged and Dagged Models (21 excerpts; cites methods & background; highly influenced)

  • Combining Estimates in Regression and Classification (10 excerpts; cites methods & background; highly influenced)

Citing publications span 1994–2019.

CITATION STATISTICS

  • 77 highly influenced citations

  • Averaged 44 citations per year from 2017 through 2019