Random Forests
  • L. Breiman
  • Computer Science, Mathematics
  • Machine Learning
  • 1 October 2001
TLDR: The generalization error of a forest of tree classifiers depends on the strength of the individual trees in the forest and the correlation between them.
  • 49,017 citations
  • 3,892 highly influential citations
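The strength/correlation tradeoff in the TLDR is made precise in the paper: writing $s$ for the strength of the individual tree classifiers and $\bar{\rho}$ for the mean correlation between them, Breiman bounds the generalization error $PE^{*}$ of the forest by

$$ PE^{*} \le \frac{\bar{\rho}\,(1 - s^{2})}{s^{2}}, $$

so the error falls as the trees get stronger or less correlated with one another.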
Classification and Regression Trees
TLDR: We use classification and regression trees to model the relationship between some outcome or response and a set of features or explanatory variables.
  • 21,815 citations
  • 1,818 highly influential citations
Bagging predictors
TLDR: Bagging predictors is a method for generating multiple versions of a predictor and using these to get an aggregated predictor.
  • 12,079 citations
  • 1,198 highly influential citations
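The bagging procedure summarized in the TLDR can be sketched in a few lines. This is a toy illustration only — the 1-nearest-neighbour base predictor and the regression data below are hypothetical, not from the paper:

```python
import random
from statistics import mean

def bag_predict(train, x, n_versions=50, seed=0):
    """Bagging: fit one base predictor per bootstrap sample, then average."""
    rng = random.Random(seed)
    preds = []
    for _ in range(n_versions):
        # bootstrap: resample the training set with replacement
        sample = [rng.choice(train) for _ in train]
        # toy base predictor: 1-nearest-neighbour on the bootstrap sample
        nearest = min(sample, key=lambda point: abs(point[0] - x))
        preds.append(nearest[1])
    # aggregate the versions by averaging (the regression case)
    return mean(preds)

data = [(0.0, 0.1), (1.0, 1.2), (2.0, 1.9), (3.0, 3.1), (4.0, 4.0)]
print(bag_predict(data, 2.5))
```

For classification, Breiman aggregates by plurality vote over the versions instead of averaging.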
Classification and Regression Trees (Wadsworth)
  • 1,221 citations
  • 178 highly influential citations
Statistical modeling: The two cultures
There are two cultures in the use of statistical modeling to reach conclusions from data. One assumes that the data are generated by a given stochastic data model. The other uses algorithmic models …
  • 1,459 citations
  • 162 highly influential citations
Bagging Predictors
  • L. Breiman
  • Computer Science
  • Machine Learning
  • 1 August 1996
TLDR: Bagging predictors is a method for generating multiple versions of a predictor and using these to get an aggregated predictor.
  • 6,026 citations
  • 145 highly influential citations
Better subset regression using the nonnegative garrote
A new method, called the nonnegative (nn) garrote, is proposed for doing subset regression. It both shrinks and zeroes coefficients. In tests on real and simulated data, it produces lower prediction …
  • 903 citations
  • 130 highly influential citations
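The "shrinks and zeroes" behaviour in the abstract works roughly as follows (with $\hat{\beta}_k$ the ordinary least-squares estimates): nonnegative scale factors $c_k$ are chosen to minimize

$$ \sum_i \Big( y_i - \sum_k c_k \hat{\beta}_k x_{ki} \Big)^2 \quad \text{subject to } c_k \ge 0,\ \sum_k c_k \le s, $$

and the garrote estimates are $\tilde{\beta}_k = c_k \hat{\beta}_k$. Decreasing $s$ shrinks the $c_k$ toward zero and drives some of them exactly to zero, which is what removes variables from the subset.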
Heuristics of instability and stabilization in model selection
In model selection, usually a best predictor is chosen from a collection {μ(·, s)} of predictors, where μ(·, s) is the minimum least-squares predictor in a collection U_s of predictors. Here s is a …
  • 1,031 citations
  • 108 highly influential citations
Stacked regressions
TLDR: Stacking regressions is a method for forming linear combinations of different predictors to give improved prediction accuracy.
  • 769 citations
  • 86 highly influential citations
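The stacking idea in the TLDR, restricted for illustration to a convex combination of two base predictors: Breiman's paper forms the combination by least squares with nonnegative weights over many predictors, so the two-predictor grid search below is a hypothetical simplification, not the paper's algorithm:

```python
def stack_weights(preds_a, preds_b, y):
    """Pick weights (w, 1 - w) for two base predictors' (cross-validated)
    predictions by minimizing squared error against the responses y."""
    best_w, best_err = 0.0, float("inf")
    for i in range(101):                      # grid over w in [0, 1]
        w = i / 100
        err = sum((w * a + (1 - w) * b - t) ** 2
                  for a, b, t in zip(preds_a, preds_b, y))
        if err < best_err:
            best_w, best_err = w, err
    return best_w, 1 - best_w

# predictor A matches the responses exactly, so it gets all the weight
print(stack_weights([1.0, 2.0, 3.0], [3.0, 2.0, 1.0], [1.0, 2.0, 3.0]))  # → (1.0, 0.0)
```

Constraining the weights to be nonnegative is the key point of the paper: unconstrained least squares can put large opposite-signed weights on correlated predictors and overfit.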