Stochastic Gradient Boosting

@inproceedings{Friedman1999StochasticGB,
  title={Stochastic Gradient Boosting},
  author={Jerome H. Friedman},
  year={1999}
}
Gradient boosting constructs additive regression models by sequentially fitting a simple parameterized function (base learner) to the current "pseudo"-residuals by least squares at each iteration. The pseudo-residuals are the gradient of the loss functional being minimized, with respect to the model values at each training data point, evaluated at the current step. It is shown that both the approximation accuracy and execution speed of gradient boosting can be substantially improved by incorporating…
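The procedure the abstract describes — fit a base learner to the negative-gradient pseudo-residuals at each iteration, with the "stochastic" variant fitting each learner on a random subsample drawn without replacement — can be sketched as follows. This is a minimal illustration for squared-error loss with one-split regression stumps; the names (`fit_stump`, `boost`) and all parameter defaults are illustrative, not from the paper.

```python
import random

def fit_stump(xs, rs):
    """Fit a one-split regression stump to pseudo-residuals rs by least squares."""
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, rs) if x <= t]
        right = [r for x, r in zip(xs, rs) if x > t]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if best is None or sse < best[0]:
            best = (sse, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x <= t else rm

def boost(xs, ys, n_iter=50, lr=0.1, subsample=0.5, seed=0):
    """Stochastic gradient boosting sketch for squared-error loss."""
    rng = random.Random(seed)
    f0 = sum(ys) / len(ys)            # initial constant model
    stumps = []
    n = len(xs)
    m = max(2, int(subsample * n))    # subsample size per iteration

    def predict(x):
        return f0 + lr * sum(s(x) for s in stumps)

    for _ in range(n_iter):
        # Draw a subsample without replacement (the stochastic ingredient).
        idx = rng.sample(range(n), m)
        sub_x = [xs[i] for i in idx]
        # Pseudo-residuals for squared error: negative gradient = y - F(x).
        sub_r = [ys[i] - predict(xs[i]) for i in idx]
        stumps.append(fit_stump(sub_x, sub_r))
    return predict

# Toy usage: learn a step function of x.
xs = [i / 20 for i in range(40)]
ys = [0.0 if x < 1.0 else 1.0 for x in xs]
model = boost(xs, ys)
```

With `subsample=1.0` this reduces to deterministic gradient boosting; the subsampling both cuts the cost of each base-learner fit and, as the paper argues, can improve accuracy by decorrelating successive learners.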
Highly influential: this paper has highly influenced 169 other papers.
Highly cited: this paper has 1,969 citations.

Citations


[Figure: citations per year, 1999–2018]
Semantic Scholar estimates that this publication has 1,969 citations based on the available data.


