Gradient Boosting on Stochastic Data Streams

@inproceedings{Hu2017GradientBO,
  title={Gradient Boosting on Stochastic Data Streams},
  author={Hanzhang Hu and Wen Sun and Arun Venkatraman and Martial Hebert and J. Andrew Bagnell},
  booktitle={AISTATS},
  year={2017}
}
Boosting is a popular ensemble algorithm that generates more powerful learners by linearly combining base models from a simpler hypothesis class. In this work, we investigate the problem of adapting batch gradient boosting, for minimizing convex loss functions, to the online setting, where the loss at each iteration is i.i.d. sampled from an unknown distribution. To generalize from batch to online, we first introduce the definition of an online weak learning edge, with which, for strongly convex and smooth…
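The abstract describes updating a linear combination of weak learners one i.i.d. sample at a time rather than over a full batch. The sketch below is an illustrative toy version of that idea for squared loss, not the paper's algorithm: the `OnlineLinearLearner` base class, the residual-fitting update, and all constants (`lr`, `step`, `n_learners`) are assumptions chosen for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

class OnlineLinearLearner:
    """Hypothetical weak learner: a linear model updated by one SGD step per example."""
    def __init__(self, dim, lr=0.05):
        self.w = np.zeros(dim)
        self.lr = lr

    def predict(self, x):
        return float(self.w @ x)

    def update(self, x, target):
        # One SGD step on squared error between prediction and boosting target.
        self.w -= self.lr * (self.predict(x) - target) * x

class OnlineGradientBoost:
    """Toy online gradient boosting for squared loss.

    Each incoming example acts as an i.i.d. sample of the loss; every weak
    learner is stepped toward the negative functional gradient (here, the
    residual) of the partial ensemble built from the learners before it.
    """
    def __init__(self, dim, n_learners=5, step=0.5):
        self.learners = [OnlineLinearLearner(dim) for _ in range(n_learners)]
        self.step = step

    def predict(self, x):
        return self.step * sum(h.predict(x) for h in self.learners)

    def update(self, x, y):
        partial = 0.0
        for h in self.learners:
            residual = y - partial          # -grad of 0.5 * (partial - y)^2
            h.update(x, residual)
            partial += self.step * h.predict(x)

# Simulated stream: i.i.d. examples from a fixed linear target function.
model = OnlineGradientBoost(dim=3)
true_w = np.array([1.0, -2.0, 0.5])
losses = []
for _ in range(2000):
    x = rng.normal(size=3)
    y = float(true_w @ x)
    losses.append(0.5 * (model.predict(x) - y) ** 2)  # loss before the update
    model.update(x, y)

early = float(np.mean(losses[:200]))
late = float(np.mean(losses[-200:]))
```

Because each learner fits only the residual left by the prefix of the ensemble, the running average loss drops as the stream progresses, mirroring the regret-style guarantees the paper targets for strongly convex, smooth losses.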

Citations

Publications citing this paper.

Online Multiclass Boosting

