Gradient boosting for kernelized output spaces

@inproceedings{Geurts2007GradientBF,
  title={Gradient boosting for kernelized output spaces},
  author={Pierre Geurts and Louis Wehenkel and Florence d'Alch{\'e}-Buc},
  booktitle={ICML},
  year={2007}
}
A general framework is proposed for gradient boosting in supervised learning problems where the loss function is defined using a kernel over the output space. It extends boosting in a principled way to complex output spaces (images, text, graphs, etc.) and can be applied to a general class of base learners working in kernelized output spaces. Empirical results are provided on three problems: a regression problem, an image completion task, and a graph prediction problem. In these experiments, the…
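
The abstract only sketches the framework, so the following is a minimal, hypothetical Python illustration of the core idea under a strong simplifying assumption: the output kernel is taken to be linear with an explicit (identity) feature map, so boosting the kernel-induced squared loss reduces to fitting base regression trees to feature-space residuals, with a nearest-neighbour pre-image search over the training outputs to decode a prediction. This is not the authors' algorithm, which works with base learners operating directly in kernelized output spaces without an explicit feature map; all function and variable names below are illustrative.

# Sketch of gradient boosting with a squared loss defined in the output
# feature space (assumption: linear output kernel, identity feature map).
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_kernel_boosting(X, Y, n_stages=50, learning_rate=0.1, max_depth=3):
    """With loss ||phi(y) - F(x)||^2, the negative gradient at each stage is
    phi(y) - F(x), so each stage fits base learners to the current residuals
    in feature space (one regression tree per coordinate, for simplicity)."""
    n, d_out = Y.shape
    F = np.tile(Y.mean(axis=0), (n, 1))   # constant initial model
    stages = []
    for _ in range(n_stages):
        residual = Y - F                   # negative gradient of the squared loss
        trees = []
        for j in range(d_out):
            t = DecisionTreeRegressor(max_depth=max_depth)
            t.fit(X, residual[:, j])
            trees.append(t)
        update = np.column_stack([t.predict(X) for t in trees])
        F += learning_rate * update
        stages.append(trees)
    return Y.mean(axis=0), stages, learning_rate

def predict_feature_space(model, X):
    """Accumulate the stage-wise updates to get predictions in feature space."""
    y0, stages, lr = model
    F = np.tile(y0, (X.shape[0], 1))
    for trees in stages:
        F += lr * np.column_stack([t.predict(X) for t in trees])
    return F

def decode_preimage(F, candidate_outputs):
    """Pre-image step: return the candidate output whose feature-space image is
    closest to the predicted point (Euclidean distance, valid for a linear kernel)."""
    d2 = ((F[:, None, :] - candidate_outputs[None, :, :]) ** 2).sum(axis=-1)
    return candidate_outputs[d2.argmin(axis=1)]

# Toy usage: outputs represented by 5-dimensional feature vectors.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
Y = np.tanh(X[:, :5]) + 0.1 * rng.normal(size=(200, 5))
model = fit_kernel_boosting(X, Y)
Y_hat = decode_preimage(predict_feature_space(model, X), Y)

With a nonlinear output kernel the residuals live in an implicit feature space, so the actual method instead relies on base learners that can be trained from the output kernel matrix alone (e.g., output-kernelized trees) and solves a genuine pre-image problem at prediction time.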


Citations (showing 2 of 14)

GBDT-MO: Gradient Boosted Decision Trees for Multiple Outputs

Zhendong Zhang, Cheolkon Jung. arXiv, 2019.

Output Fisher embedding regression

Moussab Djerrab, Alexandre Rossetto Garcia, Maxime Sangnier, Florence d'Alché-Buc. Machine Learning, 2018.
