Boosting Algorithms as Gradient Descent in Function Space

Llew Mason, Jonathan Baxter, Peter Bartlett, Marcus Frean
Much recent attention, both experimental and theoretical, has been focussed on classification algorithms which produce voted combinations of classifiers. Recent theoretical work has shown that the impressive generalization performance of algorithms like AdaBoost can be attributed to the classifier having large margins on the training data. We present abstract algorithms for finding linear and convex combinations of functions that minimize arbitrary cost functionals (i.e. functionals that do not…
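The view sketched in the abstract — boosting as gradient descent in function space — can be illustrated with a minimal, hedged example. The sketch below assumes the exponential margin cost C(F) = Σᵢ exp(−yᵢ F(xᵢ)) and axis-aligned decision stumps as the weak-learner class; the names `anyboost` and `fit_stump` are illustrative, and the fixed step size is a simplification of the line searches discussed in this line of work, not the paper's exact procedure.

```python
import numpy as np

def fit_stump(X, y, w):
    """Pick the stump (feature, threshold, sign) maximizing the weighted
    correlation sum_i w_i * y_i * f(x_i), i.e. the inner product between the
    candidate weak learner f and the negative functional gradient of the cost."""
    best, best_score = None, -np.inf
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1.0, -1.0):
                pred = np.where(X[:, j] <= t, s, -s)
                score = np.sum(w * y * pred)
                if score > best_score:
                    best_score, best = score, (j, t, s)
    j, t, s = best
    return lambda Z: np.where(Z[:, j] <= t, s, -s)

def anyboost(X, y, rounds=20, step=0.5):
    """Gradient descent in function space: at each round, move the combined
    function F along the weak learner best aligned with -grad C(F)."""
    F = np.zeros(len(y))          # current combined function, evaluated at x_i
    learners = []
    for _ in range(rounds):
        # For C(F) = sum_i exp(-y_i F(x_i)), the negative gradient at x_i is
        # y_i * exp(-y_i F(x_i)); the weights below carry the magnitude.
        w = np.exp(-y * F)
        f = fit_stump(X, y, w)
        learners.append((step, f))
        F += step * f(X)          # descent step in function space
    return learners

def predict(learners, X):
    F = sum(a * f(X) for a, f in learners)
    return np.sign(F)
```

With this particular cost, choosing the weak learner that maximizes the inner product with the negative gradient reduces to weighted classification with weights exp(−yᵢ F(xᵢ)), which is how AdaBoost emerges as a special case of the abstract scheme.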