AdaTree: Boosting a Weak Classifier into a Decision Tree

@article{Grossmann2004AdaTreeBA,
  title={AdaTree: Boosting a Weak Classifier into a Decision Tree},
  author={Etienne Grossmann},
  journal={2004 Conference on Computer Vision and Pattern Recognition Workshop},
  year={2004},
  pages={105--105}
}
We present a boosting method that results in a decision tree rather than a fixed linear sequence of classifiers. An equally correct statement is that we present a tree-growing method whose performance can be analysed in the framework of Adaboost. We argue that Adaboost can be improved by presenting the input to a sequence of weak classifiers, each one tuned to the conditional probability determined by the outputs of the previous weak classifiers. As a result, the final classifier has a tree…
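The idea described in the abstract can be sketched as follows: grow a tree whose internal nodes are weak classifiers (here, simple decision stumps), where each child node is trained only on the samples its parent routed to it, i.e. on the conditional distribution induced by the parent's output. This is an illustrative reconstruction of the concept, not the paper's exact algorithm (which derives the tree within the Adaboost framework); all function names are hypothetical.

```python
# Sketch: a tree of decision stumps in the spirit of AdaTree.
# Each subtree is fit only on the samples its parent routed to it.
import numpy as np

def fit_stump(X, y):
    """Exhaustively pick the (feature, threshold, polarity) stump
    with the lowest 0/1 error on this node's samples."""
    best = None
    for f in range(X.shape[1]):
        vals = np.unique(X[:, f])
        for t in (vals[:-1] + vals[1:]) / 2.0:  # midpoint thresholds
            for pol in (1, -1):
                pred = np.where(pol * (X[:, f] - t) >= 0, 1, -1)
                err = np.mean(pred != y)
                if best is None or err < best[0]:
                    best = (err, f, t, pol)
    return None if best is None else best[1:]

def majority(y):
    """Majority label among {+1, -1}, ties broken toward +1."""
    return 1 if np.sum(y == 1) >= np.sum(y == -1) else -1

def grow(X, y, depth=2):
    """Recursively grow the tree; each child sees only the conditional
    sample distribution determined by its parent's output."""
    stump = fit_stump(X, y) if depth > 0 and len(np.unique(y)) > 1 else None
    if stump is None:
        return {"leaf": majority(y)}
    f, t, pol = stump
    route = pol * (X[:, f] - t) >= 0
    node = {"stump": stump}
    for side, mask in (("pos", route), ("neg", ~route)):
        node[side] = grow(X[mask], y[mask], depth - 1) if mask.any() \
            else {"leaf": majority(y)}
    return node

def predict_one(node, x):
    """Follow the weak classifiers' outputs down to a leaf label."""
    while "leaf" not in node:
        f, t, pol = node["stump"]
        node = node["pos"] if pol * (x[f] - t) >= 0 else node["neg"]
    return node["leaf"]
```

As a usage example, XOR-labelled points `[[0,0],[0,1],[1,0],[1,1]]` with labels `[-1,1,1,-1]` cannot be separated by any single stump, but a depth-2 tree grown this way classifies all four correctly, illustrating why conditioning later weak classifiers on earlier outputs is more powerful than a fixed linear sequence.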


Citations

Publications citing this paper (4 of 29 shown):

Fast AdaBoost training using weighted novelty selection
  • The 2011 International Joint Conference on Neural Networks, 2011
  • Cites methods; highly influenced

Probabilistic boosting-tree: learning discriminative models for classification, recognition, and clustering
  • Tenth IEEE International Conference on Computer Vision (ICCV'05) Volume 1, 2005
  • Cites background and methods; highly influenced

Semantic driven hierarchical learning for energy-efficient image classification
  • Design, Automation & Test in Europe Conference & Exhibition (DATE), 2017
  • Cites methods

A new method for solving overfitting problem of gentle AdaBoost
  • International Conference on Graphic and Image Processing, 2014
  • Cites background