Process Consistency for Adaboost

Wenxin Jiang
Recent experiments and theoretical studies show that AdaBoost can overfit in the limit of large time. If running the algorithm forever is suboptimal, a natural question is how low the prediction error can go during the process of AdaBoost. We show under general regularity conditions that during the process of AdaBoost a consistent prediction is generated, whose prediction error approaches the optimal Bayes error as the sample size increases. This result suggests that, while running…
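The abstract's point is that AdaBoost run forever can overfit, but a prediction taken at some finite round during the process can be consistent. A minimal sketch of the standard AdaBoost procedure with decision stumps, where the round cap `T` plays the role of stopping the process early, might look as follows (the helper names are hypothetical, not from the paper):

```python
import numpy as np

def stump_train(X, y, w):
    """Find the weighted-error-minimizing threshold stump (feature, threshold, polarity)."""
    best = (0, 0.0, 1, np.inf)
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                err = np.sum(w[pred != y])
                if err < best[3]:
                    best = (j, thr, pol, err)
    return best

def stump_predict(X, stump):
    j, thr, pol, _ = stump
    return np.where(pol * (X[:, j] - thr) >= 0, 1, -1)

def adaboost(X, y, T):
    """Run AdaBoost for at most T rounds; labels y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)            # uniform initial example weights
    ensemble = []
    for _ in range(T):
        stump = stump_train(X, y, w)
        pred = stump_predict(X, stump)
        err = max(np.sum(w[pred != y]), 1e-12)
        if err >= 0.5:                 # weak-learning condition fails; stop
            break
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * pred) # upweight misclassified examples
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def predict(X, ensemble):
    """Sign of the weighted vote of all stumps accumulated so far."""
    F = sum(a * stump_predict(X, s) for a, s in ensemble)
    return np.where(F >= 0, 1, -1)
```

In this framing, the paper's question concerns the quality of `predict` when `T` is chosen appropriately relative to the sample size, rather than letting the number of rounds grow without bound.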
