Process Consistency for Adaboost

@inproceedings{Jiang2000ProcessCF,
  title={Process Consistency for Adaboost},
  author={Wenxin Jiang},
  year={2000}
}
Recent experiments and theoretical studies show that AdaBoost can overfit in the limit of large time. If running the algorithm forever is suboptimal, a natural question is how low the prediction error can be during the process of AdaBoost. We show under general regularity conditions that during the process of AdaBoost a consistent prediction is generated, whose prediction error approaches the optimal Bayes error as the sample size increases. This result suggests that, while running…
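
The abstract's point — that the error at some intermediate round of AdaBoost can approach the Bayes error, so truncating the process is a reasonable regularization — can be illustrated with a small sketch. This is not the paper's construction or proof; it is a generic AdaBoost run with a hypothetical hold-out-based truncation rule, using scikit-learn, and the data set and parameters are arbitrary choices for illustration.

```python
# Sketch (not the paper's method): run AdaBoost and truncate the process
# at the boosting round with the lowest hold-out error, illustrating that
# stopping early can beat running the algorithm "forever".
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.5, random_state=0
)

# Default weak learner is a depth-1 decision tree ("stump").
clf = AdaBoostClassifier(n_estimators=500, random_state=0)
clf.fit(X_train, y_train)

# staged_predict yields the prediction after each boosting round, i.e. the
# sequence of classifiers generated "during the process" of AdaBoost.
val_errors = [(preds != y_val).mean() for preds in clf.staged_predict(X_val)]
best_round = min(range(len(val_errors)), key=val_errors.__getitem__) + 1
print(f"lowest hold-out error {val_errors[best_round - 1]:.3f} "
      f"at round {best_round} of {clf.n_estimators}")
```

In this toy setting the hold-out error typically bottoms out well before the last round, which is the behavior the truncation idea exploits.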
