Random classification noise defeats all convex potential boosters

@article{Long2008RandomCN,
  title={Random classification noise defeats all convex potential boosters},
  author={Philip M. Long and Rocco A. Servedio},
  journal={Machine Learning},
  year={2008},
  volume={78},
  pages={287-304}
}
A broad class of boosting algorithms can be interpreted as performing coordinate-wise gradient descent to minimize some potential function of the margins of a data set. This class includes AdaBoost, LogitBoost, and other widely used and well-studied boosters. In this paper we show that for a broad class of convex potential functions, any such boosting algorithm is highly susceptible to random classification noise. We do this by showing that for any such booster and any nonzero random…
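
The abstract describes boosters that greedily minimize a convex potential of the margins via coordinate-wise descent. Below is a minimal Python sketch of that setting, assuming NumPy is available: an AdaBoost-style exponential potential phi(z) = exp(-z), decision stumps as the base class, and labels flipped with probability eta to simulate random classification noise. The names (potential_boost, stump_predict, eta) and the toy data set are illustrative assumptions, not the paper's construction.

# Minimal sketch (not the paper's hard distribution): coordinate-wise gradient
# descent on a convex potential of the margins, using AdaBoost's exponential
# potential phi(z) = exp(-z) and threshold stumps as base classifiers.
import numpy as np

def stumps(X):
    # Enumerate threshold stumps h(x) = +1 if x[j] >= t else -1.
    return [(j, t) for j in range(X.shape[1]) for t in np.unique(X[:, j])]

def stump_predict(h, X):
    j, t = h
    return np.where(X[:, j] >= t, 1.0, -1.0)

def potential_boost(X, y, phi_grad, rounds=50):
    # Each round: reweight examples by the slope of the potential at their
    # margins, pick the stump with the largest weighted correlation with the
    # labels, and take the exact coordinate step for the exponential potential.
    F = np.zeros(len(y))
    ensemble, hs = [], stumps(X)
    for _ in range(rounds):
        w = -phi_grad(y * F)              # example weights from the potential's slope
        w = w / w.sum()
        best = max(hs, key=lambda h: abs(np.dot(w, y * stump_predict(h, X))))
        pred = stump_predict(best, X)
        edge = np.clip(np.dot(w, y * pred), -0.999, 0.999)
        if abs(edge) < 1e-12:
            break
        alpha = 0.5 * np.log((1 + edge) / (1 - edge))  # AdaBoost-style step size
        F += alpha * pred
        ensemble.append((alpha, best))
    return ensemble, F

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.sign(X[:, 0])                  # clean labels from a simple threshold concept
    eta = 0.2                             # random classification noise rate
    y_noisy = y * np.where(rng.random(len(y)) < eta, -1, 1)
    _, F = potential_boost(X, y_noisy, phi_grad=lambda z: -np.exp(-z))
    print("training error on clean labels:", np.mean(np.sign(F) != y))

Running the sketch with eta = 0 versus eta = 0.2 shows how the exponential weights concentrate on flipped examples; the paper's impossibility result concerns a specifically constructed noisy distribution, not this toy data.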
165 Citations
Noise peeling methods to improve boosting algorithms
Learning with Noisy Labels
Cost-Sensitive Learning with Noisy Labels
Soft-max boosting (M. Geist, Machine Learning, 2015)
On the Error Resistance of Hinge Loss Minimization
Loss factorization, weakly supervised learning and label noise robustness
Direct 0-1 Loss Minimization and Margin Maximization with Boosting
Adaptive Martingale Boosting

References

Some Infinity Theory for Predictor Ensembles
A Geometric Approach to Leveraging Weak Learners
MadaBoost: A Modification of AdaBoost
Potential boosters? In Advances in Neural Information Processing Systems (NIPS), pages 258–264, 1999