Corpus ID: 212628319

No Regret Sample Selection with Noisy Labels

@article{Mitsuo2020NoRS,
  title={No Regret Sample Selection with Noisy Labels},
  author={N. Mitsuo and S. Uchida and D. Suehiro},
  journal={ArXiv},
  year={2020},
  volume={abs/2003.03179}
}
  • N. Mitsuo, S. Uchida, D. Suehiro
  • Published 2020
  • Computer Science, Mathematics
  • ArXiv
  • Deep Neural Networks (DNNs) suffer from noisy labeled data because of the risk of overfitting. To avoid this risk, this paper proposes a novel sample selection framework for learning from noisy labeled samples. The core idea is to employ a "regret" minimization approach: the proposed method adaptively selects a subset of the noisy labeled training samples so as to minimize the regret of selecting noisy samples. The algorithm is efficient and comes with theoretical support. Moreover…
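
The abstract above frames sample selection as regret minimization: in the usual online-learning sense, the cumulative loss of the subsets selected over training rounds should not be much worse than that of the best fixed subset in hindsight. Because the abstract is truncated and does not spell out the paper's update rule, the Python sketch below is only a simplified stand-in that keeps the lowest-loss samples each round, a common heuristic in noisy-label sample selection; the function name, the `keep_ratio` parameter, and the selection rule are assumptions, not the paper's algorithm.

```python
import numpy as np

def select_subset(per_sample_losses, keep_ratio=0.7):
    """Return indices of the samples with the smallest current loss.

    per_sample_losses: 1-D array of per-sample training losses for this round.
    keep_ratio: assumed hyperparameter, the fraction of samples to keep.
    """
    n_keep = int(round(len(per_sample_losses) * keep_ratio))
    # Samples with a large loss are more likely to carry a wrong label, so drop them.
    return np.argsort(per_sample_losses)[:n_keep]

# Usage: after computing per-sample losses, train the DNN only on the selection.
losses = np.array([0.12, 2.3, 0.08, 1.9, 0.4])
print(select_subset(losses, keep_ratio=0.6))  # -> [2 0 4]
```

In an actual training loop, the per-sample losses would be recomputed each epoch and the subset re-selected, so the selection adapts as the network's loss estimates change.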
