Corpus ID: 9963515

Toward Robustness against Label Noise in Training Deep Discriminative Neural Networks

@inproceedings{Vahdat2017TowardRA,
  title={Toward Robustness against Label Noise in Training Deep Discriminative Neural Networks},
  author={Arash Vahdat},
  booktitle={NIPS},
  year={2017}
}
  • Arash Vahdat
  • Published in NIPS 2017
  • Computer Science, Mathematics
  • Collecting large training datasets, annotated with high-quality labels, is costly and time-consuming. This paper proposes a novel framework for training deep convolutional neural networks from noisy labeled datasets that can be obtained cheaply. The problem is formulated using an undirected graphical model that represents the relationship between noisy and clean labels, trained in a semi-supervised setting. In our formulation, the inference over latent clean labels is tractable and is…
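
The abstract describes coupling the observed noisy labels to latent clean labels through a probabilistic model and inferring the clean labels during training. As a rough illustration only, the toy sketch below (assuming PyTorch; the linear layer stands in for a deep CNN, T is a hypothetical, known noise-transition matrix with T[i, j] = p(noisy = j | clean = i), and the data is synthetic) trains a classifier through such a noisy/clean coupling and then recovers a Bayes-rule posterior over the latent clean labels. It is closer in spirit to the loss-correction approach cited under References than to Vahdat's undirected graphical model or its semi-supervised formulation.

import torch
import torch.nn as nn

torch.manual_seed(0)
num_classes, dim, n = 3, 2, 600

# Synthetic 2-D data; labels are flipped with 30% symmetric noise.
means = torch.tensor([[0.0, 3.0], [3.0, 0.0], [-3.0, -3.0]])
clean = torch.randint(num_classes, (n,))
x = means[clean] + torch.randn(n, dim)
T = 0.7 * torch.eye(num_classes) + 0.15 * (1 - torch.eye(num_classes))
noisy = torch.multinomial(T[clean], 1).squeeze(1)

model = nn.Linear(dim, num_classes)        # stand-in for a deep CNN
opt = torch.optim.SGD(model.parameters(), lr=0.1)

for _ in range(300):
    p_clean = model(x).softmax(dim=1)      # belief over latent clean labels
    p_noisy = p_clean @ T                  # implied distribution of the observed noisy labels
    loss = nn.functional.nll_loss(p_noisy.clamp_min(1e-8).log(), noisy)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Bayes-rule posterior over the clean label given the observed noisy label,
# loosely analogous to the paper's inference over latent clean labels.
with torch.no_grad():
    p_clean = model(x).softmax(dim=1)
    joint = p_clean * T[:, noisy].T        # p(clean = i) * p(noisy_obs | clean = i)
    posterior = joint / joint.sum(dim=1, keepdim=True)
    recovery = (posterior.argmax(dim=1) == clean).float().mean().item()
    print(f"recovered clean labels on {recovery:.0%} of the training set")

The closing block recovers p(clean | observed noisy label) by Bayes' rule, which plays the role that tractable inference over latent clean labels plays in the paper's formulation; the actual model, training objective, and CRF structure in the paper differ.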

    Citations

    Publications citing this paper.
    SHOWING 1-10 OF 113 CITATIONS

    Joint Optimization Framework for Learning with Noisy Labels

    CITES BACKGROUND & METHODS
    HIGHLY INFLUENCED

    Learning to Learn From Noisy Labeled Data

    CITES BACKGROUND, METHODS & RESULTS
    HIGHLY INFLUENCED

    Iterative Learning with Open-set Noisy Labels

    CITES BACKGROUND & METHODS
    HIGHLY INFLUENCED

    A Simple yet Effective Baseline for Robust Deep Learning with Noisy Labels

    CITES BACKGROUND

    Deep Self-Learning From Noisy Labels

    CITES METHODS

    Probabilistic End-To-End Noise Correction for Learning With Noisy Labels

    • Kun Yi, Jianxin Wu
    • Computer Science
    • 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
    • 2019
    CITES BACKGROUND & METHODS

    Collaborative learning with corrupted labels

    A Semi-Supervised Two-Stage Approach to Learning from Noisy Labels

    CITES BACKGROUND & METHODS

    CITATION STATISTICS

    • 11 Highly Influenced Citations

    • Averaged 37 Citations per year from 2018 through 2020

    References

    Publications referenced by this paper.
    SHOWING 1-10 OF 40 REFERENCES

    Learning from massive noisy labeled data for image classification


    Making Deep Neural Networks Robust to Label Noise: A Loss Correction Approach

    HIGHLY INFLUENTIAL

    Learning from Noisy Large-Scale Datasets with Minimal Supervision


    Conditional Random Fields as Recurrent Neural Networks


    Efficient Piecewise Training of Deep Structured Models for Semantic Segmentation
