Extended T: Learning with Mixed Closed-set and Open-set Noisy Labels

@article{Xia2020ExtendedTL,
  title={Extended T: Learning with Mixed Closed-set and Open-set Noisy Labels},
  author={Xiaobo Xia and Tongliang Liu and Bo Han and Nannan Wang and Jiankang Deng and Jiatong Li and Yinian Mao},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
  year={2020},
  volume={PP}
}
The noise transition matrix, reflecting the probabilities that true labels flip into noisy ones, is of vital importance for modeling label noise and building statistically consistent classifiers. The traditional transition matrix is limited to modeling closed-set label noise, where noisy training data have true class labels within the noisy label set. It is unsuitable for modeling open-set label noise, where some true class labels lie outside the noisy label set. Therefore…
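To make the closed-set versus open-set distinction concrete, here is a minimal numpy sketch (an illustration only, not the paper's exact parameterization; the class count and all matrix values are hypothetical) contrasting a standard transition matrix with an extended one that appends an extra row for an out-of-distribution true class.

```python
import numpy as np

# A closed-set transition matrix T is C x C: T[i, j] = P(noisy label = j | true label = i).
C = 3
T_closed = np.array([
    [0.8, 0.1, 0.1],
    [0.1, 0.8, 0.1],
    [0.1, 0.1, 0.8],
])
assert np.allclose(T_closed.sum(axis=1), 1.0)   # each row is a distribution over noisy labels

# Open-set noise: some true classes lie outside the C observed label classes. One way to
# picture an "extended" transition matrix is to append an extra row for the out-of-distribution
# true class, which can only flip into the C known labels (values here are hypothetical).
T_open_row = np.array([[0.4, 0.3, 0.3]])        # P(noisy label = j | true class is open-set)
T_extended = np.vstack([T_closed, T_open_row])  # shape (C + 1) x C
assert np.allclose(T_extended.sum(axis=1), 1.0)

# With priors over the C + 1 true classes, the marginal distribution of noisy labels follows:
prior = np.array([0.3, 0.3, 0.2, 0.2])          # last entry: proportion of open-set examples
p_noisy = prior @ T_extended                    # P(noisy label = j)
print(p_noisy)
```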


Open-set Label Noise Can Improve Robustness Against Inherent Label Noise

This work empirically shows that open-set noisy labels can be non-toxic and can even benefit robustness against inherent noisy labels, and proposes a simple yet effective regularization that introduces Open-set samples with Dynamic Noisy Labels (ODNL) into training.

A Second-Order Approach to Learning with Instance-Dependent Label Noise

This work proposes and studies the potential of a second-order approach that leverages the estimation of several covariance terms defined between the instance-dependent noise rates and the Bayes optimal label, and shows that this set of second-order statistics successfully captures the induced imbalances.

Tackling Instance-Dependent Label Noise with Dynamic Distribution Calibration

This paper hypothesizes that, before training data are corrupted by label noise, each class conforms to a multivariate Gaussian distribution at the feature level, and proposes two methods based on the mean and the covariance of the multivariate Gaussian distribution, respectively.

Class2Simi: A Noise Reduction Perspective on Learning with Noisy Labels

This paper proposes a framework called Class2Simi, which transforms data points with noisy class labels into data pairs with noisy similarity labels, where a similarity label denotes whether a pair shares the class label or not, and changes the loss computation on top of the model prediction into a pairwise form.
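As a concrete illustration of the Class2Simi transformation, the sketch below converts a small batch of (possibly noisy) class labels into pairwise similarity labels; the all-pairs construction is a simplification, not the paper's exact pairing scheme or loss.

```python
import itertools
import numpy as np

def class_to_simi(labels):
    """Turn class labels into binary similarity labels over all index pairs."""
    pairs, simi = [], []
    for i, j in itertools.combinations(range(len(labels)), 2):
        pairs.append((i, j))
        simi.append(int(labels[i] == labels[j]))  # 1 if the pair shares a class label
    return np.array(pairs), np.array(simi)

noisy_labels = [0, 2, 0, 1]
pairs, simi = class_to_simi(noisy_labels)
print(pairs)  # [[0 1] [0 2] [0 3] [1 2] [1 3] [2 3]]
print(simi)   # [0 1 0 0 0 0]
```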

Centrality and Consistency: Two-Stage Clean Samples Identification for Learning with Instance-Dependent Noisy Labels

This work proposes a two-stage clean-sample identification method: a class-level feature clustering procedure first identifies clean samples that are near the class-wise prediction centers, and the class imbalance problem is then addressed by aggregating rare classes according to their prediction entropy.

To Smooth or Not? When Label Smoothing Meets Noisy Labels

Among other established properties, this work theoretically shows that negative label smoothing (NLS) is more beneficial when label noise rates are high, and provides an understanding of the properties of label smoothing (LS) and NLS when learning with noisy labels.
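For reference, both variants can be written with a single smoothing formula in which the smooth rate r is positive for LS and negative for NLS; the sketch below is illustrative, and the values of r and the class count are hypothetical.

```python
import numpy as np

def smoothed_target(label, num_classes, r):
    """Smoothed target: (1 - r) * one_hot + r / num_classes.
    r > 0 gives standard label smoothing (LS); r < 0 gives negative label smoothing (NLS)."""
    one_hot = np.eye(num_classes)[label]
    return (1.0 - r) * one_hot + r / num_classes

print(smoothed_target(2, 4, r=0.2))   # LS:  [0.05 0.05 0.85 0.05]
print(smoothed_target(2, 4, r=-0.2))  # NLS: [-0.05 -0.05 1.15 -0.05]
```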

Constrained Instance and Class Reweighting for Robust Learning under Label Noise

This work proposes a principled approach for tackling label noise with the aim of assigning importance weights to individual instances and class labels by formulating a class of constrained optimization problems that yield simple closed-form updates for these importance weights.

Clusterability as an Alternative to Anchor Points When Learning with Noisy Labels

This paper discovers an efficient estimation procedure based on a clusterability condition and proves that with clusterable representations of features, using up to third-order consensuses of noisy labels among neighbor representations is sufficient to estimate a unique transition matrix.

A Good Representation Detects Noisy Labels

Given good representations, which are commonly available in practice, this paper proposes a universally applicable and training-free solution to detect noisy labels, theoretically analyzes how noisy labels affect local voting, and provides guidelines for tuning the neighborhood size.
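A minimal sketch of the local-voting idea follows: flag a label as likely noisy when it disagrees with the majority label of its nearest neighbors in representation space. The random features, labels, and neighborhood size k below are placeholders, and the paper's theoretical analysis and tuning guidelines are not reproduced here.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def flag_noisy_by_voting(features, labels, k=10):
    """Flag example i as suspicious if its label disagrees with its k-NN majority label."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(features)
    _, idx = nn.kneighbors(features)                  # first neighbor is the point itself
    flags = []
    for i, neighbors in enumerate(idx[:, 1:]):
        votes = np.bincount(labels[neighbors])
        flags.append(labels[i] != votes.argmax())     # disagreement with neighborhood majority
    return np.array(flags)

rng = np.random.default_rng(0)
feats = rng.normal(size=(100, 8))                     # placeholder representations
labs = rng.integers(0, 3, size=100)                   # placeholder noisy labels
print(flag_noisy_by_voting(feats, labs, k=5).mean())  # fraction of labels flagged
```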

Robust early-learning: Hindering the memorization of noisy labels

The memorization effects of deep networks show that they first memorize training data with clean labels and then those with noisy labels, so early stopping can be exploited to hinder the memorization of noisy labels.

References

Showing 1-10 of 83 references

EvidentialMix: Learning with Combined Open-set and Closed-set Noisy Labels

A new variant of the noisy-label problem that combines open-set and closed-set noisy labels is studied, and a novel algorithm, called EvidentialMix, is proposed that addresses this problem and is compared with state-of-the-art methods for both closed-set and open-set noise on the proposed benchmark.

Meta Transition Adaptation for Robust Deep Learning with Noisy Labels

Through the sound guidance of a small set of meta-data with clean labels, the noise transition matrix and the classifier parameters can be mutually ameliorated to avoid being trapped by noisy training samples, without the need for any anchor-point assumptions.

Dual T: Reducing Estimation Error for Transition Matrix in Label-noise Learning

This paper introduces an intermediate class to avoid directly estimating the noisy class posterior, and proposes the dual-T estimator for estimating transition matrices, leading to better classification performance.
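The composition idea can be pictured as writing the transition matrix as a product of two factors routed through an intermediate class, each of which is easier to estimate than the matrix itself; the sketch below is illustrative, with hypothetical matrix values rather than the paper's estimators.

```python
import numpy as np

# P(intermediate = m | clean = i): hypothetical first factor
T_clean_to_mid = np.array([[0.90, 0.05, 0.05],
                           [0.05, 0.90, 0.05],
                           [0.05, 0.05, 0.90]])
# P(noisy = j | intermediate = m): hypothetical second factor
T_mid_to_noisy = np.array([[0.85, 0.10, 0.05],
                           [0.10, 0.85, 0.05],
                           [0.05, 0.10, 0.85]])
# Composed estimate of P(noisy = j | clean = i)
T_est = T_clean_to_mid @ T_mid_to_noisy
assert np.allclose(T_est.sum(axis=1), 1.0)
print(T_est)
```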

Are Anchor Points Really Indispensable in Label-Noise Learning?

Empirical results on benchmark-simulated and real-world label-noise datasets demonstrate that without using exact anchor points, the proposed method is superior to the state-of-the-art label- noise learning methods.

Masking: A New Perspective of Noisy Supervision

A human-assisted approach called Masking is proposed that conveys human cognition of invalid class transitions and naturally speculates the structure of the noise transition matrix and can improve the robustness of classifiers significantly.
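A heavily simplified way to picture the masking idea (not the paper's actual approach; the mask and matrix values below are hypothetical) is to zero out transition-matrix entries that a human deems impossible and renormalize the rows.

```python
import numpy as np

mask = np.array([[1, 1, 0],
                 [1, 1, 1],
                 [0, 1, 1]])                        # 0 = "this class flip cannot happen"
T_unconstrained = np.full((3, 3), 1.0 / 3.0)        # an uninformed transition matrix
T_masked = T_unconstrained * mask                   # rule out invalid transitions
T_masked /= T_masked.sum(axis=1, keepdims=True)     # re-normalize each row to a distribution
print(T_masked)
```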

A Survey of Label-noise Representation Learning: Past, Present and Future

A formal definition of Label-Noise Representation Learning is clarified from the perspective of machine learning, and the reason why noisy labels affect the performance of deep models is explained.

Beyond Synthetic Noise: Deep Learning on Controlled Noisy Labels

This paper establishes the first benchmark of controlled real-world label noise from the web, and conducts the largest study by far into understanding deep neural networks trained on noisy labels across different noise levels, noise types, network architectures, and training settings.

Symmetric Cross Entropy for Robust Learning With Noisy Labels

The proposed Symmetric cross entropy Learning (SL) approach simultaneously addresses both the under-learning and overfitting problems of cross entropy (CE) in the presence of noisy labels, and SL is empirically shown to outperform state-of-the-art methods.
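The SL objective is commonly written as a weighted sum of cross entropy and a reverse cross entropy term in which the roles of prediction and target are swapped and log 0 is clamped to a constant; the sketch below is illustrative, with alpha, beta, and the clamp value A as hypothetical hyperparameters.

```python
import numpy as np

def symmetric_cross_entropy(probs, one_hot, alpha=0.1, beta=1.0, A=-4.0):
    """Illustrative SL loss for a single example: alpha * CE + beta * reverse CE."""
    eps = 1e-12
    ce = -np.sum(one_hot * np.log(probs + eps))                    # standard cross entropy
    log_targets = np.where(one_hot > 0, np.log(one_hot + eps), A)  # clamp log(0) to A
    rce = -np.sum(probs * log_targets)                             # reverse cross entropy
    return alpha * ce + beta * rce

probs = np.array([0.7, 0.2, 0.1])     # model's predicted class probabilities
one_hot = np.array([1.0, 0.0, 0.0])   # (possibly noisy) target
print(symmetric_cross_entropy(probs, one_hot))
```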

Class2Simi: A New Perspective on Learning with Label Noise

A new perspective on dealing with label noise, called Class2Simi, is proposed, which achieves remarkably better classification accuracy than baselines that directly deal with the noisy class labels.

Learning with Bounded Instance- and Label-dependent Label Noise

This paper introduces the concept of distilled examples, i.e., examples whose labels are identical to the labels assigned to them by the Bayes optimal classifier, and proves that under certain conditions a classifier learnt on distilled examples will converge to the Bayes optimal classifier.
...