# Provably End-to-end Label-Noise Learning without Anchor Points

@inproceedings{Li2021ProvablyEL, title={Provably End-to-end Label-Noise Learning without Anchor Points}, author={Xuefeng Li and Tongliang Liu and Bo Han and Gang Niu and Masashi Sugiyama}, booktitle={International Conference on Machine Learning}, year={2021} }

In label-noise learning, the transition matrix plays a key role in building statistically consistent classifiers. Existing consistent estimators for the transition matrix have been developed by exploiting anchor points. However, the anchor-point assumption is not always satisfied in real scenarios. In this paper, we propose an end-to-end framework for solving label-noise learning without anchor points, in which we simultaneously optimize two objectives: the cross-entropy loss between the noisy…
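
The transition matrix typically enters training through loss correction. As a hedged illustration of that general idea (the paper's actual end-to-end objective is only partially quoted above, and all names below are illustrative, not the paper's), a forward-corrected cross-entropy under a known matrix `T` could look like:

```python
import numpy as np

def forward_corrected_ce(probs, noisy_labels, T):
    """Cross-entropy against noisy labels after pushing clean-class
    predictions through T, where T[i, j] = P(noisy label j | clean label i)."""
    noisy_probs = probs @ T                     # predicted noisy-label posteriors
    eps = 1e-12                                 # numerical floor for the log
    picked = noisy_probs[np.arange(len(noisy_labels)), noisy_labels]
    return -np.mean(np.log(picked + eps))

# Sanity check: with an identity T (no noise), this is plain cross-entropy.
probs = np.array([[0.9, 0.1], [0.2, 0.8]])
labels = np.array([0, 1])
loss_clean = forward_corrected_ce(probs, labels, np.eye(2))
```

With a non-identity `T`, the same call trains the classifier to match the clean posterior even though supervision comes from noisy labels.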

## 43 Citations

### Clusterability as an Alternative to Anchor Points When Learning with Noisy Labels

- Computer Science, ICML
- 2021

This paper discovers an efficient estimation procedure based on a clusterability condition and proves that with clusterable representations of features, using up to third-order consensuses of noisy labels among neighbor representations is sufficient to estimate a unique transition matrix.

### Learning Noise Transition Matrix from Only Noisy Labels via Total Variation Regularization

- Computer Science, ICML
- 2021

This work proposes total variation regularization, which encourages the predicted probabilities to be more distinguishable from each other, and shows the effectiveness of the proposed method through experiments on benchmark and real-world datasets.
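
The regularizer described above is built on the total variation distance between predicted probability vectors; as a rough sketch of that underlying quantity (the paper's exact regularization term is not reproduced here), it is half the L1 distance:

```python
import numpy as np

def total_variation(p, q):
    """Total variation distance between two discrete distributions:
    half the L1 distance between their probability vectors."""
    return 0.5 * np.abs(p - q).sum()

# Two confident but opposite predictions are far apart in TV distance;
# a regularizer rewarding such separation keeps predictions distinguishable.
p = np.array([0.9, 0.1])
q = np.array([0.1, 0.9])
tv = total_variation(p, q)
```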

### Instance-dependent Label-noise Learning under a Structural Causal Model

- Computer Science, NeurIPS
- 2021

This paper proposes a novel generative approach for instance-dependent label-noise learning by leveraging a structural causal model and shows that properly modelling the instances will contribute to the identifiability of the label noise transition matrix and thus lead to a better classifier.

### Confidence Scores Make Instance-dependent Label-noise Learning Possible

- Computer Science, ICML
- 2021

Confidence-scored instance-dependent noise (CSIDN) is introduced, where each instance-label pair is associated with a confidence score; the confidence scores are shown to be sufficient to estimate the noise functions of each instance with minimal assumptions.

### Class2Simi: A Noise Reduction Perspective on Learning with Noisy Labels

- Computer Science, ICML
- 2021

This paper proposes a framework called Class2Simi, which transforms data points with noisy class labels into data pairs with noisy similarity labels, where a similarity label denotes whether a pair shares the class label or not, and accordingly changes the loss computation on top of model predictions into a pairwise manner.
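
The transformation itself is mechanically simple; a minimal sketch, with illustrative names of my own, of turning noisy class labels into pairs with noisy similarity labels:

```python
from itertools import combinations

def to_similarity_pairs(labels):
    """Turn (possibly noisy) class labels into pairs with similarity labels:
    a pair is labeled 1 if both examples carry the same class label, else 0."""
    return [((i, j), int(labels[i] == labels[j]))
            for i, j in combinations(range(len(labels)), 2)]

pairs = to_similarity_pairs([0, 0, 1])
```

Because a similarity label only records agreement, flipping one class label corrupts fewer bits of supervision than the original label noise, which is the noise-reduction perspective the title refers to.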

### Estimating Instance-dependent Label-noise Transition Matrix using DNNs

- Computer Science, ArXiv
- 2021

This paper directly models the transition from Bayes optimal labels to noisy labels, learns a Bayes optimal label classifier, and estimates the Bayes label transition matrix by employing a deep neural network in a parameterized way, leading to better generalization and superior classification performance.

### A Second-Order Approach to Learning with Instance-Dependent Label Noise

- Computer Science, 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
- 2021

This work proposes and studies the potential of a second-order approach that leverages the estimation of several covariance terms defined between the instance-dependent noise rates and the Bayes optimal label, and shows that this set of second-order statistics successfully captures the induced imbalances.

### Instance Correction for Learning with Open-set Noisy Labels

- Computer Science, ArXiv
- 2021

It is shown that the discarded data can contain meaningful information for generalization, and instance correction is used to modify the instances of the discarded data so that the predictions for the discarded data become consistent with the given labels.

### Sample Selection with Uncertainty of Losses for Learning with Noisy Labels

- Computer Science, ICLR
- 2022

In this paper, the uncertainty of losses is incorporated by adopting interval estimation instead of point estimation of losses: lower bounds of the confidence intervals of losses, derived from distribution-free concentration inequalities, are used for sample selection rather than the losses themselves.
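
A hedged sketch of such a selection statistic, assuming a Hoeffding-style concentration inequality for losses bounded in [0, 1] (the paper's exact inequality and names may differ):

```python
import numpy as np

def loss_lower_bound(losses, delta=0.05, loss_range=1.0):
    """Lower end of a distribution-free confidence interval for the mean
    loss, via a Hoeffding-style bound for losses in [0, loss_range]."""
    n = len(losses)
    half_width = loss_range * np.sqrt(np.log(1.0 / delta) / (2.0 * n))
    return float(np.mean(losses)) - half_width

# Selecting by this bound favors examples whose loss is both small and
# estimated from enough observations to be trusted.
lb = loss_lower_bound(np.full(100, 0.5))
```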

### A Good Representation Detects Noisy Labels

- Computer Science, ArXiv
- 2021

Given good representations that are commonly available in practice, this paper proposes a universally applicable and training-free solution to detect noisy labels, theoretically analyzes how the representations affect local voting, and provides guidelines for tuning the neighborhood size.
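
A minimal, illustrative sketch of label-noise detection by local voting over neighbors in representation space (function and parameter names are assumptions of mine, not the paper's):

```python
import numpy as np

def flag_noisy_by_voting(features, labels, k=3):
    """Flag an example whose label disagrees with the majority label
    among its k nearest neighbors in the representation space."""
    labels = np.asarray(labels)
    flags = []
    for i, x in enumerate(features):
        dist = np.linalg.norm(features - x, axis=1)
        dist[i] = np.inf                      # never vote for yourself
        neighbors = np.argsort(dist)[:k]
        votes = np.bincount(labels[neighbors], minlength=labels.max() + 1)
        flags.append(int(np.argmax(votes) != labels[i]))
    return np.array(flags)

# Two well-separated clusters; the third point's label disagrees with its cluster.
features = np.array([[0.0], [0.1], [0.2], [5.0], [5.1], [5.2]])
labels = np.array([0, 0, 1, 1, 1, 1])
flags = flag_noisy_by_voting(features, labels, k=2)
```

The neighborhood size `k` is exactly the tuning knob the abstract says the paper provides guidelines for.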

## References

Showing 1–10 of 56 references

### Are Anchor Points Really Indispensable in Label-Noise Learning?

- Computer Science, NeurIPS
- 2019

Empirical results on benchmark-simulated and real-world label-noise datasets demonstrate that, without using exact anchor points, the proposed method is superior to state-of-the-art label-noise learning methods.

### Clusterability as an Alternative to Anchor Points When Learning with Noisy Labels

- Computer Science, ICML
- 2021

This paper discovers an efficient estimation procedure based on a clusterability condition and proves that with clusterable representations of features, using up to third-order consensuses of noisy labels among neighbor representations is sufficient to estimate a unique transition matrix.

### Learning Noise Transition Matrix from Only Noisy Labels via Total Variation Regularization

- Computer Science, ICML
- 2021

This work proposes total variation regularization, which encourages the predicted probabilities to be more distinguishable from each other, and shows the effectiveness of the proposed method through experiments on benchmark and real-world datasets.

### Dual T: Reducing Estimation Error for Transition Matrix in Label-noise Learning

- Computer Science, NeurIPS
- 2020

This paper introduces an intermediate class to avoid directly estimating the noisy class posterior, and proposes the dual $T$-estimator for estimating transition matrices, leading to better classification performance.

### Confidence Scores Make Instance-dependent Label-noise Learning Possible

- Computer Science, ICML
- 2021

Confidence-scored instance-dependent noise (CSIDN) is introduced, where each instance-label pair is associated with a confidence score; the confidence scores are shown to be sufficient to estimate the noise functions of each instance with minimal assumptions.

### Classification with Noisy Labels by Importance Reweighting

- Computer Science, IEEE Transactions on Pattern Analysis and Machine Intelligence
- 2016

It is proved that any surrogate loss function can be used for classification with noisy labels by using importance reweighting, with consistency assurance that the label noise does not ultimately hinder the search for the optimal classifier of the noise-free sample.
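
The reweighting idea can be sketched in one line: each example's surrogate loss is scaled by an estimated density ratio before averaging (the names below are illustrative, and estimating the weights is the hard part the paper addresses):

```python
import numpy as np

def reweighted_risk(losses, weights):
    """Importance-reweighted empirical risk: each example's surrogate loss
    is scaled by beta(x, y), an estimate of the ratio between the clean and
    noisy joint densities at that example."""
    return float(np.mean(np.asarray(weights) * np.asarray(losses)))

risk = reweighted_risk([1.0, 2.0], [0.5, 1.0])
```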

### L_DMI: A Novel Information-theoretic Loss Function for Training Deep Nets Robust to Label Noise

- Computer Science, NeurIPS
- 2019

A novel information-theoretic loss function, L_DMI, is proposed, which is the first loss function that is provably robust to instance-independent label noise regardless of noise pattern, and which can be applied to any existing classification neural network straightforwardly without any auxiliary information.
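
L_DMI is based on the determinant of the empirical joint distribution matrix between labels and model predictions; a sketch under the assumption that this is the exact normalization used (treat the details as illustrative):

```python
import numpy as np

def dmi_loss(one_hot_labels, probs):
    """L_DMI sketch: negative log |determinant| of the empirical joint
    distribution matrix between (noisy) labels and model predictions."""
    n = len(one_hot_labels)
    joint = one_hot_labels.T @ probs / n      # C x C joint matrix
    return -np.log(np.abs(np.linalg.det(joint)))

# The loss is smaller when predictions carry more information about labels.
good = dmi_loss(np.eye(2), np.eye(2))
worse = dmi_loss(np.eye(2), np.array([[0.6, 0.4], [0.5, 0.5]]))
```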

### A Second-Order Approach to Learning with Instance-Dependent Label Noise

- Computer Science, 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
- 2021

This work proposes and studies the potential of a second-order approach that leverages the estimation of several covariance terms defined between the instance-dependent noise rates and the Bayes optimal label, and shows that this set of second-order statistics successfully captures the induced imbalances.

### Label-Noise Robust Domain Adaptation

- Computer Science, ICML
- 2020

This paper is the first to comprehensively investigate how label noise could adversely affect existing domain adaptation methods in various scenarios and theoretically prove that there exists a method that can essentially reduce the side-effect of noisy source labels in domain adaptation.

### SELF: Learning to Filter Noisy Labels with Self-Ensembling

- Computer Science, ICLR
- 2020

This work presents a simple and effective method, self-ensemble label filtering (SELF), to progressively filter out wrong labels during training; it substantially outperforms all previous works on noise-aware learning across different datasets and can be applied to a broad set of network architectures.