Semi-Supervised Semantic Segmentation Using Unreliable Pseudo-Labels

@article{Wang2022SemiSupervisedSS,
  title={Semi-Supervised Semantic Segmentation Using Unreliable Pseudo-Labels},
  author={Yuchao Wang and Haochen Wang and Yujun Shen and Jingjing Fei and Wei Li and Guoqiang Jin and Liwei Wu and Rui Zhao and Xinyi Le},
  journal={2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2022},
  pages={4238--4247}
}
  • Published 8 March 2022
  • Computer Science
The crux of semi-supervised semantic segmentation is assigning adequate pseudo-labels to the pixels of unlabeled images. A common practice is to select highly confident predictions as the pseudo ground-truth, but this leaves most pixels unused because their predictions are unreliable. We argue that every pixel matters to model training, even if its prediction is ambiguous. Intuitively, an unreliable prediction may get confused among the top classes (i.e., those with the…
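The idea in the abstract can be sketched in code. The following is an illustrative NumPy sketch, not the paper's implementation: pixels whose top confidence exceeds a threshold get a hard pseudo-label, while for unreliable pixels only the classes *outside* the top-k are treated as unlikely (candidates for negative supervision). The function name, threshold `tau`, and `k` are assumptions for illustration.

```python
import numpy as np

def split_pseudo_labels(probs, tau=0.95, k=3):
    """Split per-pixel softmax predictions into reliable pseudo-labels
    and, for unreliable pixels, a mask of 'unlikely' classes.

    probs: (H, W, C) softmax probabilities for one unlabeled image.
    tau:   confidence threshold for accepting a hard pseudo-label.
    k:     an ambiguous pixel may plausibly belong to its top-k classes.
    """
    conf = probs.max(axis=-1)            # per-pixel confidence
    labels = probs.argmax(axis=-1)       # hard pseudo-labels
    reliable = conf >= tau               # pixels trusted as pseudo ground-truth

    # For unreliable pixels, the prediction is ambiguous among the top-k
    # classes, but classes outside the top-k are unlikely and can still
    # provide (negative) supervision rather than being discarded.
    topk = np.argsort(probs, axis=-1)[..., -k:]
    unlikely = np.ones(probs.shape, dtype=bool)
    np.put_along_axis(unlikely, topk, False, axis=-1)
    unlikely[reliable] = False           # negatives only for unreliable pixels
    return labels, reliable, unlikely
```

In this framing no pixel is left unused: reliable pixels supervise directly, and unreliable pixels contribute through the classes they almost certainly do not belong to.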

Multi-View Correlation Consistency for Semi-Supervised Semantic Segmentation

TLDR
This paper proposes multi-view correlation consistency (MVCC) learning: it considers rich pairwise relationships in self-correlation matrices and matches them across views to provide robust supervision and proposes a view-coherent data augmentation strategy that guarantees pixel-pixel correspondence between different views.

Revisiting Weak-to-Strong Consistency in Semi-Supervised Semantic Segmentation

TLDR
This work revisits the weak-to-strong consistency framework, popularized by FixMatch from semi-supervised classification, and presents a dual-stream perturbation technique, enabling two strong views to be simultaneously guided by a common weak view.

Semi-supervised Semantic Segmentation with Mutual Knowledge Distillation

TLDR
This work innovatively introduces two auxiliary mean-teacher models based on the consistency regularization method and uses the pseudo label generated by one mean teacher to supervise the other student network to achieve a mutual knowledge distillation between two branches.

Transformer-CNN Cohort: Semi-supervised Semantic Segmentation by the Best of Both Students

TLDR
This paper proposes a novel Semi-supervised Learning approach, called Transformer-CNN Cohort (TCC), that consists of two students with one based on the vision transformer (ViT) and the other based on the CNN, and validates the TCC framework on Cityscapes and Pascal VOC 2012 datasets, which outperforms existing semi-supervised methods by a large margin.

PercentMatch: Percentile-based Dynamic Thresholding for Multi-Label Semi-Supervised Classification

While much of recent study in semi-supervised learning (SSL) has achieved strong performance on single-label classification problems, an equally important yet underexplored problem is how to leverage

Seamless Iterative Semi-Supervised Correction of Imperfect Labels in Microscopy Images

TLDR
This work proposes Seamless Iterative Semi-Supervised correction of Imperfect labels (SISSI), a new method for training object detection models with noisy and missing annotations in a semi-supervised fashion, and successfully provides an adaptive early learning correction technique for object detection.

MFNet: Multi-class Few-shot Segmentation Network with Pixel-wise Metric Learning

TLDR
A novel multi-way (class) encoding and decoding architecture which effectively fuses multi-scale query information and multi-class support information into one query-support embedding is presented.

Prompt-driven efficient Open-set Semi-supervised Learning

Open-set semi-supervised learning (OSSL) has attracted growing interest, which investigates a more practical scenario where out-of-distribution (OOD) samples are only contained in unlabeled data.

Online pseudo labeling for polyp segmentation with momentum networks

Semantic segmentation is an essential task in developing medical image diagnosis systems. However, building an annotated medical dataset is expensive. Thus, semi-supervised methods are significant

An Embarrassingly Simple Approach to Semi-Supervised Few-Shot Learning

Semi-supervised few-shot learning consists in training a classifier to adapt to new tasks with limited labeled data and a fixed quantity of unlabeled data. Many sophisticated methods have been

References


Semi-Supervised Semantic Segmentation via Adaptive Equalization Learning

TLDR
A novel framework for semi-supervised semantic segmentation, named adaptive equalization learning (AEL), which adaptively balances the training of well and badly performed categories, with a confidence bank to dynamically track category-wise performance during training.

Re-distributing Biased Pseudo Labels for Semi-supervised Semantic Segmentation: A Baseline Investigation

TLDR
This paper presents a simple and yet effective Distribution Alignment and Random Sampling (DARS) method to produce unbiased pseudo labels that match the true class distribution estimated from the labeled data.

Semi-Supervised Semantic Segmentation With Cross-Consistency Training

TLDR
This work observes that for semantic segmentation, the low-density regions are more apparent within the hidden representations than within the inputs, and proposes cross-consistency training, where an invariance of the predictions is enforced over different perturbations applied to the outputs of the encoder.

Semi-supervised semantic segmentation needs strong, varied perturbations

TLDR
This work finds that adapted variants of the recently proposed CutOut and CutMix augmentation techniques yield state-of-the-art semi-supervised semantic segmentation results in standard datasets.

PseudoSeg: Designing Pseudo Labels for Semantic Segmentation

TLDR
This work presents a simple and novel re-design of pseudo-labeling to generate well-calibrated structured pseudo labels for training with unlabeled or weakly-labeled data and demonstrates the effectiveness of the proposed pseudo-labeling strategy in both low-data and high-data regimes.

A Simple Baseline for Semi-supervised Semantic Segmentation with Strong Data Augmentation*

TLDR
It is demonstrated that the devil is in the details: a set of simple designs and training techniques can collectively improve the performance of semi-supervised semantic segmentation significantly.

Semi-Supervised Semantic Segmentation with Pixel-Level Contrastive Learning from a Class-wise Memory Bank

TLDR
The key element of this approach is the contrastive learning module that enforces the segmentation network to yield similar pixel-level feature representations for same-class samples across the whole dataset, maintaining a memory bank which is continuously updated with relevant and high-quality feature vectors from labeled data.

Semisupervised Semantic Segmentation by Improving Prediction Confidence

TLDR
This work proposes a method for semisupervised semantic segmentation that improves the confidence of the predicted class probability map via two components, yielding more confident predictions by focusing on the misclassified regions, especially the boundary regions.

ClassMix: Segmentation-Based Data Augmentation for Semi-Supervised Learning

TLDR
This work proposes a novel data augmentation mechanism called ClassMix, which generates augmentations by mixing unlabelled samples, by leveraging on the network’s predictions for respecting object boundaries, and attains state-of-the-art results.

In Defense of Pseudo-Labeling: An Uncertainty-Aware Pseudo-label Selection Framework for Semi-Supervised Learning

TLDR
This work proposes an uncertainty-aware pseudo-label selection (UPS) framework which improves pseudo-labeling accuracy by drastically reducing the amount of noise encountered in the training process and generalizes the pseudo-labeling process, allowing for the creation of negative pseudo-labels.
...