Label Propagation with Augmented Anchors: A Simple Semi-Supervised Learning baseline for Unsupervised Domain Adaptation

@article{Zhang2020LabelPW,
  title={Label Propagation with Augmented Anchors: A Simple Semi-Supervised Learning baseline for Unsupervised Domain Adaptation},
  author={Yabin Zhang and Bin Deng and Kui Jia and Lei Zhang},
  journal={ArXiv},
  year={2020},
  volume={abs/2007.07695}
}
Motivated by the problem relatedness between unsupervised domain adaptation (UDA) and semi-supervised learning (SSL), many state-of-the-art UDA methods adopt SSL principles (e.g., the cluster assumption) as their learning ingredients. However, they tend to overlook the very domain-shift nature of UDA. In this work, we take a step further to study the proper extensions of SSL techniques for UDA. Taking the algorithm of label propagation (LP) as an example, we analyze the challenges of adopting… 
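Since the abstract takes label propagation (LP) as its running example, a minimal NumPy sketch of classic graph-based LP may help fix ideas. The RBF affinity, the symmetric normalization, and all parameter values below are illustrative assumptions; the paper's augmented-anchor extension is not reproduced here.

```python
import numpy as np

def label_propagation(X, y, n_classes, sigma=1.0, alpha=0.99, n_iter=50):
    """Classic graph-based label propagation.

    X: (n, d) features; y: (n,) integer labels, with -1 marking unlabeled points.
    """
    # RBF affinity matrix with a zeroed diagonal (no self-loops).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # Symmetric normalization: S = D^{-1/2} W D^{-1/2}.
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(W.sum(1), 1e-12))
    S = d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
    # One-hot seed matrix; unlabeled rows stay zero.
    Y = np.zeros((len(X), n_classes))
    labeled = y >= 0
    Y[labeled, y[labeled]] = 1.0
    # Iterate F <- alpha * S F + (1 - alpha) * Y toward the fixed point.
    F = Y.copy()
    for _ in range(n_iter):
        F = alpha * (S @ F) + (1 - alpha) * Y
    return F.argmax(1)
```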
Semi-supervised Models are Strong Unsupervised Domain Adaptation Learners
TLDR
It is found that UDA and SSL are closely related in their task objectives and solutions, that SSL can be viewed as a special case of the UDA problem, and that state-of-the-art UDA methods can be further enhanced with SSL techniques.
Selective mixing and voting network for semi-supervised domain generalization
TLDR
This paper proposes a novel Selective Mixing and Voting Network (SMV-Net) that effectively extracts useful knowledge from the unlabeled training data available to the model, and introduces a test-time mixing strategy that revisits the top class predictions and re-orders them when required to further boost classification performance.
Category Dictionary Guided Unsupervised Domain Adaptation for Object Detection
TLDR
This paper takes a different approach to reducing the domain gap via a self-training paradigm, which regards pseudo-labels as ground truth to fully exploit the unlabeled target data, and proposes a category-dictionary-guided UDA model for cross-domain object detection that learns category-specific dictionaries from the source domain to represent candidate boxes in the target domain.
A Survey of Unsupervised Domain Adaptation for Visual Recognition
TLDR
The principal objective of UDA is to reduce the domain discrepancy between the labeled source data and unlabeled target data and to learn domain-invariant representations across the two domains during training.
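As a concrete instance of the discrepancy reduction this summary refers to, the following is a minimal sketch of maximum mean discrepancy (MMD), one widely used measure of the gap between source and target feature distributions. The RBF kernel and its bandwidth are illustrative choices, not something the survey prescribes.

```python
import numpy as np

def mmd_rbf(Xs, Xt, sigma=1.0):
    """Biased estimate of the squared MMD between source features Xs
    and target features Xt under an RBF kernel."""
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma ** 2))
    return k(Xs, Xs).mean() + k(Xt, Xt).mean() - 2 * k(Xs, Xt).mean()
```

Minimizing such a term over the feature extractor pushes the two domains toward the domain-invariant representations the survey describes.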
Transporting Causal Mechanisms for Unsupervised Domain Adaptation
TLDR
Transporting Causal Mechanisms (TCM) is proposed to identify the confounder stratum and representations using domain-invariant disentangled causal mechanisms, which are discovered in an unsupervised fashion.
Gradual Domain Adaptation via Self-Training of Auxiliary Models
TLDR
Self-training of auxiliary models (AuxSelfTrain) is proposed, which learns models for intermediate domains and gradually combats the widening shifts across domains; it is further extended to semi-supervised domain adaptation.
Select, Label, and Mix: Learning Discriminative Invariant Feature Representations for Partial Domain Adaptation
TLDR
A novel 'Select, Label, and Mix' (SLM) framework is presented that aims to learn discriminative, invariant feature representations for partial domain adaptation; experiments demonstrate its superiority over state-of-the-art methods.
Compound Domain Generalization via Meta-Knowledge Encoding
TLDR
COMEN introduces Style-induced Domain-specific Normalization (SDNorm) to re-normalize the multi-modal underlying distributions, thereby dividing the mixture of source domains into latent clusters; experiments reveal that COMEN exceeds state-of-the-art performance without the need for domain supervision.
Safe Self-Refinement for Transformer-based Domain Adaptation
TLDR
This paper finds that the combination of a vision transformer with simple adversarial adaptation surpasses the best reported Convolutional Neural Network (CNN)-based results on the challenging DomainNet benchmark, showing its strong transferable feature representation.
Transferrable Contrastive Learning for Visual Domain Adaptation
TLDR
This work presents a particular paradigm of self-supervised learning tailored for domain adaptation, i.e., Transferrable Contrastive Learning (TCL), which links SSL and the desired cross-domain transferability congruently and finds contrastive learning intrinsically a suitable candidate for domain adaptation.
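TCL's exact objective is not given in this summary; as a reference point, here is a minimal sketch of the standard InfoNCE loss that contrastive methods of this kind build on. The pairing convention (positives share a row index) and the temperature value are illustrative assumptions.

```python
import numpy as np

def info_nce(z_anchor, z_positive, temperature=0.1):
    """InfoNCE over one batch: row i of z_positive is the positive for
    row i of z_anchor; every other row serves as a negative."""
    za = z_anchor / np.linalg.norm(z_anchor, axis=1, keepdims=True)
    zp = z_positive / np.linalg.norm(z_positive, axis=1, keepdims=True)
    logits = za @ zp.T / temperature           # (n, n) cosine similarities
    logits -= logits.max(1, keepdims=True)     # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(1, keepdims=True))
    return -np.mean(np.diag(log_prob))         # positives sit on the diagonal
```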

References

Showing 1–10 of 53 references
Unsupervised Domain Adaptation via Structurally Regularized Deep Clustering
  • Hui Tang, Ke Chen, K. Jia
  • Computer Science
    2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
  • 2020
TLDR
This work describes the proposed method as Structurally Regularized Deep Clustering (SRDC), which enhances target discrimination with clustering of intermediate network features and enhances structural regularization with soft selection of less divergent source examples.
Graph Adaptive Knowledge Transfer for Unsupervised Domain Adaptation
TLDR
A novel Graph Adaptive Knowledge Transfer model is developed to jointly optimize target labels and domain-free features in a unified framework, so that the marginal and conditional disparities across domains are better alleviated.
Learning to Propagate Labels: Transductive Propagation Network for Few-Shot Learning
TLDR
This paper proposes Transductive Propagation Network (TPN), a novel meta-learning framework for transductive inference that classifies the entire test set at once to alleviate the low-data problem.
Learning Semantic Representations for Unsupervised Domain Adaptation
TLDR
The moving semantic transfer network is presented, which learns semantic representations for unlabeled target samples by aligning labeled source centroids with pseudo-labeled target centroids, resulting in improved target classification accuracy.
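A minimal sketch of the centroid-alignment idea this summary describes: per-class source centroids are pulled toward pseudo-labeled target centroids. The exponential-moving-average centroid updates of the actual method are omitted, and the function signature is an assumption for illustration.

```python
import numpy as np

def centroid_alignment_loss(Fs, ys, Ft, yt_pseudo, n_classes):
    """Mean squared distance between per-class source centroids (Fs, ys)
    and pseudo-labeled target centroids (Ft, yt_pseudo)."""
    loss, count = 0.0, 0
    for c in range(n_classes):
        s_mask, t_mask = ys == c, yt_pseudo == c
        if s_mask.any() and t_mask.any():       # class seen in both domains
            diff = Fs[s_mask].mean(0) - Ft[t_mask].mean(0)
            loss += (diff ** 2).sum()
            count += 1
    return loss / max(count, 1)
```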
Semi-supervised learning with graphs
TLDR
A series of novel semi-supervised learning approaches arising from a graph representation is presented, where labeled and unlabeled instances are represented as vertices and edges encode the similarity between instances.
Label Propagation for Deep Semi-Supervised Learning
TLDR
This work employs a transductive label propagation method based on the manifold assumption to make predictions on the entire dataset, and uses these predictions to generate pseudo-labels for the unlabeled data on which a deep neural network is trained.
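A minimal sketch of the transductive step this summary describes, assuming a precomputed symmetrically normalized affinity matrix S and a one-hot seed matrix Y; the k-NN graph construction and the network training loop are omitted, and the entropy-based weighting follows the common recipe of down-weighting uncertain pseudo-labels.

```python
import numpy as np

def lp_pseudo_labels(S, Y, alpha=0.99):
    """Closed-form label propagation F = (I - alpha*S)^{-1} Y, then
    entropy-based confidence weights for pseudo-label training."""
    n = S.shape[0]
    F = np.linalg.solve(np.eye(n) - alpha * S, Y)
    P = np.maximum(F, 0)
    P /= np.maximum(P.sum(1, keepdims=True), 1e-12)
    pseudo = P.argmax(1)
    entropy = -(P * np.log(np.maximum(P, 1e-12))).sum(1)
    weights = 1.0 - entropy / np.log(P.shape[1])   # 1 = confident, 0 = uniform
    return pseudo, weights
```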
Unsupervised Domain Adaptation With Label and Structural Consistency
TLDR
This paper proposes to utilize the label information inferred from the source domain while jointly exploiting the structural information of the unlabeled target-domain data for adaptation, so that the distribution mismatch between domains is reduced and improved recognition of target-domain data is achieved simultaneously.
Co-regularized Alignment for Unsupervised Domain Adaptation
Deep neural networks, trained with large amounts of labeled data, can fail to generalize well when tested on examples from a target domain whose distribution differs from the training data.
Locality Preserving Joint Transfer for Domain Adaptation
TLDR
This work proposes a novel approach that jointly exploits feature adaptation with distribution matching and sample adaptation with landmark selection, suitable for both homogeneous- and heterogeneous-domain adaptation by learning domain-specific projections.
Efficient Non-Parametric Function Induction in Semi-Supervised Learning
TLDR
Experiments show that the proposed non-parametric algorithms, which provide an estimated continuous label for given unlabeled examples, can be extended to function induction algorithms that minimize the same regularization criterion applied to an out-of-sample example, and that the resulting estimator happens to have the form of a Parzen windows regressor.
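A minimal sketch of the out-of-sample induction this summary refers to: the label estimate for an unseen point takes the form of a Parzen windows regression over the transductive estimates already computed for the training set. The bandwidth and argument names are illustrative.

```python
import numpy as np

def parzen_out_of_sample(x_new, X, F, sigma=1.0):
    """Induce a label estimate for an unseen point x_new as a Parzen-windows
    average of the transductive label estimates F over training points X."""
    d2 = ((X - x_new) ** 2).sum(1)
    w = np.exp(-d2 / (2 * sigma ** 2))
    return (w[:, None] * F).sum(0) / max(w.sum(), 1e-12)
```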