Active Domain Adaptation via Clustering Uncertainty-weighted Embeddings
@article{Prabhu2021ActiveDA,
  title   = {Active Domain Adaptation via Clustering Uncertainty-weighted Embeddings},
  author  = {Viraj Prabhu and Arjun Chandrasekaran and Kate Saenko and Judy Hoffman},
  journal = {2021 IEEE/CVF International Conference on Computer Vision (ICCV)},
  year    = {2021},
  pages   = {8485-8494}
}
Generalizing deep neural networks to new target domains is critical to their real-world utility. In practice, it may be feasible to get some target data labeled, but to be cost-effective it is desirable to select a maximally-informative subset via active learning (AL). We study the problem of AL under a domain shift, called Active Domain Adaptation (Active DA). We demonstrate how existing AL approaches based solely on model uncertainty or diversity sampling are less effective for Active DA. We…
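The abstract is cut off before the method itself; as a rough illustration of the titular idea, here is a minimal sketch of clustering uncertainty-weighted embeddings to select a query batch that is both uncertain and diverse. The entropy weighting, the use of scikit-learn's weighted K-means, and every name below are illustrative assumptions, not the authors' reference implementation.

```python
# Hedged sketch: cluster target embeddings with per-sample uncertainty weights,
# then label the point closest to each cluster centroid.
import numpy as np
from sklearn.cluster import KMeans

def select_batch(embeddings, logits, budget, seed=0):
    """embeddings: (N, D) target features; logits: (N, C) class scores."""
    # Softmax + predictive entropy as a per-sample uncertainty weight.
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1)

    # Weighted K-means: uncertain points pull centroids toward themselves,
    # so each centroid sits in a dense, uncertain region of the target domain.
    km = KMeans(n_clusters=budget, n_init=10, random_state=seed)
    km.fit(embeddings, sample_weight=entropy + 1e-8)

    # Query the (distinct) point nearest to each centroid.
    dists = np.linalg.norm(
        embeddings[:, None, :] - km.cluster_centers_[None, :, :], axis=2)
    selected = []
    for k in range(budget):
        for i in np.argsort(dists[:, k]):
            if i not in selected:
                selected.append(int(i))
                break
    return selected
```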
15 Citations
Active Learning for Domain Adaptation: An Energy-based Approach
- Computer Science · ArXiv
- 2021
This paper presents a novel active learning strategy to assist knowledge transfer in the target domain, dubbed active domain adaptation, which surpasses state-of-the-art methods on well-known challenging benchmarks by substantial margins, making it a useful option in the open world.
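As a hedged illustration of what an energy-based acquisition score can look like (the paper's actual criterion may combine additional terms), the sketch below ranks target samples by the free energy of the classifier's logits; the temperature and function names are assumptions.

```python
# Illustrative energy-based selection: query target samples the model assigns
# the highest free energy, i.e. the ones least compatible with the source data.
import torch

def free_energy(logits, temperature=1.0):
    # Energy-based view of a classifier: F(x) = -T * logsumexp(f(x) / T).
    return -temperature * torch.logsumexp(logits / temperature, dim=1)

def select_by_energy(target_logits, budget):
    scores = free_energy(target_logits)
    return torch.topk(scores, k=budget).indices.tolist()
```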
Learning Distinctive Margin toward Active Domain Adaptation
- Computer Science · ArXiv
- 2022
This work proposes a concise but effective ADA method called Select-by-Distinctive-Margin (SDM), which consists of a maximum margin loss and a margin sampling algorithm for data selection and provides theoretical analysis to show that SDM works like a Support Vector Machine.
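The data-selection half of SDM follows the standard margin criterion; a minimal sketch is below (the accompanying maximum-margin training loss is omitted, and the helper names are mine).

```python
# Margin sampling: query the samples whose top-1 and top-2 class probabilities
# are closest, i.e. the most ambiguous predictions.
import numpy as np

def margin_scores(probs):
    """probs: (N, C) softmax outputs; smaller margin = more ambiguous."""
    top = np.sort(probs, axis=1)
    return top[:, -1] - top[:, -2]

def select_by_margin(probs, budget):
    return np.argsort(margin_scores(probs))[:budget].tolist()
```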
Loss-based Sequential Learning for Active Domain Adaptation
- Computer Science · ArXiv
- 2022
Active domain adaptation (ADA) studies have mainly addressed query selection while following existing domain adaptation strategies. However, we argue that it is critical to consider not only query…
ADeADA: Adaptive Density-aware Active Domain Adaptation for Semantic Segmentation
- Computer Science · ArXiv
- 2022
ADeADA is presented, a general active domain adaptation framework for semantic segmentation and an adaptive budget allocation policy is designed, which dynamically balances the labeling budgets among different categories as well as between density-aware and uncertainty-based methods.
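A rough sketch of splitting one labeling budget between a density-aware criterion and an uncertainty-based one is shown below; the fixed mixing ratio and the simple backfill rule are my assumptions, not the paper's adaptive allocation policy.

```python
# Illustrative budget split between two acquisition criteria.
import numpy as np

def split_budget(density_scores, uncertainty_scores, budget, alpha=0.5):
    """alpha in [0, 1]: fraction of the budget spent on density-aware picks."""
    n_density = int(round(alpha * budget))
    picks = list(np.argsort(-density_scores)[:n_density])  # most density-shifted
    for i in np.argsort(-uncertainty_scores):               # then most uncertain
        if len(picks) == budget:
            break
        if i not in picks:
            picks.append(i)
    return [int(i) for i in picks]
```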
Active Source Free Domain Adaptation
- Computer Science
- 2022
Source free domain adaptation (SFDA) aims to transfer a trained source model to the unlabeled target domain without accessing the source data. However, the SFDA setting faces an effect bottleneck due…
D2ADA: Dynamic Density-aware Active Domain Adaptation for Semantic Segmentation
- Computer Science
- 2022
D2ADA is presented, a general active domain adaptation framework for semantic segmentation, together with a dynamic scheduling policy that adjusts the labeling budgets between domain exploration and model uncertainty over time to improve labeling efficiency.
Online Continual Adaptation with Active Self-Training
- Computer Science · AISTATS
- 2022
This paper studies online continual adaptation, where a model must keep adapting to continually shifting target distributions from streaming data, and proposes an active self-training approach that combines a small number of actively queried labels with self-training on the remaining unlabeled samples.
Cost-effective Framework for Gradual Domain Adaptation with Multifidelity
- Computer Science · ArXiv
- 2022
A framework that combines multifidelity and active domain adaptation to balance the trade-off between labeling cost and accuracy, evaluated with experiments on both artificial and real-world datasets.
Attentive Prototypes for Source-free Unsupervised Domain Adaptive 3D Object Detection
- Computer Science · ArXiv
- 2021
This work proposes a single-frame approach for source-free, unsupervised domain adaptation of lidar-based 3D object detectors that uses class prototypes to mitigate the effect of pseudo-label noise, and demonstrates its effectiveness on two recent object detectors.
Burn After Reading: Online Adaptation for Cross-domain Streaming Data
- Computer Science · ArXiv
- 2021
This paper proposes an online framework that "burns after reading", i.e., each online sample is immediately deleted after it is processed, together with a novel algorithm that addresses the most fundamental challenge of the online adaptation setting: the lack of diverse source-target data pairs.
References
Showing 1-10 of 68 references
Active Adversarial Domain Adaptation
- Computer Science · 2020 IEEE Winter Conference on Applications of Computer Vision (WACV)
- 2020
This work shows that the two views of adversarial domain alignment and importance sampling can be unified in one framework for domain adaptation and transfer learning when the source domain has many labeled examples while the target domain does not.
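One way the two views can be unified is to score each target sample by the domain discriminator's importance weight multiplied by the classifier's predictive uncertainty; the sketch below is my reading of that idea, and the exact formulation in the paper may differ.

```python
# Hedged sketch: importance-weighted uncertainty as an acquisition score.
import torch
import torch.nn.functional as F

def importance_weighted_uncertainty(class_logits, domain_logits):
    """class_logits: (N, C) classifier outputs on target samples.
    domain_logits: (N,) discriminator logits, sigmoid -> P(source)."""
    p_source = torch.sigmoid(domain_logits)
    # Density ratio estimate: very target-like samples (low P(source)) are
    # under-represented by the labeled source data, so weight them up.
    importance = (1.0 - p_source) / (p_source + 1e-6)
    probs = F.softmax(class_logits, dim=1)
    entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=1)
    return importance * entropy  # label the highest-scoring samples
```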
Semi-Supervised Domain Adaptation via Minimax Entropy
- Computer Science · 2019 IEEE/CVF International Conference on Computer Vision (ICCV)
- 2019
A novel Minimax Entropy (MME) approach that adversarially optimizes an adaptive few-shot model for the semi-supervised domain adaptation (SSDA) setting, establishing a new state of the art for SSDA.
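A minimal sketch of the minimax entropy objective is given below, written as alternating updates for clarity (the paper realizes the same min-max with a gradient reversal layer); the optimizers, the weight lam, and all names are assumptions.

```python
# Minimax entropy on unlabeled target data: the classifier ascends on the
# entropy (moving its prototypes toward the target), while the feature
# extractor descends on it (clustering target features around the prototypes).
import torch
import torch.nn.functional as F

def target_entropy(classifier, features):
    probs = F.softmax(classifier(features), dim=1)
    return -(probs * probs.clamp_min(1e-12).log()).sum(dim=1).mean()

def minimax_entropy_step(extractor, classifier, x_target,
                         opt_classifier, opt_extractor, lam=0.1):
    # 1) Classifier maximizes entropy (features detached, extractor untouched).
    opt_classifier.zero_grad()
    (-lam * target_entropy(classifier, extractor(x_target).detach())).backward()
    opt_classifier.step()
    # 2) Feature extractor minimizes entropy (only the extractor is stepped).
    opt_extractor.zero_grad()
    (lam * target_entropy(classifier, extractor(x_target))).backward()
    opt_extractor.step()
```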
A DIRT-T Approach to Unsupervised Domain Adaptation
- Computer Science · ICLR
- 2018
Two novel and related models are proposed: the Virtual Adversarial Domain Adaptation (VADA) model, which combines domain adversarial training with a penalty term that punishes violations of the cluster assumption, and the Decision-boundary Iterative Refinement Training with a Teacher (DIRT-T) model, which takes the VADA model as initialization and employs natural gradient steps to further minimize the cluster assumption violation.
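The cluster-assumption penalty includes a conditional-entropy term on unlabeled data; a minimal sketch of that term is below (VADA additionally adds a virtual adversarial training term, omitted here for brevity).

```python
# Conditional entropy of predictions on unlabeled data: minimizing it pushes
# decision boundaries away from dense regions, as the cluster assumption asks.
import torch
import torch.nn.functional as F

def conditional_entropy(logits):
    probs = F.softmax(logits, dim=1)
    return -(probs * probs.clamp_min(1e-12).log()).sum(dim=1).mean()
```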
Adversarial Active Learning for Deep Networks: a Margin Based Approach
- Computer Science · ArXiv
- 2018
It is demonstrated empirically that adversarial active queries yield faster convergence of CNNs trained on the MNIST, Shoe-Bag, and Quick-Draw datasets.
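A hedged sketch of the underlying idea: query the samples that need the smallest adversarial perturbation to change their label. A crude FGSM-style search along the gradient sign stands in here for the paper's margin estimate, and the epsilon grid is an arbitrary assumption.

```python
# Approximate each sample's distance to the decision boundary by the smallest
# perturbation (along the loss-gradient sign) that flips its prediction,
# then query the samples with the smallest such distance.
import torch
import torch.nn.functional as F

def adversarial_margin(model, x, eps_grid=(0.01, 0.02, 0.05, 0.1, 0.2)):
    x = x.clone().requires_grad_(True)
    logits = model(x)
    pred = logits.argmax(dim=1)
    grad = torch.autograd.grad(F.cross_entropy(logits, pred), x)[0]
    direction = grad.sign()
    margins = torch.full((x.shape[0],), eps_grid[-1], device=x.device)
    undecided = torch.ones(x.shape[0], dtype=torch.bool, device=x.device)
    with torch.no_grad():
        for eps in eps_grid:  # record the smallest eps that changes the label
            flipped = model(x + eps * direction).argmax(dim=1) != pred
            newly = flipped & undecided
            margins[newly] = eps
            undecided &= ~newly
    return margins

def select_adversarial(model, x, budget):
    margins = adversarial_margin(model, x)
    return torch.topk(-margins, k=budget).indices.tolist()
```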
Semi-supervised Domain Adaptation with Subspace Learning for visual recognition
- Computer Science · 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
- 2015
A novel domain adaptation framework, named Semi-supervised Domain Adaptation with Subspace Learning (SDASL), which jointly explores invariant low-dimensional structures across domains to correct data distribution mismatch and leverages available unlabeled target examples to exploit the underlying intrinsic information in the target domain.
Adversarial Discriminative Domain Adaptation
- Computer Science · 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
- 2017
It is shown that ADDA is more effective yet considerably simpler than competing domain-adversarial methods, and the promise of the approach is demonstrated by exceeding state-of-the-art unsupervised adaptation results on standard domain adaptation tasks as well as a difficult cross-modality object classification task.
Domain-Adversarial Training of Neural Networks
- Computer Science · J. Mach. Learn. Res.
- 2016
A new representation learning approach for domain adaptation, in which data at training and test time come from similar but different distributions; the approach can be implemented in almost any feed-forward model by augmenting it with a few standard layers and a new gradient reversal layer.
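The gradient reversal layer mentioned here is small enough to sketch directly; the module below is a generic PyTorch rendition (the lambda schedule and the surrounding domain classifier are omitted), not the authors' original code.

```python
# Gradient reversal: identity in the forward pass, flips (and scales) the
# gradient in the backward pass, so the feature extractor learns features
# that fool the domain classifier placed after this layer.
import torch

class GradientReversal(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None

def reverse_gradient(features, lam=1.0):
    """Insert between the feature extractor and the domain classifier."""
    return GradientReversal.apply(features, lam)
```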
A new active labeling method for deep learning
- Computer Science · 2014 International Joint Conference on Neural Networks (IJCNN)
- 2014
A new active labeling method, AL-DL, for cost-effective selection of data to be labeled, which outperforms random labeling consistently and is applied to deep learning networks based on stacked restricted Boltzmann machines, as well as stacked autoencoders.
Joint Transfer and Batch-mode Active Learning
- Computer Science · ICML
- 2013
This work presents an integrated framework that performs transfer and active learning simultaneously by solving a single convex optimization problem, minimizing a common objective: reducing the distribution difference between the set of re-weighted source data plus queried target-domain data and the set of unlabeled target-domain data.
Learning Transferable Features with Deep Adaptation Networks
- Computer Science · ICML
- 2015
A new Deep Adaptation Network (DAN) architecture is proposed, which generalizes deep convolutional neural networks to the domain adaptation scenario, can learn transferable features with statistical guarantees, and scales linearly via an unbiased estimate of the kernel embedding.
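As a simplified illustration, the sketch below computes a single-kernel RBF MMD penalty between source and target features; DAN itself uses a multi-kernel variant applied to several task-specific layers with a linear-time unbiased estimator, which this sketch does not reproduce.

```python
# Squared MMD between source and target feature batches; adding it to the task
# loss pulls the two feature distributions together.
import torch

def rbf_kernel(a, b, bandwidth=1.0):
    return torch.exp(-torch.cdist(a, b) ** 2 / (2 * bandwidth ** 2))

def mmd2(source_feats, target_feats, bandwidth=1.0):
    k_ss = rbf_kernel(source_feats, source_feats, bandwidth).mean()
    k_tt = rbf_kernel(target_feats, target_feats, bandwidth).mean()
    k_st = rbf_kernel(source_feats, target_feats, bandwidth).mean()
    return k_ss + k_tt - 2 * k_st
```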