Corpus ID: 246015836

Parameter-free Online Test-time Adaptation

@article{Boudiaf2022ParameterfreeOT,
  title={Parameter-free Online Test-time Adaptation},
  author={Malik Boudiaf and Romain Mueller and Ismail Ben Ayed and Luca Bertinetto},
  journal={ArXiv},
  year={2022},
  volume={abs/2201.05718}
}
Training state-of-the-art vision models has become prohibitively expensive for researchers and practitioners. For the sake of accessibility and resource reuse, it is important to focus on adapting these models to a variety of downstream scenarios. An interesting and practical paradigm is online test-time adaptation, according to which training data is inaccessible, no labelled data from the test distribution is available, and adaptation can only happen at test time and on a handful of samples… 
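The setting described in the abstract can be made concrete with a short sketch. Below is a minimal, hedged illustration in PyTorch of an online test-time adaptation loop: the model sees the unlabelled test stream one small batch at a time, adapts on it, and predicts it, with no access to source data or labels. The names online_adapt_and_predict, test_stream, and adapt_fn are hypothetical placeholders rather than the paper's API; adapt_fn stands in for whichever adaptation rule is being evaluated.

import torch

def online_adapt_and_predict(model, test_stream, adapt_fn):
    # Online test-time adaptation: no source data, no labels, and each
    # unlabelled test batch is seen only once, in order of arrival.
    predictions = []
    for x in test_stream:
        adapt_fn(model, x)                # update the model on this batch only
        with torch.no_grad():
            logits = model(x)             # then predict the same batch
        predictions.append(logits.argmax(dim=1))
    return torch.cat(predictions)

Any of the methods referenced below, such as prediction-time batch normalization or Tent-style entropy minimization, could be plugged in as adapt_fn.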

References

Showing 1–10 of 64 references
Self-supervised Test-time Adaptation on Video Data
TL;DR: This paper explores whether recent progress in test-time adaptation in the image domain and self-supervised learning can be leveraged to adapt a model to previously unseen and unlabelled videos presenting both mild (but arbitrary) and severe covariate shifts.
A Survey of Unsupervised Deep Domain Adaptation
TL;DR: This survey compares single-source, typically homogeneous unsupervised deep domain adaptation approaches, which combine the powerful, hierarchical representations of deep learning with domain adaptation to reduce reliance on potentially costly target-data labels.
Adaptive Batch Normalization for practical domain adaptation
Towards Inheritable Models for Open-Set Domain Adaptation
TL;DR: This work formalizes knowledge inheritability as a novel concept and proposes a simple yet effective solution to realize inheritable models suitable for this practical domain adaptation (DA) paradigm.
Universal Source-Free Domain Adaptation
TL;DR: A novel two-stage learning process built on an instance-level weighting mechanism, named the Source Similarity Metric (SSM), is proposed and achieves superior DA performance even over state-of-the-art source-dependent approaches.
Language Models are Few-Shot Learners
TL;DR: GPT-3 achieves strong performance on many NLP datasets, including translation, question answering, and cloze tasks, as well as several tasks that require on-the-fly reasoning or domain adaptation, such as unscrambling words, using a novel word in a sentence, or performing 3-digit arithmetic.
Evaluating Prediction-Time Batch Normalization for Robustness under Covariate Shift
TL;DR: Prediction-time batch normalization is shown to provide complementary benefits to existing state-of-the-art approaches for improving robustness, and combining the two further improves performance; however, it yields mixed results when used alongside pre-training and does not seem to perform as well under more natural types of dataset shift.
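The idea summarized above is simple enough to sketch: prediction-time batch normalization normalizes with the statistics of the current test batch instead of the statistics stored during training. The PyTorch snippet below is an illustrative implementation under that reading, not code from the cited paper; the function name use_test_batch_statistics is hypothetical.

import torch.nn as nn

def use_test_batch_statistics(model: nn.Module) -> nn.Module:
    # Put the model in inference mode, then switch only the BatchNorm layers
    # back to training mode so each test batch is normalized with its own
    # mean and variance rather than the statistics accumulated on the
    # (now unavailable) training data.
    model.eval()
    for module in model.modules():
        if isinstance(module, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)):
            module.train()  # side effect: running statistics also keep updating
    return model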
Model Adaptation: Unsupervised Domain Adaptation Without Source Data
TL;DR: This paper proposes a new framework, referred to as a collaborative class-conditional generative adversarial net, to bypass the dependence on source data; it achieves superior performance on multiple adaptation tasks with only unlabeled target data, verifying its effectiveness in this challenging setting.
Tent: Fully Test-Time Adaptation by Entropy Minimization
TL;DR: Tent optimizes the model for confidence as measured by the entropy of its predictions; it reduces generalization error for image classification on corrupted ImageNet and CIFAR-10/100 and reaches a new state-of-the-art error on ImageNet-C.
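The entropy objective in the summary above is compact enough to sketch. The PyTorch snippet below is a hedged illustration rather than the authors' code: following the Tent paper, only the channel-wise affine parameters of the normalization layers are updated, while the optimizer choice and learning rate in the usage note are illustrative assumptions.

import torch
import torch.nn as nn

def bn_affine_parameters(model: nn.Module):
    # Tent freezes the network and updates only the channel-wise scale and
    # shift parameters of the normalization layers.
    for m in model.modules():
        if isinstance(m, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)):
            if m.weight is not None:
                yield m.weight
            if m.bias is not None:
                yield m.bias

def entropy_minimization_step(model, x, optimizer):
    # One online step: predict, measure the entropy of the softmax
    # predictions, and take a gradient step to make the model more confident.
    logits = model(x)
    log_probs = logits.log_softmax(dim=1)
    entropy = -(log_probs.exp() * log_probs).sum(dim=1).mean()
    optimizer.zero_grad()
    entropy.backward()
    optimizer.step()
    return logits.detach()

# Illustrative usage (the learning rate is an assumption, not from the paper):
# optimizer = torch.optim.SGD(bn_affine_parameters(model), lr=1e-3)

In Tent this objective is combined with batch-statistic normalization as in the previous sketch, so that both the normalization statistics and the affine parameters reflect the test distribution.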
Unsupervised Domain Adaptation by Backpropagation
TL;DR: The method performs very well in a series of image classification experiments, achieving an adaptation effect in the presence of large domain shifts and outperforming the previous state of the art on Office datasets.
...