TAFSSL: Task-Adaptive Feature Sub-Space Learning for few-shot classification

@article{Lichtenstein2020TAFSSLTF,
  title={TAFSSL: Task-Adaptive Feature Sub-Space Learning for few-shot classification},
  author={Moshe Lichtenstein and Prasanna Sattigeri and Rog{\'e}rio Schmidt Feris and Raja Giryes and Leonid Karlinsky},
  journal={ArXiv},
  year={2020},
  volume={abs/2003.06670}
}
The field of Few-Shot Learning (FSL), or learning from very few (typically $1$ or $5$) examples per novel class (unseen during training), has received a lot of attention and significant performance advances in the recent literature. While a number of techniques have been proposed for FSL, several factors have emerged as most important for FSL performance, awarding SOTA even to the simplest of techniques. These are: the backbone architecture (bigger is better), type of pre-training on the base…
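The task-adaptive sub-space idea from the abstract can be sketched roughly as follows. This is an illustrative assumption, not the authors' code: it pools all features of a task, centres them, and projects onto the leading principal directions (the function name and the plain-PCA choice of sub-space are hypothetical):

```python
import numpy as np

def task_adaptive_projection(support, query, n_components=5):
    """Project a task's features onto a task-specific PCA sub-space.

    `support`, `query` are (n, d) feature arrays from a pre-trained
    backbone; this sketch is illustrative, not the TAFSSL implementation.
    """
    X = np.vstack([support, query])          # pool all task features
    X = X - X.mean(axis=0, keepdims=True)    # centre per task
    # leading principal directions of the task's feature scatter
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    W = Vt[:n_components].T                  # (d, k) projection matrix
    return support @ W, query @ W
```

The key point the abstract makes is that the sub-space is recomputed per task from the (unlabeled) task data, rather than fixed after pre-training.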
Citations

Few-shot Learning via Dependency Maximization and Instance Discriminant Analysis
  • Zejiang Hou, S. Kung
  • Computer Science
  • ArXiv
  • 2021
TLDR
This work proposes a Dependency Maximization method based on the Hilbert-Schmidt norm of the cross-covariance operator, which maximizes the statistical dependency between the embedded feature of those unlabeled data and their label predictions, together with the supervised loss over the support set.
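The Hilbert-Schmidt dependency measure mentioned above has a standard empirical estimator computed from Gram matrices; a minimal sketch of the biased estimate (function name illustrative, not the paper's code):

```python
import numpy as np

def hsic(K, L):
    """Biased empirical HSIC estimate from two n x n Gram matrices."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n      # centring matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2
```

In the paper's setting one Gram matrix would come from the unlabeled embeddings and the other from their label predictions; maximizing HSIC couples the two.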
Embedding Adaptation is Still Needed for Few-Shot Learning
TLDR
This work proposes ATG, a principled clustering method for defining train and test tasksets without additional human knowledge, and empirically demonstrates the effectiveness of ATG in generating tasksets that are easier, in-between, or harder than existing benchmarks, including those that rely on semantic information.
Iterative label cleaning for transductive and semi-supervised few-shot learning
TLDR
A new algorithm is introduced that leverages the manifold structure of the labeled and unlabeled data distribution to predict pseudo-labels, while balancing over classes and using the loss value distribution of a limited-capacity classifier to select the cleanest labels, iteratively improving the quality of pseudo-labels.
Predicting the Generalization Ability of a Few-Shot Classifier
TLDR
This paper investigates transfer-based few-shot learning solutions and proposes reasonable measures that are empirically shown to be correlated with the generalization ability of the considered classifiers, showing that these simple measures can predict the generalization ability up to a certain confidence.
Transductive Relation-Propagation With Decoupling Training for Few-Shot Learning.
  • Yuqing Ma, Shihao Bai, +5 authors Meng Wang
  • Medicine
  • IEEE transactions on neural networks and learning systems
  • 2021
TLDR
A transductive relation-propagation graph neural network (GNN) with a decoupling training strategy (TRPN-D) is proposed to explicitly model and propagate such relations across support-query pairs, and to empower the few-shot module with the ability to transfer past knowledge to new tasks via the decoupled training.
Predicting the Accuracy of a Few-Shot Classifier
TLDR
This paper analyzes the reasons for the variability of generalization performances and proposes reasonable measures that are empirically shown to be correlated with the generalization ability of the considered classifiers.
Unsupervised Embedding Adaptation via Early-Stage Feature Reconstruction for Few-Shot Classification
TLDR
Early-Stage Feature Reconstruction (ESFR) is developed, a novel adaptation scheme with feature reconstruction and dimensionality-driven early stopping that consistently improves the performance of baseline methods on all standard settings, including the recently proposed transductive method.
Multi-Level Contrastive Learning for Few-Shot Problems
  • Qing Chen, Jian Zhang
  • Computer Science
  • ArXiv
  • 2021
TLDR
A multi-level contrastive learning approach is proposed which applies contrastive losses at different layers of an encoder to learn multiple representations from the encoder; an ensemble can then be constructed to take advantage of the multiple representations for downstream tasks.
Few-shot Learning for Unsupervised Feature Selection
TLDR
This work proposes a few-shot learning method for unsupervised feature selection, the task of selecting a subset of relevant features in unlabeled data, and demonstrates that the proposed method outperforms existing feature selection methods.
Few-Shot Class-Adaptive Anomaly Detection with Model-Agnostic Meta-Learning
TLDR
A reliable solution to few-shot anomaly detection will have huge potential for real-world applications, since it is expensive and arduous to collect a massive amount of data on a new anomaly class; extensive experimental results demonstrate the effectiveness of the proposed approach.

References

Showing 1-10 of 68 references
MetAdapt: Meta-Learned Task-Adaptive Architecture for Few-Shot Classification
TLDR
This work proposes to employ tools inspired by the Differentiable Neural Architecture Search (D-NAS) literature in order to optimize the architecture for FSL without over-fitting, and proposes the concept of 'MetAdapt Controller' modules, added to the model and meta-trained to predict the optimal network connections for a given novel task.
Revisiting Fine-tuning for Few-shot Learning
TLDR
In this study, it is shown that in the commonly used low-resolution mini-ImageNet dataset, the fine-tuning method achieves higher accuracy than common few-shot learning algorithms in the 1-shot task and nearly the same accuracy as that of the state-of-the-art algorithm in the 5-shot task.
Diversity With Cooperation: Ensemble Methods for Few-Shot Classification
TLDR
This work shows that by addressing the fundamental high-variance issue of few-shot learning classifiers, it is possible to significantly outperform current meta-learning techniques.
Meta-Learning for Semi-Supervised Few-Shot Classification
TLDR
This work proposes novel extensions of Prototypical Networks that are augmented with the ability to use unlabeled examples when producing prototypes, and confirms that these models can learn to improve their predictions using unlabeled examples, much like a semi-supervised algorithm would.
TADAM: Task dependent adaptive metric for improved few-shot learning
TLDR
This work identifies that metric scaling and metric task conditioning are important to improve the performance of few-shot algorithms and proposes and empirically test a practical end-to-end optimization procedure based on auxiliary task co-training to learn a task-dependent metric space.
A Closer Look at Few-shot Classification
TLDR
The results reveal that reducing intra-class variation is an important factor when the feature backbone is shallow, but not as critical when using deeper backbones, and a baseline method with a standard fine-tuning practice compares favorably against other state-of-the-art few-shot learning algorithms.
Learning to Self-Train for Semi-Supervised Few-Shot Classification
TLDR
A novel semi-supervised meta-learning method called learning to self-train (LST) is proposed that leverages unlabeled data and specifically meta-learns how to cherry-pick and label such unlabeled data to further improve performance.
Learning to Compare: Relation Network for Few-Shot Learning
TLDR
A conceptually simple, flexible, and general framework for few-shot learning, where a classifier must learn to recognise new classes given only a few examples from each, which is easily extended to zero-shot learning.
Prototypical Networks for Few-shot Learning
TLDR
This work proposes Prototypical Networks for few-shot classification, and provides an analysis showing that some simple design decisions can yield substantial improvements over recent approaches involving complicated architectural choices and meta-learning.
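The prototype construction summarized above is simple enough to sketch. This is a minimal illustrative version (not the authors' implementation): each class prototype is the mean of that class's support embeddings, and queries are assigned to the nearest prototype:

```python
import numpy as np

def prototypes(support, labels):
    """One prototype per class: mean of that class's support embeddings."""
    classes = np.unique(labels)
    protos = np.stack([support[labels == c].mean(axis=0) for c in classes])
    return classes, protos

def nearest_prototype(query, classes, protos):
    """Classify each query by squared Euclidean distance to the prototypes."""
    d = ((query[:, None, :] - protos[None, :, :]) ** 2).sum(axis=-1)
    return classes[np.argmin(d, axis=1)]
```

In the paper the distances are turned into a softmax over classes for training; the hard nearest-prototype rule here is the inference-time special case.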
Adaptive Subspaces for Few-Shot Learning
TLDR
This paper provides a framework for few-shot learning by introducing dynamic classifiers that are constructed from few samples, and empirically shows that such modelling leads to robustness against perturbations and yields competitive results on supervised and semi-supervised few-shot classification.