Corpus ID: 236881055

Uniform Sampling over Episode Difficulty

@article{Arnold2021UniformSO,
  title={Uniform Sampling over Episode Difficulty},
  author={S{\'e}bastien M. R. Arnold and Guneet S. Dhillon and Avinash Ravichandran and Stefano Soatto},
  journal={ArXiv},
  year={2021},
  volume={abs/2108.01662}
}
Episodic training is a core ingredient of few-shot learning, used to train models on tasks with limited labelled data. Despite its success, episodic training remains largely understudied, prompting us to ask: what is the best way to sample episodes? In this paper, we first propose a method to approximate episode sampling distributions based on their difficulty. Building on this method, we perform an extensive analysis and find that sampling uniformly over episode difficulty outperforms…
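The abstract only sketches the method, so here is a minimal, hypothetical illustration of the core idea as we read it: score each candidate episode with a difficulty proxy (here, a caller-supplied `difficulty_fn`, e.g. the current model's loss on the episode; this choice is our assumption), bin the scores, and importance-weight episodes so that the resampled stream is roughly uniform over difficulty. All names and the binning scheme below are our own, not the paper's actual estimator.

```python
import random
from collections import Counter

def uniform_over_difficulty(episodes, difficulty_fn, n_bins=10, n_draws=100, seed=0):
    """Resample episodes so their difficulty scores are ~uniform across bins.

    `episodes` is any list of episode objects; `difficulty_fn` maps an
    episode to a scalar difficulty. Both are placeholders -- the paper's
    actual difficulty measure and sampling procedure may differ.
    """
    rng = random.Random(seed)
    scores = [difficulty_fn(ep) for ep in episodes]
    lo, hi = min(scores), max(scores)
    width = (hi - lo) / n_bins or 1.0  # guard against all-equal scores

    # Assign each episode to a difficulty bin.
    bins = [min(int((s - lo) / width), n_bins - 1) for s in scores]
    counts = Counter(bins)

    # Importance weight: inverse of the empirical bin frequency, so that
    # over-represented difficulty levels are down-weighted and the
    # resampled stream is approximately uniform over difficulty.
    weights = [1.0 / counts[b] for b in bins]
    return rng.choices(episodes, weights=weights, k=n_draws)
```

With `difficulty_fn` set to, say, the episode loss under the current model, this turns a standard episodic sampler into an approximately difficulty-uniform one.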


References

Showing 1–10 of 68 references
Adaptive Task Sampling for Meta-Learning
TLDR: This paper proposes an adaptive task sampling method, which selects difficult tasks according to class-pair potentials and achieves consistent improvements across different feature backbones, meta-learning algorithms, and datasets.
On Episodes, Prototypical Networks, and Few-shot Learning
TLDR: Surprisingly, the episodic strategy of splitting training samples into support and query sets is found to be detrimental, as it is a data-inefficient way to exploit training batches in Prototypical Networks and Matching Networks.
Optimization as a Model for Few-Shot Learning
Generalizing from a Few Examples: A Survey on Few-Shot Learning
TLDR: A thorough survey of Few-Shot Learning (FSL) that categorizes FSL methods from three perspectives: data, which uses prior knowledge to augment the supervised experience; model, which uses prior knowledge to reduce the size of the hypothesis space; and algorithm, which uses prior knowledge to alter the search for the best hypothesis in the given hypothesis space.
Meta-Learning for Semi-Supervised Few-Shot Classification
TLDR: This work proposes novel extensions of Prototypical Networks that can use unlabeled examples when producing prototypes, and confirms that these models learn to improve their predictions from unlabeled examples, much like a semi-supervised algorithm would.
Expert Training: Task Hardness Aware Meta-Learning for Few-Shot Classification
TLDR: An easy-to-hard expert meta-training strategy that arranges training tasks so that easy tasks are preferred in the first phase and hard tasks are emphasized in the second; experimental results show that meta-learners obtain better results with this expert training strategy.
Prototypical Networks for Few-shot Learning
TLDR: This work proposes Prototypical Networks for few-shot classification, and provides an analysis showing that some simple design decisions can yield substantial improvements over recent approaches involving complicated architectural choices and meta-learning.
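Since Prototypical Networks recur throughout this reference list, a minimal sketch of their classification rule may help: each class prototype is the mean embedding of that class's support examples, and a query is scored by its negative distance to each prototype. The function name, shapes, and the use of plain NumPy below are our placeholders, not the paper's exact architecture.

```python
import numpy as np

def prototypical_logits(support_emb, support_labels, query_emb, n_classes):
    """Prototypical Networks classification rule (Snell et al., 2017).

    support_emb: (n_support, d) array of support-example embeddings.
    support_labels: (n_support,) integer class labels in [0, n_classes).
    query_emb: (n_query, d) array of query-example embeddings.
    Returns (n_query, n_classes) logits = negative squared Euclidean
    distance from each query to each class prototype.
    """
    support_labels = np.asarray(support_labels)

    # Prototype for each class: mean of its support embeddings.
    prototypes = np.stack([
        support_emb[support_labels == c].mean(axis=0)
        for c in range(n_classes)
    ])  # (n_classes, d)

    # Negative squared distances act as logits; a softmax over classes
    # would give the episode's class probabilities.
    diffs = query_emb[:, None, :] - prototypes[None, :, :]
    return -np.sum(diffs ** 2, axis=-1)
```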
TADAM: Task dependent adaptive metric for improved few-shot learning
TLDR: This work identifies metric scaling and metric task conditioning as important for improving the performance of few-shot algorithms, and proposes and empirically tests a practical end-to-end optimization procedure based on auxiliary task co-training to learn a task-dependent metric space.
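As a small illustration of the metric-scaling idea mentioned above: distance-based logits are multiplied by a (normally learned) temperature before the softmax, which controls the sharpness of the resulting class distribution. The sketch below reuses the logits convention from the hypothetical `prototypical_logits` helper earlier; the fixed scalar `alpha` stands in for TADAM's learned scaling.

```python
import numpy as np

def scaled_softmax_probs(logits, alpha=10.0):
    """Metric scaling: multiply distance-based logits by a temperature
    `alpha` before the softmax. Larger alpha sharpens the class
    distribution; smaller alpha flattens it."""
    z = alpha * logits
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)
```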
Meta-Transfer Learning through Hard Tasks
TLDR: This work proposes a novel approach called meta-transfer learning (MTL), which learns to transfer the weights of a deep neural network for few-shot learning tasks, and introduces the hard task (HT) meta-batch scheme as an effective learning curriculum of few-shot classification tasks.
Few-Shot Learning With Embedded Class Models and Shot-Free Meta Training
TLDR: This work proposes a method for learning embeddings for few-shot learning that works with any number of shots (shot-free), encompasses metric learning, and facilitates adding new classes without crowding the class representation space.