Corpus ID: 102351185

A Closer Look at Few-shot Classification

@article{Chen2019ACL,
  title   = {A Closer Look at Few-shot Classification},
  author  = {Wei-Yu Chen and Yen-Cheng Liu and Zsolt Kira and Yu-Chiang Frank Wang and Jia-Bin Huang},
  journal = {ArXiv},
  year    = {2019},
  volume  = {abs/1904.04232}
}
Few-shot classification aims to learn a classifier to recognize unseen classes during training with limited labeled examples. [...] In this paper, we present 1) a consistent comparative analysis of several representative few-shot classification algorithms, with results showing that deeper backbones significantly reduce the performance differences among methods on datasets with limited domain differences, and 2) a modified baseline method that surprisingly achieves competitive performance when compared [...]
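The modified baseline mentioned above (often called Baseline++ in the paper) replaces the final linear classification layer with a cosine-similarity comparison between feature vectors and per-class weight vectors, which reduces intra-class variation. A minimal sketch of that scoring rule, with illustrative names (`cosine_scores` is not from the paper's code):

```python
import numpy as np

def cosine_scores(features, weights):
    """Score each feature vector against each class weight by cosine similarity.

    features: (n_samples, d) array of embeddings from the backbone.
    weights:  (n_classes, d) array of learnable per-class weight vectors.
    Returns a (n_samples, n_classes) array of similarity scores.
    """
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    w = weights / np.linalg.norm(weights, axis=1, keepdims=True)
    return f @ w.T
```

At test time the scores would be fed through a softmax (optionally with a temperature scale) and the highest-scoring class taken as the prediction.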
A Closer Look at Few-Shot Video Classification: A New Baseline and Benchmark
TLDR
This paper proposes a simple classifier-based baseline without any temporal alignment that surprisingly outperforms the state-of-the-art meta-learning based methods and discovers that there is a high correlation between the novel action class and the ImageNet object class, which is problematic in the few-shot recognition setting.
Diversity With Cooperation: Ensemble Methods for Few-Shot Classification
TLDR
This work shows that by addressing the fundamental high-variance issue of few-shot learning classifiers, it is possible to significantly outperform current meta-learning techniques.
Meta Generalized Network for Few-Shot Classification
TLDR
This paper develops a meta backbone training method that efficiently learns a flexible feature extractor and a classifier initializer, enabling fast adaptation to unseen few-shot tasks without overfitting, and designs a trainable adaptive interval model to improve the cosine classifier, which increases recognition accuracy on hard examples.
Region Comparison Network for Interpretable Few-shot Image Classification
TLDR
A metric-learning-based method named Region Comparison Network (RCN) is proposed, which is able to reveal how few-shot learning works in a neural network and to identify the specific regions in images from the query and support sets that are related to each other.
Looking Wider for Better Adaptive Representation in Few-Shot Learning
TLDR
The Cross Non-Local Neural Network (CNL) is proposed to capture the long-range dependency between the samples and the current task; it dynamically extracts task-specific and context-aware features by strengthening the features of a sample at each position via aggregating information from all positions of the sample itself and the current task.
Boosting Few-Shot Classification with View-Learnable Contrastive Learning
TLDR
This work introduces the contrastive loss into few-shot classification for learning latent fine-grained structure in the embedding space and develops a learning-to-learn algorithm to automatically generate different views of the same image.
Novelty-Prepared Few-Shot Classification
TLDR
This work proposes to use a novelty-prepared loss function, called self-compacting softmax loss (SSL), for few-shot classification, and shows that SSL leads to significant improvement of the state-of-the-art performance.
Class-wise Metric Scaling for Improved Few-Shot Classification
TLDR
A class-wise metric scaling (CMS) mechanism is proposed, which can be applied to both training and testing stages of few-shot classification, to learn a more discriminative and transferable feature representation.
Few Shot Learning With No Labels
TLDR
This paper presents a more challenging few-shot setting where no label access is allowed during training or testing, and achieves competitive baselines while using zero labels, which is at least 10,000 times fewer labels than the state of the art.
Impact of base dataset design on few-shot image classification
TLDR
This paper systematically studies the effect of variations in the training data by evaluating deep features trained on different image sets in a few-shot classification setting, and shows how base dataset design can improve few-shot classification performance more drastically than replacing a simple baseline with an advanced state-of-the-art algorithm.

References

Showing 1-10 of 33 references
Learning to Compare: Relation Network for Few-Shot Learning
TLDR
A conceptually simple, flexible, and general framework for few-shot learning, where a classifier must learn to recognise new classes given only a few examples of each, and which is easily extended to zero-shot learning.
Few-Shot Learning with Metric-Agnostic Conditional Embeddings
TLDR
This work introduces a novel architecture where class representations are conditioned for each few-shot trial based on a target image, and deviates from traditional metric-learning approaches by training a network to perform comparisons between classes rather than relying on a static metric comparison.
Prototypical Networks for Few-shot Learning
TLDR
This work proposes Prototypical Networks for few-shot classification, and provides an analysis showing that some simple design decisions can yield substantial improvements over recent approaches involving complicated architectural choices and meta-learning.
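The Prototypical Networks decision rule summarized above is simple to state: each class prototype is the mean of its support-set embeddings, and a query is assigned to the class of the nearest prototype. A minimal sketch under that description (function and variable names are illustrative, not from the original implementation):

```python
import numpy as np

def prototypical_predict(support, support_labels, query, n_classes):
    """Classify query embeddings by nearest class prototype.

    support:        (n_support, d) array of support-set embeddings.
    support_labels: (n_support,) integer class labels in [0, n_classes).
    query:          (n_query, d) array of query embeddings.
    Returns a (n_query,) array of predicted class indices.
    """
    # Each prototype is the mean embedding of that class's support examples.
    prototypes = np.stack([
        support[support_labels == c].mean(axis=0) for c in range(n_classes)
    ])
    # Squared Euclidean distance from every query to every prototype.
    d = ((query[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=-1)
    return d.argmin(axis=1)
```

In the original formulation the negative distances are passed through a softmax to give class probabilities; the argmin above is the equivalent hard prediction.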
Optimization as a Model for Few-Shot Learning
Matching Networks for One Shot Learning
TLDR
This work employs ideas from metric learning based on deep neural features and from recent advances that augment neural networks with external memories to learn a network that maps a small labelled support set and an unlabelled example to its label, obviating the need for fine-tuning to adapt to new class types.
Dynamic Few-Shot Visual Learning Without Forgetting
TLDR
This work proposes to extend an object recognition system with an attention based few-shot classification weight generator, and to redesign the classifier of a ConvNet model as the cosine similarity function between feature representations and classification weight vectors.
Metric Learning for Large Scale Image Classification: Generalizing to New Classes at Near-Zero Cost
TLDR
The goal is to devise classifiers that can incorporate new images and classes on the fly at (near) zero cost; k-nearest neighbor (k-NN) and nearest class mean (NCM) classifiers are explored to this end.
Siamese Neural Networks for One-Shot Image Recognition
TLDR
A method for learning siamese neural networks that employ a unique structure to naturally rank similarity between inputs, achieving strong results that exceed those of other deep learning models, with near state-of-the-art performance on one-shot classification tasks.
Domain Adaption in One-Shot Learning
TLDR
This paper proposes a domain adaptation framework based on adversarial networks, generalized to situations where the source and target domains have different labels, and uses a policy network, inspired by human learning behaviors, to effectively select samples from the source domain during training.
Few-Shot Adversarial Domain Adaptation
TLDR
This work provides a framework for addressing the problem of supervised domain adaptation with deep models by carefully designing a training scheme whereby the typical binary adversarial discriminator is augmented to distinguish between four different classes.