Generalized and Incremental Few-Shot Learning by Explicit Learning and Calibration without Forgetting

  • Anna Kukleva, Hilde Kuehne, Bernt Schiele
  • 2021 IEEE/CVF International Conference on Computer Vision (ICCV)
  • 2021
Both generalized and incremental few-shot learning must deal with three major challenges: learning novel classes from only a few samples per class, preventing catastrophic forgetting of base classes, and calibrating the classifier across novel and base classes. In this work we propose a three-stage framework that explicitly and effectively addresses these challenges. While the first phase learns base classes with many samples, the second phase learns a calibrated classifier for novel…

Incremental Few-Shot Learning via Implanting and Compressing

This work proposes a two-step learning strategy, called IMCO, which optimizes both feature space partition and novel class reconstruction in a systematic manner and outperforms competing baselines by a significant margin, both in the image classification task and in the more challenging object detection task.

A Strong Baseline for Semi-Supervised Incremental Few-Shot Learning

A novel paradigm containing two parts is proposed: a well-designed meta-training algorithm for mitigating ambiguity between base and novel classes caused by unreliable pseudo labels and a model adaptation mechanism to learn discriminative features for novel classes while preserving base knowledge using few labeled and all the unlabeled data.

Variable Few Shot Class Incremental and Open World Learning

  • T. Ahmad, A. Dhamija, T. Boult
  • 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
  • 2022
The state-of-the-art FSCIL approach is extended to operate in up-to-N-ways, up-to-K-shots class-incremental and open-world settings, and a novel but simple approach for VFSCIL/VFSOWL is proposed that leverages current advancements in self-supervised feature learning.

Better Generalized Few-Shot Learning Even Without Base Data

The experimental results somewhat surprisingly show that the proposed zero-base GFSL method, which does not utilize any base samples, even outperforms the existing GFSL methods that make the best use of base data.

Forward Compatible Few-Shot Class-Incremental Learning

Virtual prototypes are assigned to squeeze the embedding of known classes and reserve space for new ones, allowing the model to accept possible updates in the future; they also act as proxies scattered across the embedding space to build a stronger classifier during inference.

Coarse-To-Fine Incremental Few-Shot Learning

This paper formulates a hybrid natural problem of coarse-to-fine few-shot (C2FS) recognition as a CIL problem named C2FSCIL, and proposes a simple, effective, and theoretically sound strategy, Knowe: learn, freeze, and normalize a classifier's weights from fine labels, once an embedding space has been learned contrastively from coarse labels.

CTR: Contrastive Training Recognition Classifier for Few-Shot Open-World Recognition

A novel methodology is proposed: the data distribution boundary Contrastive Training Recognition (CTR) classifier for few-shot OWR, which takes advantage of labels and classes to better learn the normal (and few-shot abnormal) data and thus detect OoD samples more accurately.

RMM: Reinforced Memory Management for Class-Incremental Learning

Class-Incremental Learning (CIL) [40] trains classifiers under a strict memory budget: in each incremental phase, the model learns from new data, most of which is then discarded to free space for the next phase.
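The strict memory budget described above can be illustrated with a minimal exemplar buffer (an illustrative sketch, not the paper's RMM method; the class name and the mean-distance selection rule are assumptions standing in for learned memory management):

```python
import numpy as np

class ExemplarMemory:
    """Fixed-budget exemplar store for class-incremental learning (sketch).

    After each phase, the budget is split evenly across all classes seen so
    far, and surplus exemplars per class are discarded (here: those farthest
    from the class mean, a simple stand-in for learned selection).
    """

    def __init__(self, budget):
        self.budget = budget  # total number of exemplars kept across classes
        self.store = {}       # class label -> array of feature vectors

    def add_class(self, label, features):
        self.store[label] = np.asarray(features, dtype=float)
        self._rebalance()

    def _rebalance(self):
        per_class = self.budget // len(self.store)
        for label, feats in self.store.items():
            if len(feats) > per_class:
                mean = feats.mean(axis=0)
                dist = np.linalg.norm(feats - mean, axis=1)
                keep = np.argsort(dist)[:per_class]  # keep most typical samples
                self.store[label] = feats[keep]

    def total_stored(self):
        return sum(len(f) for f in self.store.values())
```

RMM's contribution is learning *how* to split this budget between old exemplars and new data via reinforcement learning; the fixed even split above is only the baseline behavior it improves on.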

Few-Shot Incremental Learning for Label-to-Image Translation

A few-shot incremental learning method is introduced that outperforms existing related methods in most cases, achieves zero forgetting of already-learned semantic classes, and enables label-to-image translation of scenes with increasingly rich content.

Incremental few-shot learning via vector quantization in deep embedded space

The proposed learning vector quantization in deep embedded space can be customized as a kernel smoother to handle incremental few-shot regression tasks and outperforms other state-of-the-art methods in incremental learning.
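As a rough sketch of the idea above (the function names and the Gaussian kernel are assumptions, not the paper's implementation), a learned codebook can serve both as a nearest-prototype classifier and, reused as a kernel smoother, as a few-shot regressor:

```python
import numpy as np

def lvq_classify(x, prototypes, labels):
    """Nearest-prototype classification in an embedded space (sketch).

    `prototypes` is an (n, d) array of learned codebook vectors and `labels`
    their class labels; a query is assigned the label of its closest prototype.
    """
    d = np.linalg.norm(prototypes - x, axis=1)
    return labels[int(np.argmin(d))]

def kernel_smooth_regress(x, prototypes, values, bandwidth=1.0):
    """The same codebook reused as a Nadaraya-Watson kernel smoother for
    regression: the prediction is a distance-weighted average of the
    per-prototype target values."""
    d2 = np.sum((prototypes - x) ** 2, axis=1)
    w = np.exp(-d2 / (2 * bandwidth ** 2))
    return float(np.dot(w, values) / np.sum(w))
```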

Few-Shot Class-Incremental Learning

This paper proposes the TOpology-Preserving knowledge InCrementer (TOPIC) framework, which mitigates forgetting of the old classes by stabilizing the topology of a neural gas (NG) network and improves representation learning for few-shot new classes by growing and adapting the NG to new training samples.

Learning to Compare: Relation Network for Few-Shot Learning

A conceptually simple, flexible, and general framework for few-shot learning is presented, in which a classifier must learn to recognise new classes given only a few examples of each; the framework is easily extended to zero-shot learning.
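The core mechanism, pairing a query embedding with each support embedding and scoring the pair with a learned relation module, can be sketched as follows (`toy_relation` is a hypothetical stand-in for the learned module, which in the paper is a small neural network):

```python
import numpy as np

def relation_scores(query_emb, support_embs, relation_fn):
    """Relation-network-style scoring (sketch): concatenate the query
    embedding with each class's support embedding and let a relation module
    output a similarity score for the pair."""
    pairs = [np.concatenate([query_emb, s]) for s in support_embs]
    return np.array([relation_fn(p) for p in pairs])

def toy_relation(pair):
    """Hypothetical stand-in for the learned relation module: negative
    squared distance between the two halves of the concatenated pair."""
    d = len(pair) // 2
    return -float(np.sum((pair[:d] - pair[d:]) ** 2))
```

The query is then assigned to the class whose support pairing scores highest.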

Relational Generalized Few-Shot Learning

This work proposes a graph-based framework that explicitly models relationships between all seen and novel classes in the joint label space of generalized few-shot learning and incorporates these inter-class relations using graph-convolution in order to embed novel class representations into the existing space of previously seen classes in a globally consistent manner.
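A single graph-convolution propagation step of the kind described, mixing class representations over a class-relation graph, might be sketched as follows (a generic row-normalized formulation, not the paper's exact operator):

```python
import numpy as np

def graph_conv_step(adj, feats, weight):
    """One graph-convolution step (sketch): each class representation is
    updated as a degree-normalized average of its neighbors' features
    (including itself via a self-loop), then linearly transformed. Stacking
    such steps lets novel-class representations absorb information from
    related base classes."""
    a_hat = adj + np.eye(adj.shape[0])       # add self-loops
    deg = a_hat.sum(axis=1, keepdims=True)
    return (a_hat / deg) @ feats @ weight    # row-normalized propagation
```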

Few-Shot Learning With Global Class Representations

This paper proposes to tackle the challenging few-shot learning (FSL) problem by learning global class representations using both base and novel class training samples, and an effective sample synthesis strategy is developed to avoid overfitting.

Generalized Many-Way Few-Shot Video Classification

A simple 3D CNN baseline is developed that surpasses existing methods by a large margin, and it is proposed to leverage weakly-labeled videos from a large dataset via tag retrieval followed by selecting the best clips by visual similarity, yielding further improvement.

Cognitively-Inspired Model for Incremental Learning Using a Few Examples

  • Ali Ayub, Alan R. Wagner
  • 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
  • 2020
This work proposes a novel approach inspired by the concept learning model of the hippocampus and the neocortex that represents each image class as centroids and does not suffer from catastrophic forgetting when learning classes incrementally.
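The centroid idea can be sketched minimally: each class is summarized by the mean of its features, and learning a new class never touches earlier centroids, so nothing is forgotten (an illustrative single-centroid-per-class sketch, not the paper's full model):

```python
import numpy as np

class CentroidLearner:
    """Incremental nearest-centroid classifier (sketch): classes are learned
    one at a time, each stored as a mean feature vector; old centroids are
    never revisited or overwritten, avoiding catastrophic forgetting."""

    def __init__(self):
        self.centroids = {}  # class label -> mean feature vector

    def learn_class(self, label, features):
        self.centroids[label] = np.asarray(features, dtype=float).mean(axis=0)

    def predict(self, x):
        dists = {lbl: np.linalg.norm(c - x) for lbl, c in self.centroids.items()}
        return min(dists, key=dists.get)  # label of the nearest centroid
```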

Incremental Few-Shot Learning with Attention Attractor Networks

A meta-learning model, the Attention Attractor Network, which regularizes the learning of novel classes, and it is demonstrated that the learned attractor network can help recognize novel classes while remembering old classes without the need to review the original training set.

Few-Shot Learning via Embedding Adaptation With Set-to-Set Functions

This paper proposes a novel approach to adapt the instance embeddings to the target classification task with a set-to-set function, yielding embeddings that are task-specific and discriminative.

A Baseline for Few-Shot Image Classification

This work performs extensive studies on benchmark datasets to propose a metric that quantifies the "hardness" of a few-shot episode, and finds that using a large number of meta-training classes results in high few-shot accuracies even for a large number of few-shot classes.