Memorizing Complementation Network for Few-Shot Class-Incremental Learning

Zhong Ji, Zhi Hou, Xiyao Liu, Yanwei Pang, Xuelong Li

Few-shot Class-Incremental Learning (FSCIL) aims to learn new concepts continually from only a few samples, and is therefore prone to catastrophic forgetting and overfitting. The inaccessibility of old classes and the scarcity of novel samples make it difficult to balance retaining old knowledge against learning novel concepts. Inspired by the observation that different models memorize different knowledge when learning novel concepts, we propose a Memorizing…

Few-Shot Class-Incremental Learning

This paper proposes the TOpology-Preserving knowledge InCrementer (TOPIC) framework, which mitigates forgetting of the old classes by stabilizing the topology of a neural gas (NG) network, and improves representation learning for few-shot new classes by growing and adapting the NG to new training samples.

Semantic-aware Knowledge Distillation for Few-Shot Class-Incremental Learning

A distillation algorithm is introduced to address FSCIL, together with a method that uses an attention mechanism over multiple parallel embeddings of visual data to align visual and semantic vectors, which reduces catastrophic forgetting.
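The distillation component builds on the standard knowledge-distillation loss: the incrementally updated model is trained to match the old model's softened output distribution, which helps retain old-class knowledge. A generic sketch of that loss, not the paper's exact formulation:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    z = logits / temperature
    z = z - z.max()                 # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Cross-entropy between the teacher's temperature-softened
    distribution and the student's, scaled by T^2 as in standard
    knowledge distillation. Illustrative sketch only.
    """
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    return float(-(p_teacher * np.log(p_student + 1e-12)).sum() * temperature ** 2)
```

The loss is smallest when the student reproduces the teacher's distribution, so minimizing it while learning new classes discourages drift on the old ones.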

Overcoming Catastrophic Forgetting in Incremental Few-Shot Learning by Finding Flat Minima

This paper considers incremental few-shot learning, which requires a model to continually recognize new categories with only a few examples provided, and proposes to search for flat local minima of the base training objective function and then fine-tune the model parameters within the flat region on new tasks.
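The flat-minima idea above can be sketched as a perturbed-loss surrogate: instead of minimizing the plain base-training loss, minimize the loss averaged over small random weight perturbations, which favors flat regions where nearby weights also perform well. A minimal illustration, not the paper's actual algorithm; `loss_fn` and the noise scale are assumptions:

```python
import numpy as np

def flatness_aware_loss(params, loss_fn, noise_scale=0.01, n_samples=8, seed=0):
    """Average loss over small random perturbations of the weights.

    `params` is a list of weight arrays; `loss_fn` maps such a list to a
    float. Minimizing this surrogate instead of loss_fn(params) alone
    biases training toward flat minima. Illustrative sketch only.
    """
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_samples):
        noisy = [p + rng.normal(0.0, noise_scale, size=p.shape) for p in params]
        total += loss_fn(noisy)
    return total / n_samples
```

Fine-tuning on new tasks can then be restricted to the neighborhood found this way, since the loss stays low throughout a flat region.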

MgSvF: Multi-Grained Slow vs. Fast Framework for Few-Shot Class-Incremental Learning

This paper proposes a multi-grained slow vs. fast (SvF) learning strategy to cope with the SvF dilemma at two different grains: intra-space and inter-space (between two different feature spaces). The strategy designs a novel frequency-aware regularization to boost the intra-space SvF capability, and develops a new feature space composition operation to enhance the inter-space SvF learning performance.

Self-Promoted Prototype Refinement for Few-Shot Class-Incremental Learning

This work proposes a novel incremental prototype learning scheme which consists of a random episode selection strategy that adapts the feature representation to various generated incremental episodes to enhance the corresponding extensibility, and a self-promoted prototype refinement mechanism which strengthens the expression ability of the new classes by explicitly considering the dependencies among different classes.

Few-Shot Incremental Learning with Continually Evolved Classifiers

This paper adopts a simple but effective decoupled learning strategy for representations and classifiers, in which only the classifiers are updated in each incremental session, avoiding knowledge forgetting in the representations; it further proposes a Continually Evolved Classifier (CEC) that employs a graph model to propagate context information between classifiers for adaptation.
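The decoupled strategy can be illustrated with a frozen feature extractor and per-session prototype classifiers: each session only adds classifier weights, one mean-feature prototype per new class. A simplified sketch; CEC's graph-based context propagation is omitted, and all names are hypothetical:

```python
import numpy as np

def session_classifier_weights(features, labels, num_classes):
    """With the backbone frozen, an incremental session only computes
    new classifier weights: the mean feature vector of each class.
    Sketch of the decoupled idea; CEC's graph adaptation is omitted.
    """
    dim = features.shape[1]
    weights = np.zeros((num_classes, dim))
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            weights[c] = features[mask].mean(axis=0)
    return weights

def predict(features, weights):
    # cosine-similarity classification against the prototype weights
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    w = weights / (np.linalg.norm(weights, axis=1, keepdims=True) + 1e-12)
    return (f @ w.T).argmax(axis=1)
```

Because the representation never changes, old-class prototypes remain valid across sessions; only the classifier set grows.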

Synthesized Feature based Few-Shot Class-Incremental Learning on a Mixture of Subspaces

This paper proposes to employ a variational autoencoder (VAE) to generate synthesized visual samples that augment the pseudo-features while learning novel classes incrementally on a mixture of subspaces, reducing the forgetting and overfitting problems of FSCIL.
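The feature-synthesis step above amounts to sampling latent codes and decoding them into pseudo-features; in the sketch below, `decoder` is a stand-in for a trained (possibly class-conditional) VAE decoder, and all names are hypothetical:

```python
import numpy as np

def synthesize_features(decoder, n_samples, latent_dim, seed=0):
    """Draw latent codes z ~ N(0, I) and decode them into synthetic
    feature vectors that can augment the few real novel-class samples.
    `decoder` stands in for a trained VAE decoder; illustrative only.
    """
    rng = np.random.default_rng(seed)
    z = rng.normal(size=(n_samples, latent_dim))
    return np.stack([decoder(zi) for zi in z])
```

The synthesized features are then mixed with the real few-shot samples during incremental training, which is what counteracts overfitting to the handful of genuine examples.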

Learning to Compare: Relation Network for Few-Shot Learning

A conceptually simple, flexible, and general framework for few-shot learning, in which a classifier must learn to recognise new classes given only a few examples of each; it extends easily to zero-shot learning.
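The compare-to-classify idea can be sketched with a tiny relation module: a query embedding is concatenated with each class embedding and scored by a small learned network, and the highest-scoring class wins. The weights below are random stand-ins for trained parameters:

```python
import numpy as np

def relation_score(query, support, W1, W2):
    """Score how related a query embedding is to a support (class)
    embedding: concatenate the pair and pass it through a tiny MLP
    ending in a sigmoid, giving a learned similarity in (0, 1).
    """
    pair = np.concatenate([query, support])
    hidden = np.maximum(W1 @ pair, 0.0)          # ReLU hidden layer
    return float(1.0 / (1.0 + np.exp(-(W2 @ hidden))))  # sigmoid output

def classify_by_relation(query, class_embeddings, W1, W2):
    # predict the class whose embedding the relation module scores highest
    scores = [relation_score(query, c, W1, W2) for c in class_embeddings]
    return int(np.argmax(scores))
```

The key design choice is that similarity itself is learned rather than fixed (as a cosine or Euclidean metric would be), which is what makes the framework extensible to zero-shot settings where class embeddings come from semantic descriptions.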

Few-Shot Lifelong Learning

A novel Few-Shot Lifelong Learning (FSLL) method that enables deep learning models to perform lifelong/continual learning on few-shot data; it minimizes the cosine similarity between the new and the old class prototypes in order to maximize their separation, thereby improving classification performance.
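The prototype-separation objective mentioned above can be written as the mean pairwise cosine similarity between new and old prototypes; driving it down pushes the two sets apart. An illustrative sketch of that term:

```python
import numpy as np

def prototype_separation_loss(new_protos, old_protos):
    """Mean cosine similarity between every new/old prototype pair.

    Each row is one prototype vector. Minimizing this value increases
    the angular separation between new-class and old-class prototypes.
    Sketch of the idea only, not the paper's full objective.
    """
    n = new_protos / np.linalg.norm(new_protos, axis=1, keepdims=True)
    o = old_protos / np.linalg.norm(old_protos, axis=1, keepdims=True)
    return float((n @ o.T).mean())
```

The value ranges from -1 (opposite directions) through 0 (orthogonal) to 1 (identical directions), so a well-separated prototype set scores near or below zero.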

Diversity With Cooperation: Ensemble Methods for Few-Shot Classification

This work shows that by addressing the fundamental high-variance issue of few-shot learning classifiers, it is possible to significantly outperform current meta-learning techniques.