Class Impression for Data-free Incremental Learning

@inproceedings{Ayromlou2022ClassIF,
  title={Class Impression for Data-free Incremental Learning},
  author={Sana Ayromlou and Purang Abolmaesumi and Teresa Tsang and Xiaoxiao Li},
  booktitle={MICCAI},
  year={2022}
}
Abstract

Standard deep learning-based classification approaches require collecting all samples from all classes in advance and are trained offline. This paradigm may not be practical in real-world clinical applications, where new classes are incrementally introduced through the addition of new data. Class incremental learning is a strategy allowing learning from such data. However, a major challenge is catastrophic forgetting, i.e., performance degradation on previous classes when adapting a trained…
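
To make the setting concrete, the sketch below shows naive sequential fine-tuning over a stream of tasks, which is where catastrophic forgetting appears. This is an illustration of the problem setting only, not the paper's proposed method; the function and loader names are assumptions.

```python
import torch
import torch.nn as nn

def train_incrementally(model, task_loaders, lr=1e-3):
    """Naive sequential fine-tuning: task_loaders[t] yields (x, y)
    pairs drawn only from the classes introduced at task t."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    ce = nn.CrossEntropyLoss()
    for t, loader in enumerate(task_loaders):
        for x, y in loader:
            opt.zero_grad()
            loss = ce(model(x), y)  # nothing ties the model to old classes
            loss.backward()
            opt.step()
        # Evaluating on tasks 0..t-1 at this point exposes catastrophic
        # forgetting: accuracy on earlier classes collapses, since their
        # data is no longer available to rehearse.
    return model
```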

References

Showing 1-10 of 28 references

Learning a Unified Classifier Incrementally via Rebalancing

TLDR
This work develops a new framework for incrementally learning a unified classifier, i.e., a classifier that treats both old and new classes uniformly, and incorporates three components (cosine normalization, a less-forget constraint, and inter-class separation) to mitigate the adverse effects of class imbalance.
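
Of the three components, cosine normalization is the most self-contained: logits become scaled cosine similarities, so old- and new-class weight vectors have comparable magnitudes. A minimal PyTorch sketch (the class name, initialization, and learnable-scale choice are my assumptions, not the paper's exact formulation):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CosineClassifier(nn.Module):
    """Cosine-normalized linear layer: both features and class weights
    are L2-normalized, which reduces the magnitude bias between old-
    and new-class logits in class-incremental training."""
    def __init__(self, in_features, num_classes, scale=16.0):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(num_classes, in_features) * 0.01)
        self.scale = nn.Parameter(torch.tensor(scale))  # learnable temperature

    def forward(self, feats):
        w = F.normalize(self.weight, dim=1)
        f = F.normalize(feats, dim=1)
        return self.scale * f @ w.t()
```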

Class-incremental Learning via Deep Model Consolidation

TLDR
A class-incremental learning paradigm called Deep Model Consolidation (DMC) is proposed, which works well even when the original training data is not available and demonstrates significantly better performance in image classification and object detection in the single-headed IL setting.
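
DMC trains separate models for old and new classes, then consolidates them into one student by regressing both teachers' outputs on auxiliary unlabeled data. A rough sketch of such a double distillation objective, assuming mean-centred teacher logits and an L2 loss (the function name and centring details are assumptions):

```python
import torch
import torch.nn.functional as F

def double_distillation_loss(student_logits, old_logits, new_logits):
    """On unlabeled auxiliary data, regress the student's logits onto
    the mean-centred logits of the old-class teacher (first block of
    outputs) and the new-class teacher (remaining outputs)."""
    old_t = old_logits - old_logits.mean(dim=1, keepdim=True)
    new_t = new_logits - new_logits.mean(dim=1, keepdim=True)
    target = torch.cat([old_t, new_t], dim=1).detach()
    return F.mse_loss(student_logits, target)
```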

Class-Incremental Learning via Dual Augmentation

TLDR
A simple and novel approach that employs explicit class augmentation and implicit semantic augmentation to address, respectively, the two biases in class-incremental learning.

Always Be Dreaming: A New Approach for Data-Free Class-Incremental Learning

TLDR
This work considers the high-impact problem of Data-Free Class-Incremental Learning (DFCIL), where an incremental learning agent must learn new concepts over time without storing generators or training data from past tasks, and proposes a novel incremental distillation strategy for DFCIL.

Continual Learning with Bayesian Model Based on a Fixed Pre-trained Feature Extractor

TLDR
Experiments on multiple medical and natural image classification tasks showed that the proposed approach outperforms state-of-the-art approaches, even those that keep some images of old classes during continual learning of new classes.

Boosting Few-Shot Learning With Adaptive Margin Loss

TLDR
An adaptive margin principle is proposed to improve the generalization ability of metric-based meta-learning approaches for few-shot learning, developing a class-relevant additive margin loss in which the semantic similarity between each pair of classes is used to better separate samples of similar classes in the feature embedding space.
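
One plausible form of such a loss adds a per-class-pair margin to the wrong-class logits before cross-entropy, with larger margins for semantically similar pairs. A minimal sketch under that assumption (the function name, margin parameterization, and `class_sim` matrix are all mine, not the paper's):

```python
import torch
import torch.nn.functional as F

def class_relevant_margin_loss(logits, labels, class_sim, alpha=0.5, beta=0.1):
    """Additive margin cross-entropy: `class_sim[i, j]` is a precomputed
    semantic similarity between classes i and j, so more similar classes
    receive a larger margin and are pushed further apart."""
    margins = alpha * class_sim[labels] + beta               # (batch, C)
    margins = margins.scatter(1, labels.unsqueeze(1), 0.0)   # no margin on true class
    return F.cross_entropy(logits + margins, labels)
```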

Encoder Based Lifelong Learning

TLDR
A new lifelong learning solution where a single model is trained on a sequence of tasks, using autoencoders to preserve the knowledge of previous tasks while learning a new one.
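
The idea, roughly, is to fit a small autoencoder on the old task's features and then penalize drift of the feature code while training the new task, so only the feature directions that mattered for the old task are preserved. A hedged sketch of such a constraint (assumed form; names are mine):

```python
import torch
import torch.nn.functional as F

def code_preservation_loss(encoder, feats_current, feats_old):
    """Keep the current features' code (projection through an encoder
    fit on old-task features) close to the code of the frozen old
    model's features for the same inputs."""
    with torch.no_grad():
        target = encoder(feats_old)   # features from the frozen old model
    return F.mse_loss(encoder(feats_current), target)
```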

Dreaming to Distill: Data-Free Knowledge Transfer via DeepInversion

TLDR
DeepInversion is introduced, a new method for synthesizing images from the image distribution used to train a deep neural network, which optimizes the input while regularizing the distribution of intermediate feature maps using information stored in the batch normalization layers of the teacher.
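The batch-norm regularizer is the distinctive ingredient: synthesized inputs are optimized so their per-layer feature statistics match the running mean and variance stored in the trained network's BN layers. A minimal PyTorch sketch of that term (hook-based; the helper name is mine, and in practice it is combined with a cross-entropy term toward a chosen target class):

```python
import torch
import torch.nn as nn

def bn_feature_loss(model, x):
    """Sum over all BatchNorm2d layers of the distance between the
    batch statistics of the current (synthetic) inputs and the
    running statistics stored in the trained model."""
    losses, hooks = [], []

    def make_hook(bn):
        def hook(module, inputs, output):
            feat = inputs[0]                       # (N, C, H, W)
            mean = feat.mean(dim=(0, 2, 3))
            var = feat.var(dim=(0, 2, 3), unbiased=False)
            losses.append((mean - bn.running_mean).norm() +
                          (var - bn.running_var).norm())
        return hook

    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d):
            hooks.append(m.register_forward_hook(make_hook(m)))
    model(x)                                       # forward pass fills `losses`
    for h in hooks:
        h.remove()
    return sum(losses)
```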

Learning without Forgetting

TLDR
This work proposes the Learning without Forgetting method, which uses only new task data to train the network while preserving the original capabilities, and performs favorably compared to commonly used feature extraction and fine-tuning adaptation techniques.
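
The core of LwF is a distillation term on new-task data: the updated network's outputs on the old classes are kept close to the frozen original model's temperature-softened outputs. A minimal sketch (hyperparameter names are mine):

```python
import torch.nn.functional as F

def lwf_loss(logits, labels, old_logits, T=2.0, lam=1.0):
    """Cross-entropy on the new-task labels plus KL distillation that
    anchors the old-class outputs to the frozen old model's soft
    targets; `old_logits` come from the old model on the same batch."""
    n_old = old_logits.size(1)
    ce = F.cross_entropy(logits, labels)
    distill = F.kl_div(F.log_softmax(logits[:, :n_old] / T, dim=1),
                       F.softmax(old_logits / T, dim=1),
                       reduction="batchmean") * (T * T)
    return ce + lam * distill
```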

iCaRL: Incremental Classifier and Representation Learning

TLDR
iCaRL can learn many classes incrementally over a long period of time where other strategies quickly fail; its jointly learned data representation distinguishes it from earlier works that were fundamentally limited to fixed data representations and therefore incompatible with deep learning architectures.
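
At test time, iCaRL classifies with a nearest-mean-of-exemplars rule rather than the network's output head, which is robust to representation drift. A minimal sketch of that rule (normalization details are simplified from the paper; the function name is mine):

```python
import torch
import torch.nn.functional as F

def nearest_mean_classify(feats, class_means):
    """Assign each sample to the class whose L2-normalized exemplar
    feature mean is closest to the sample's normalized feature."""
    feats = F.normalize(feats, dim=1)           # (batch, d)
    means = F.normalize(class_means, dim=1)     # (num_classes, d)
    dists = torch.cdist(feats, means)           # Euclidean distances
    return dists.argmin(dim=1)
```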