Mnemonics Training: Multi-Class Incremental Learning Without Forgetting

@inproceedings{Liu2020MnemonicsTM,
  title={Mnemonics Training: Multi-Class Incremental Learning Without Forgetting},
  author={Yaoyao Liu and Anan Liu and Yuting Su and Bernt Schiele and Qianru Sun},
  booktitle={2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2020},
  pages={12242-12251}
}
Multi-Class Incremental Learning (MCIL) aims to learn new concepts by incrementally updating a model trained on previous concepts. However, there is an inherent trade-off between effectively learning new concepts and avoiding catastrophic forgetting of previous ones. To alleviate this issue, it has been proposed to keep a few examples of the previous concepts, but the effectiveness of this approach depends heavily on how representative those examples are. This paper proposes a novel and automatic framework, called mnemonics, in which the exemplars themselves are parameterized and optimized in an end-to-end manner.
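The exemplar-replay idea the abstract refers to (keeping a few examples of previous concepts and mixing them into training on later phases) can be illustrated with a short sketch. The snippet below is not the paper's mnemonics method, which parameterizes and optimizes the exemplars; the synthetic data, toy PyTorch model, and naive random exemplar selection are purely illustrative assumptions.

# Minimal sketch of exemplar-replay class-incremental learning (rehearsal):
# keep a few examples of old classes and mix them into training on new classes.
# NOT the paper's mnemonics method; data, model, and selection policy are toy assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset, ConcatDataset

torch.manual_seed(0)

def make_phase(classes, n_per_class=100, dim=32):
    """Synthetic data: one Gaussian blob per class (stand-in for real images)."""
    xs, ys = [], []
    for c in classes:
        xs.append(torch.randn(n_per_class, dim) + 3.0 * c)
        ys.append(torch.full((n_per_class,), c, dtype=torch.long))
    return TensorDataset(torch.cat(xs), torch.cat(ys))

model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
opt = torch.optim.SGD(model.parameters(), lr=0.1)

exemplars = []            # small memory of (x, y) pairs from previous phases
EXEMPLARS_PER_CLASS = 10  # the "few examples" kept per old class

phases = [[0, 1, 2, 3, 4], [5, 6], [7, 8, 9]]  # class split per incremental phase
for phase_classes in phases:
    new_data = make_phase(phase_classes)
    # Train on the new classes plus the replayed exemplars of old classes.
    train_set = new_data
    if exemplars:
        xs, ys = zip(*exemplars)
        train_set = ConcatDataset([new_data,
                                   TensorDataset(torch.stack(xs), torch.stack(ys))])
    loader = DataLoader(train_set, batch_size=32, shuffle=True)
    for epoch in range(5):
        for x, y in loader:
            opt.zero_grad()
            F.cross_entropy(model(x), y).backward()
            opt.step()
    # After each phase, store a few exemplars per new class (random here;
    # herding or the paper's learned mnemonics would pick/optimize better ones).
    for c in phase_classes:
        idx = (new_data.tensors[1] == c).nonzero(as_tuple=True)[0][:EXEMPLARS_PER_CLASS]
        exemplars += [(new_data.tensors[0][i], new_data.tensors[1][i]) for i in idx]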

Citations

Essentials for Class Incremental Learning
Few-Shot Incremental Learning with Continually Evolved Classifiers
ZS-IL: Looking Back on Learned Experiences For Zero-Shot Incremental Learning
Unsupervised Continual Learning Via Pseudo Labels
  • Jiangpeng He, Feng Zhu
  • ArXiv, 2021
Learnable Expansion-and-Compression Network for Few-shot Class-Incremental Learning
Continual Learning From Unlabeled Data Via Deep Clustering
  • Jiangpeng He
  • 2021
Does Continual Learning = Catastrophic Forgetting?
A Comprehensive Study of Class Incremental Learning Algorithms for Visual Tasks
Balanced Softmax Cross-Entropy for Incremental Learning

References

Showing 1-10 of 41 references
End-to-End Incremental Learning
Learning a Unified Classifier Incrementally via Rebalancing
A Strategy for an Uncompromising Incremental Learner
Large Scale Incremental Learning
  • Yue Wu, Yinpeng Chen, et al.
  • 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Meta-Transfer Learning through Hard Tasks
From N to N+1: Multiclass Transfer Incremental Learning
Meta-Transfer Learning for Few-Shot Learning
Measuring Catastrophic Forgetting in Neural Networks
Overcoming Catastrophic Forgetting for Continual Learning via Model Adaptation
Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks