Corpus ID: 245704625

Exemplar-free Class Incremental Learning via Discriminative and Comparable One-class Classifiers

@article{Sun2022ExemplarfreeCI,
  title={Exemplar-free Class Incremental Learning via Discriminative and Comparable One-class Classifiers},
  author={Wenju Sun and Qingyong Li and Jing Zhang and Danyu Wang and Wen Wang and Yangli-ao Geng},
  journal={ArXiv},
  year={2022},
  volume={abs/2201.01488}
}
Exemplar-free class incremental learning requires classification models to learn new class knowledge incrementally without retaining any old samples. Recently, the framework based on parallel one-class classifiers (POC), which trains a one-class classifier (OCC) independently for each category, has attracted extensive attention, since it naturally avoids catastrophic forgetting. POC, however, suffers from weak discriminability and comparability due to its independent training strategy…
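
To make the POC setup concrete, the sketch below implements the framework as the abstract describes it, with scikit-learn's OneClassSVM standing in for the OCC. The class name, hyperparameters, and toy data are illustrative assumptions, not details from the paper. Prediction picks the class whose classifier scores a sample highest, which is exactly where the comparability problem enters: independently trained OCCs produce scores on scales that are not calibrated against each other.

```python
# Minimal sketch of a parallel one-class classifier (POC) setup.
# OneClassSVM is a stand-in OCC; the paper's actual classifier,
# hyperparameters, and data are not reproduced here.
import numpy as np
from sklearn.svm import OneClassSVM

class POCClassifier:
    """One independently trained OCC per class; no old samples retained."""

    def __init__(self):
        self.classifiers = {}  # class label -> fitted one-class model

    def learn_class(self, label, X):
        # Incremental step: fit a fresh OCC on the new class only,
        # never touching previously trained classifiers.
        occ = OneClassSVM(kernel="rbf", gamma="scale", nu=0.1)
        occ.fit(X)
        self.classifiers[label] = occ

    def predict(self, X):
        # Assign each sample to the class whose OCC scores it highest.
        # Because every OCC was trained independently, these scores are
        # not calibrated across classifiers: the comparability problem.
        labels = list(self.classifiers)
        scores = np.stack(
            [self.classifiers[c].decision_function(X) for c in labels],
            axis=1,
        )
        return [labels[i] for i in scores.argmax(axis=1)]

rng = np.random.default_rng(0)
poc = POCClassifier()
poc.learn_class(0, rng.normal(0.0, 1.0, size=(100, 2)))  # first task
poc.learn_class(1, rng.normal(4.0, 1.0, size=(100, 2)))  # later task
print(poc.predict(np.array([[0.0, 0.0], [4.0, 4.0]])))   # likely [0, 1]
```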

References

Showing 1-10 of 40 references

ILCOC: An Incremental Learning Framework based on Contrastive One-class Classifiers

  • Wenju Sun, J. Zhang
  • Computer Science
    2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
  • 2021
This paper proposes a novel Incremental Learning framework based on Contrastive One-class Classifiers (ILCOC) to avoid catastrophic forgetting, and designs a scale-boundary loss, a classifier-contrastive loss, and a negative-suppression loss to strengthen the comparability of classifier outputs and the discrimination ability of each one-class classifier.

Continual Learning by Using Information of Each Class Holistically

This paper proposes a one-class learning based technique for CL, which considers features of each class holistically rather than only the discriminative information for classifying the classes seen so far, and represents a new approach to solving the CL problem.

Large Scale Incremental Learning

  • Yue Wu, Yinpeng Chen, Y. Fu
  • Computer Science
    2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
  • 2019
This work finds that the last fully connected layer has a strong bias towards the new classes and that this bias can be corrected by a linear model; with only two bias parameters, the method performs remarkably well on two large datasets.
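
As a rough illustration of the bias-correction idea in this summary, the sketch below rescales and shifts only the new-class logits with two scalar parameters. The function name, the alpha/beta values, and the old/new split index are hypothetical; in the paper the two parameters are estimated on a small held-out set after each incremental step rather than hand-picked.

```python
# Hypothetical sketch of two-parameter linear bias correction (BiC-style):
# old-class logits pass through unchanged, new-class logits are
# corrected as alpha * logit + beta.
import numpy as np

def bias_correct(logits, num_old, alpha, beta):
    """logits: (batch, num_old + num_new) raw scores from the FC layer."""
    corrected = logits.copy()
    corrected[:, num_old:] = alpha * logits[:, num_old:] + beta
    return corrected

logits = np.array([[2.1, 0.3, 4.0, 3.9]])  # 2 old classes + 2 new classes
print(bias_correct(logits, num_old=2, alpha=0.7, beta=-0.5))
```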

iCaRL: Incremental Classifier and Representation Learning

iCaRL learns classifiers and a feature representation simultaneously, which distinguishes it from earlier works that were fundamentally limited to fixed data representations and were therefore incompatible with deep learning architectures; it can learn many classes incrementally over a long period of time where other strategies quickly fail.

A Continual Learning Survey: Defying Forgetting in Classification Tasks

This work focuses on task incremental classification, where tasks arrive sequentially and are delineated by clear boundaries, and develops a novel framework to continually determine the stability-plasticity trade-off of the continual learner.

PODNet: Pooled Outputs Distillation for Small-Tasks Incremental Learning

This work proposes PODNet, a model inspired by representation learning that fights catastrophic forgetting, even over very long runs of small incremental tasks, a setting so far unexplored by current works.

Learning Multiple Layers of Features from Tiny Images

It is shown how to train a multi-layer generative model that learns to extract meaningful features which resemble those found in the human visual cortex, using a novel parallelization algorithm to distribute the work among multiple machines connected on a network.

Overcoming Catastrophic Forgetting by Incremental Moment Matching

IMM incrementally matches the moments of the posterior distributions of the neural network trained on the first and the second task, respectively, in order to make the search space of the posterior parameters smooth.
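
A minimal sketch of the mean-IMM variant, assuming the merged solution is simply a mixing-ratio-weighted average of the parameters learned on the two tasks; the dict-of-arrays representation and the example weights are hypothetical. The full method pairs this averaging with transfer techniques that keep the loss surface between the two solutions smooth, which are omitted here.

```python
# Hypothetical mean-IMM sketch: merge two task solutions by averaging
# their parameters with mixing ratio alpha. The transfer techniques
# that smooth the path between the two solutions are omitted.
import numpy as np

def mean_imm(params_task1, params_task2, alpha=0.5):
    """params_*: dicts mapping parameter name -> weight array."""
    return {
        name: (1.0 - alpha) * params_task1[name] + alpha * params_task2[name]
        for name in params_task1
    }

p1 = {"fc.weight": np.array([[1.0, 0.0], [0.0, 1.0]])}
p2 = {"fc.weight": np.array([[0.0, 2.0], [2.0, 0.0]])}
print(mean_imm(p1, p2, alpha=0.5)["fc.weight"])  # elementwise midpoint
```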

Learning without Forgetting

This work proposes the Learning without Forgetting method, which uses only new-task data to train the network while preserving the original capabilities; it performs favorably compared to commonly used feature extraction and fine-tuning adaptation techniques.
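
A minimal sketch of the objective this summary describes: standard cross-entropy on the new task plus a distillation term that keeps the old-task heads' outputs close to those recorded from the frozen pre-update model on the same new-task inputs. The temperature, the loss weight, and the tensor shapes are illustrative assumptions, not values taken from this summary.

```python
# Sketch of a Learning-without-Forgetting style loss in PyTorch.
# T and lambda_old are assumed hyperparameters, not from the paper.
import torch
import torch.nn.functional as F

def lwf_loss(new_logits, new_targets, old_logits, recorded_old_logits,
             T=2.0, lambda_old=1.0):
    # Supervised loss on the new task's labels.
    ce = F.cross_entropy(new_logits, new_targets)
    # Distillation: match softened old-head outputs to the outputs the
    # frozen pre-update model produced for the same inputs.
    kd = F.kl_div(
        F.log_softmax(old_logits / T, dim=1),
        F.softmax(recorded_old_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return ce + lambda_old * kd

new_logits = torch.randn(8, 5, requires_grad=True)   # 5 new classes
old_logits = torch.randn(8, 10, requires_grad=True)  # 10 old classes
recorded = torch.randn(8, 10)                        # frozen model outputs
targets = torch.randint(0, 5, (8,))
print(lwf_loss(new_logits, targets, old_logits, recorded))
```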