Novel Class Discovery without Forgetting

@article{Joseph2022NovelCD,
  title={Novel Class Discovery without Forgetting},
  author={K. J. Joseph and S. Paul and Gaurav Aggarwal and Soma Biswas and Piyush Rai and Kai Han and Vineeth N. Balasubramanian},
  journal={ArXiv},
  year={2022},
  volume={abs/2207.10659}
}
Humans possess an innate ability to identify and differentiate instances that they are not familiar with, by leveraging and adapting the knowledge that they have acquired so far. Importantly, they achieve this without deteriorating the performance on their earlier learning. Inspired by this, we identify and formulate a new, pragmatic problem setting of NCDwF: Novel Class Discovery without Forgetting, which tasks a machine learning model to incrementally discover novel categories of instances…

References

Showing 1-10 of 63 references

Mnemonics Training: Multi-Class Incremental Learning Without Forgetting

TLDR
This paper proposes a novel and automatic framework, called mnemonics, which parameterizes exemplars and makes them optimizable in an end-to-end manner, and shows that using mnemonics exemplars can surpass the state-of-the-art by a large margin.
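
A minimal sketch of the central idea, assuming PyTorch, with a toy model and shapes that are purely illustrative: the stored exemplars are wrapped as a learnable parameter so gradients reach their pixels. The paper optimizes exemplars with a bi-level meta-objective; here they are simply updated to lower the classification loss of a fixed model.

import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy classifier standing in for the incremental model; it is held fixed
# while the exemplars themselves adapt.
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
for p in model.parameters():
    p.requires_grad_(False)

exemplars = nn.Parameter(torch.randn(20, 3, 32, 32))  # 20 learnable exemplar images
exemplar_labels = torch.arange(20) % 10               # their (fixed) class labels
opt = torch.optim.SGD([exemplars], lr=0.01)

for _ in range(100):
    opt.zero_grad()
    loss = F.cross_entropy(model(exemplars), exemplar_labels)
    loss.backward()  # gradients flow into the exemplar pixels directly
    opt.step()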

Incremental Object Detection via Meta-Learning

TLDR
A meta-learning approach that learns to reshape model gradients so that information is optimally shared across incremental tasks, ensuring seamless information transfer via a meta-learned gradient preconditioning that minimizes forgetting and maximizes knowledge transfer.
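
A hedged sketch, assuming PyTorch, of gradient preconditioning in isolation: each parameter's gradient is rescaled by a learnable warp before the update. The meta-learning loop that fits these warps, and the detection-specific components, are omitted; the toy model and names are illustrative.

import torch
import torch.nn as nn
import torch.nn.functional as F

model = nn.Linear(16, 4)  # toy stand-in for a detector head
warps = {n: torch.ones_like(p) for n, p in model.named_parameters()}  # preconditioners

x, y = torch.randn(8, 16), torch.randint(0, 4, (8,))
loss = F.cross_entropy(model(x), y)
grads = torch.autograd.grad(loss, list(model.parameters()))

with torch.no_grad():
    for (name, p), g in zip(model.named_parameters(), grads):
        p -= 0.1 * warps[name] * g  # update with the warped (preconditioned) gradient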

IL2M: Class Incremental Learning With Dual Memory

TLDR
This paper presents a class-incremental learning method which exploits fine-tuning and a dual memory to reduce the negative effect of catastrophic forgetting in image recognition, and shows that the proposed approach is more effective than a range of competitive state-of-the-art methods.
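
A rough sketch, assuming PyTorch, of score rectification with the second (statistics) memory: per-class mean prediction scores recorded when each old class was first learned, and under the current model, are used to rescale old-class scores at test time. The statistics below are random placeholders, and the rectification rule is simplified relative to the paper's.

import torch

num_old, num_new = 5, 5
init_mean = torch.rand(num_old) + 0.5  # mean score per old class when first learned
cur_mean = torch.rand(num_old) + 0.1   # mean score per old class under the current model

scores = torch.rand(num_old + num_new)  # softmax scores of the current model for one image
if scores.argmax() >= num_old:          # rectify only when a new class wins the argmax
    scores[:num_old] *= init_mean / cur_mean
pred = scores.argmax().item()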

Spacing Loss for Discovering Novel Categories

TLDR
This work characterizes existing NCD approaches into single-stage and two-stage methods, based on whether they require access to labeled and unlabeled data together while discovering new classes, and devises a simple yet powerful loss function that enforces separability in the latent space using cues from multi-dimensional scaling, referred to as Spacing Loss.

Learning a Unified Classifier Incrementally via Rebalancing

TLDR
This work develops a new framework for incrementally learning a unified classifier, i.e., a classifier that treats both old and new classes uniformly, and incorporates three components, cosine normalization, less-forget constraint, and inter-class separation, to mitigate the adverse effects of the imbalance.
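
Two of the three components can be sketched compactly (PyTorch assumed; all tensors are dummies): a cosine-normalized classifier, whose logits are scaled cosine similarities, and a less-forget term that keeps current features aligned with the frozen old model's features. The inter-class separation (margin) term is left out.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CosineClassifier(nn.Module):
    def __init__(self, feat_dim, num_classes):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(num_classes, feat_dim))
        self.scale = nn.Parameter(torch.tensor(10.0))

    def forward(self, feats):
        # logits are cosine similarities between normalized features and weights
        return self.scale * F.normalize(feats, dim=1) @ F.normalize(self.weight, dim=1).T

feats_new = torch.randn(8, 64, requires_grad=True)  # features from the current model
feats_old = torch.randn(8, 64)                      # features from the frozen old model
labels = torch.randint(0, 10, (8,))

clf = CosineClassifier(64, 10)
ce = F.cross_entropy(clf(feats_new), labels)
less_forget = 1 - F.cosine_similarity(feats_new, feats_old, dim=1).mean()
loss = ce + 0.5 * less_forget
loss.backward()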

A Unified Objective for Novel Class Discovery

TLDR
A UNified Objective function (UNO) for discovering novel classes is introduced, with the explicit purpose of favoring synergy between supervised and unsupervised learning; it outperforms the state of the art on several benchmarks.
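
A loose sketch, assuming PyTorch, of what a single unified objective can look like: logits span the known and novel classes jointly, so one cross-entropy handles labeled samples (ground-truth targets) and unlabeled samples (pseudo-label targets; UNO derives these with Sinkhorn-Knopp, here they are random stand-ins).

import torch
import torch.nn.functional as F

num_known, num_novel = 10, 5
logits_lab = torch.randn(8, num_known + num_novel, requires_grad=True)    # labeled batch
logits_unlab = torch.randn(8, num_known + num_novel, requires_grad=True)  # unlabeled batch

y_lab = torch.randint(0, num_known, (8,))                        # ground-truth known labels
y_unlab = torch.randint(num_known, num_known + num_novel, (8,))  # placeholder pseudo-labels

# one objective over the concatenated (known + novel) label space
loss = F.cross_entropy(logits_lab, y_lab) + F.cross_entropy(logits_unlab, y_unlab)
loss.backward()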

Neighborhood Contrastive Learning for Novel Class Discovery

TLDR
This paper addresses Novel Class Discovery (NCD), the task of unveiling new classes in a set of unlabeled samples given a labeled dataset with known classes, and builds a new framework, named Neighborhood Contrastive Learning (NCL), to learn discriminative representations that are important to clustering performance.
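
A sketch, assuming PyTorch, of the neighborhood idea in isolation: besides the usual augmented view, each sample's nearest neighbors in embedding space are treated as extra positives in a contrastive loss. Hard-negative generation and the supervised branch are omitted; the embeddings are random placeholders.

import torch
import torch.nn.functional as F

z1 = F.normalize(torch.randn(32, 128), dim=1)  # embeddings of view 1
z2 = F.normalize(torch.randn(32, 128), dim=1)  # embeddings of view 2
sim = z1 @ z2.T / 0.1                          # cosine similarities / temperature

k = 3
neighbors = (z1 @ z1.T).topk(k + 1, dim=1).indices[:, 1:]  # k nearest neighbors (skip self)
targets = F.one_hot(torch.arange(32), 32).float()
targets.scatter_(1, neighbors, 1.0)            # neighbors also count as positives
targets = targets / targets.sum(1, keepdim=True)

loss = -(targets * F.log_softmax(sim, dim=1)).sum(1).mean()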

Large Scale Incremental Learning

  • Yue Wu, Yinpeng Chen, Y. Fu
  • 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2019
TLDR
This work finds that the last fully connected layer has a strong bias towards the new classes, that this bias can be corrected by a linear model, and that with just two bias parameters the method performs remarkably well on two large datasets.
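
The two-parameter correction is simple enough to sketch directly (PyTorch assumed; logits and labels are random placeholders): a scale alpha and offset beta applied to the new-class logits are fit on a small held-out validation set while the network itself stays frozen.

import torch
import torch.nn.functional as F

num_old, num_new = 10, 10
alpha = torch.ones(1, requires_grad=True)   # the two bias-correction parameters
beta = torch.zeros(1, requires_grad=True)
opt = torch.optim.SGD([alpha, beta], lr=0.01)

val_logits = torch.randn(256, num_old + num_new)          # frozen-model outputs
val_labels = torch.randint(0, num_old + num_new, (256,))  # held-out validation labels

for _ in range(200):
    opt.zero_grad()
    corrected = torch.cat(
        [val_logits[:, :num_old], alpha * val_logits[:, num_old:] + beta], dim=1)
    F.cross_entropy(corrected, val_labels).backward()
    opt.step()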

Learning without Forgetting

TLDR
This work proposes the Learning without Forgetting method, which uses only new-task data to train the network while preserving the original capabilities, and performs favorably compared to commonly used feature extraction and fine-tuning adaptation techniques.
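
A compact sketch of the recipe, assuming PyTorch and using toy linear models as stand-ins for the real network: the old model's soft outputs on new-task inputs are recorded, and the new model is trained with a distillation term plus the ordinary new-task cross-entropy, with no old-task data involved.

import torch
import torch.nn as nn
import torch.nn.functional as F

old_model = nn.Linear(32, 5)      # frozen copy from the previous task
new_model = nn.Linear(32, 5 + 3)  # old head extended with 3 new classes
new_model.weight.data[:5] = old_model.weight.data
new_model.bias.data[:5] = old_model.bias.data

x, y = torch.randn(16, 32), torch.randint(5, 8, (16,))  # new-task data only
T = 2.0
with torch.no_grad():
    old_soft = F.softmax(old_model(x) / T, dim=1)        # recorded old responses

logits = new_model(x)
ce = F.cross_entropy(logits, y)
kd = F.kl_div(F.log_softmax(logits[:, :5] / T, dim=1), old_soft,
              reduction="batchmean") * T * T
loss = ce + kd
loss.backward()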

PODNet: Pooled Outputs Distillation for Small-Tasks Incremental Learning

TLDR
This work proposes PODNet, a model inspired by representation learning that fights catastrophic forgetting even over very long runs of small incremental tasks, a setting so far unexplored by current works.
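
A sketch of pooled-outputs distillation, assuming PyTorch and random feature maps as placeholders: intermediate activations of the old and new models are pooled along each spatial axis and the pooled statistics are kept close, a softer constraint than matching raw feature maps. Normalization details and the paper's exact weighting are omitted.

import torch
import torch.nn.functional as F

def pod_loss(feat_new, feat_old):
    # feat_*: (batch, channels, H, W) activations from matching layers
    w = (feat_new.sum(2) - feat_old.sum(2)).norm(dim=-1)  # pool over height, compare
    h = (feat_new.sum(3) - feat_old.sum(3)).norm(dim=-1)  # pool over width, compare
    return (w + h).mean()

feat_old = torch.randn(8, 64, 7, 7)                      # frozen old-model features
feat_new = torch.randn(8, 64, 7, 7, requires_grad=True)  # current-model features
loss = pod_loss(feat_new, feat_old)
loss.backward()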
...