Corpus ID: 236428455

In Defense of the Learning Without Forgetting for Task Incremental Learning

Guy Oren and Lior Wolf
Catastrophic forgetting is one of the major challenges for continual learning systems, which are presented with an online stream of tasks. The field has attracted considerable interest, and a diverse set of methods has been proposed to overcome this challenge. Learning without Forgetting (LwF) is one of the earliest and most frequently cited methods. It has the advantages of not requiring the storage of samples from the previous tasks, of implementation simplicity, and of being…
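The LwF recipe the abstract alludes to regularizes new-task training so the network's outputs on old-task heads stay close to those recorded from the model before training began, via a knowledge-distillation loss. A minimal sketch of that distillation term (illustrative only, not the authors' code; the function names and temperature choice are ours):

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def lwf_distillation_loss(new_logits, old_logits, T=2.0):
    """Cross-entropy between the old model's softened outputs (the
    targets) and the new model's softened outputs on the same input.
    Minimizing this keeps old-task responses stable without storing
    any samples from the previous tasks."""
    targets = softmax(old_logits, T)
    probs = softmax(new_logits, T)
    return -sum(t * math.log(p) for t, p in zip(targets, probs))

# When the new model still matches the old one, the loss is at its
# minimum (the entropy of the target distribution); drifting raises it.
same = lwf_distillation_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])
diff = lwf_distillation_loss([3.0, 2.0, 1.0], [1.0, 2.0, 3.0])
assert diff > same
```

In practice this term is added to the new task's supervised loss, so no data from earlier tasks needs to be replayed.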



Overcoming catastrophic forgetting with hard attention to the task
Proposes a task-based hard attention mechanism that preserves previous tasks' information without affecting the current task's learning; it also allows control over both the stability and compactness of the learned knowledge, making it attractive for online learning and network compression applications.
Continual learning: A comparative study on how to defy forgetting in classification tasks
Focuses on task-incremental classification, where tasks arrive in a batch-like fashion and are delineated by clear boundaries; studies the influence of model capacity, weight decay, dropout regularization, and task ordering, and compares methods in terms of required memory, computation time, and storage.
Learn to Grow: A Continual Structure Learning Framework for Overcoming Catastrophic Forgetting
By separating explicit neural structure learning from parameter estimation, the proposed method not only evolves neural structures in an intuitively meaningful way but also shows a strong ability to alleviate catastrophic forgetting in experiments.
Continual learning with hypernetworks
Provides insight into the structure of low-dimensional task embedding spaces (the input space of the hypernetwork) and shows that task-conditioned hypernetworks exhibit transfer learning.
Encoder Based Lifelong Learning
A lifelong learning solution in which a single model is trained on a sequence of tasks, using autoencoders to preserve the knowledge of the previous tasks while learning a new one.
Overcoming Catastrophic Forgetting by Incremental Moment Matching
IMM incrementally matches the moments of the posterior distributions of the neural network trained on the first and the second task, respectively, to smooth the search space of the posterior parameters.
Gradient Episodic Memory for Continual Learning
Proposes Gradient Episodic Memory (GEM), a model for continual learning that alleviates forgetting while allowing beneficial transfer of knowledge to previous tasks.
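GEM's core idea is a gradient constraint: before applying an update, the current gradient is compared against gradients computed on an episodic memory of past tasks, and conflicting updates are projected away. A minimal sketch of the single-constraint case (GEM itself solves a quadratic program over all past tasks; the function name is ours):

```python
def project_gradient(g, g_ref):
    """If the proposed update g conflicts with the episodic-memory
    gradient g_ref (negative inner product), project g onto the
    half-space where the loss on the memory does not increase.
    Otherwise g is applied unchanged."""
    dot = sum(a * b for a, b in zip(g, g_ref))
    if dot >= 0:
        return list(g)  # no conflict with the past task
    scale = dot / sum(b * b for b in g_ref)
    return [a - scale * b for a, b in zip(g, g_ref)]

# A conflicting gradient is rotated until it is orthogonal to g_ref:
projected = project_gradient([1.0, -1.0], [0.0, 1.0])
assert projected == [1.0, 0.0]
```

After projection the inner product with the memory gradient is zero, so a small step no longer increases the past-task loss to first order.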
Overcoming Catastrophic Forgetting for Continual Learning via Model Adaptation
Proposes Parameter Generation and Model Adaptation (PGMA), a markedly different approach to the problem of catastrophic forgetting in standard neural network architectures.
Three scenarios for continual learning
Describes three continual learning scenarios based on whether task identity is provided at test time and, when it is not, whether it must be inferred; finds that regularization-based approaches fail in the hardest scenario and that replaying representations of previous experiences seems required to solve it.
Overcoming catastrophic forgetting in neural networks
Shows that it is possible to overcome this limitation of connectionist models and train networks that maintain expertise on tasks they have not experienced for a long time, by selectively slowing down learning on the weights important for previous tasks.
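The "selective slowing down" in that last entry (elastic weight consolidation) is implemented as a quadratic penalty that anchors each weight to its old value in proportion to its estimated importance (the diagonal of the Fisher information). A minimal sketch of the penalty term (illustrative only; names and the example values are ours):

```python
def ewc_penalty(theta, theta_old, fisher, lam=1.0):
    """EWC regularizer: (lam / 2) * sum_i F_i * (theta_i - theta*_i)^2.
    Weights with large Fisher values F_i were important for previous
    tasks, so moving them away from theta* is penalized heavily;
    unimportant weights (F_i near 0) remain free to change."""
    return 0.5 * lam * sum(f * (t - to) ** 2
                           for f, t, to in zip(fisher, theta, theta_old))

# Moving an unimportant weight (F=0) costs nothing; moving an
# important one (F=10) by the same amount costs 5.0 here.
assert ewc_penalty([1.0, 2.0], [1.0, 1.0], [0.0, 10.0]) == 5.0
```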