Corpus ID: 219573283

Understanding Regularisation Methods for Continual Learning

@article{benzing2020understanding,
  title={Understanding Regularisation Methods for Continual Learning},
  author={Frederik Benzing},
  journal={ArXiv},
  year={2020}
}

  • Frederik Benzing
  • Published 2020
  • Computer Science, Mathematics
  • ArXiv

Abstract: The problem of catastrophic forgetting has received a lot of attention in recent years. An important class of proposed solutions are so-called regularisation approaches, which protect weights from large changes according to their importance. Various ways to measure this importance have been put forward, all stemming from different theoretical or intuitive motivations. We present mathematical and empirical evidence that two of these methods -- Synaptic Intelligence and Memory Aware Synapses…
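The regularisation approaches the abstract describes share one mechanism: a quadratic penalty that discourages changes to each weight in proportion to a per-weight importance estimate; methods such as Synaptic Intelligence and Memory Aware Synapses differ mainly in how that importance is computed. A minimal sketch of the shared penalty, assuming illustrative names (`anchor` for the weights saved after the previous task, `importance` for the per-weight estimates) not taken from the paper:

```python
import numpy as np

def regularisation_penalty(params, anchor, importance, lam=1.0):
    """Penalise deviation from the previous task's weights:
    lam * sum_i importance_i * (params_i - anchor_i)**2
    (the common form of the penalty; importance_i is method-specific).
    """
    return lam * np.sum(importance * (params - anchor) ** 2)

# Illustrative numbers: weights drifting while training on a new task.
anchor = np.array([1.0, -0.5, 2.0])      # weights after the previous task
params = np.array([1.1, -0.5, 1.0])      # current weights on the new task
importance = np.array([10.0, 1.0, 0.1])  # per-weight importance estimates

penalty = regularisation_penalty(params, anchor, importance)
print(round(penalty, 6))  # 0.2
```

Note how a small change to a high-importance weight (0.1 shift at importance 10) costs as much as a large change to a low-importance one (1.0 shift at importance 0.1), which is exactly how these methods trade plasticity for stability.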

    References


    • Continual Learning via Neural Pruning
    • Memory Aware Synapses: Learning what (not) to forget (Highly Influential)
    • Learn to Grow: A Continual Structure Learning Framework for Overcoming Catastrophic Forgetting
    • Three scenarios for continual learning
    • An Empirical Investigation of Catastrophic Forgetting in Gradient-Based Neural Networks (Highly Influential)
    • Overcoming catastrophic forgetting in neural networks (Highly Influential)
    • Continual learning: A comparative study on how to defy forgetting in classification tasks (Highly Influential)
    • Gradient Episodic Memory for Continual Learning
    • Online Structured Laplace Approximations For Overcoming Catastrophic Forgetting (Highly Influential)
    • Continual Learning with Deep Generative Replay