Understanding Regularisation Methods for Continual Learning

@article{Benzing2020UnderstandingRM,
  title={Understanding Regularisation Methods for Continual Learning},
  author={Frederik Benzing},
  journal={ArXiv},
  year={2020},
  volume={abs/2006.06357}
}
The problem of Catastrophic Forgetting has received much attention in recent years. An important class of proposed solutions are so-called regularisation approaches, which protect weights from large changes according to their importance. Various ways to measure this importance have been put forward, all stemming from different theoretical or intuitive motivations. We present mathematical and empirical evidence that two of these methods -- Synaptic Intelligence and Memory Aware Synapses…
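
To make the shared structure of these regularisation approaches concrete, here is a minimal PyTorch-style sketch. It is not taken from the paper; function names such as quadratic_penalty and mas_importance are illustrative. It shows the generic importance-weighted quadratic penalty common to this class of methods, together with a batch-wise approximation of the Memory Aware Synapses importance estimate:

    import torch

    def quadratic_penalty(model, omega, theta_star, lam=1.0):
        # Generic penalty shared by regularisation approaches:
        #   (lam / 2) * sum_i omega_i * (theta_i - theta_i*)^2
        # omega:      parameter name -> per-weight importance tensor
        # theta_star: parameter name -> detached copy of the weights
        #             learned on previous tasks
        penalty = 0.0
        for name, p in model.named_parameters():
            penalty = penalty + (omega[name] * (p - theta_star[name]) ** 2).sum()
        return 0.5 * lam * penalty

    def mas_importance(model, inputs):
        # Memory Aware Synapses estimates importance as the average
        # absolute sensitivity of the squared L2 norm of the network
        # output with respect to each weight. This batch-wise version
        # is exact for batch size 1 and an approximation otherwise.
        omega = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
        for x in inputs:
            model.zero_grad()
            model(x).pow(2).sum().backward()
            for n, p in model.named_parameters():
                if p.grad is not None:
                    omega[n] += p.grad.abs()
        return {n: g / len(inputs) for n, g in omega.items()}

When training on a new task, one would then minimise task_loss + quadratic_penalty(model, omega, theta_star, lam), so that weights estimated to be important for earlier tasks stay anchored near their previous values, while unimportant weights remain free to adapt.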


References (showing 10 of 49)
    Continual Learning via Neural Pruning
    Memory Aware Synapses: Learning what (not) to forget (highly influential)
    Learn to Grow: A Continual Structure Learning Framework for Overcoming Catastrophic Forgetting
    Three scenarios for continual learning
    An Empirical Investigation of Catastrophic Forgetting in Gradient-Based Neural Networks (highly influential)
    Overcoming catastrophic forgetting in neural networks (highly influential)
    Continual learning: A comparative study on how to defy forgetting in classification tasks (highly influential)
    Gradient Episodic Memory for Continual Learning
    Online Structured Laplace Approximations For Overcoming Catastrophic Forgetting (highly influential)
    Continual Learning with Deep Generative Replay