@article{Kirkpatrick2017Overcoming,
  title={Overcoming catastrophic forgetting in neural networks},
  author={James Kirkpatrick and Razvan Pascanu and Neil C. Rabinowitz and Joel Veness and Guillaume Desjardins and Andrei A. Rusu and Kieran Milan and John Quan and Tiago Ramalho and Agnieszka Grabska-Barwinska and Demis Hassabis and Claudia Clopath and Dharshan Kumaran and Raia Hadsell},
  journal={Proceedings of the National Academy of Sciences of the United States of America},
  volume={114},
  number={13},
  year={2017},
  abstract={The ability to learn tasks in a sequential fashion is crucial to the development of artificial intelligence. Until now, neural networks have not been capable of this, and it has been widely thought that catastrophic forgetting is an inevitable feature of connectionist models. We show that it is possible to overcome this limitation and train networks that can maintain expertise on tasks that they have not experienced for a long time. Our approach remembers old tasks by selectively slowing down…}
}
% 6 Figures & Tables
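
% For context on the entry above: the abstract's truncated "selectively slowing
% down…" refers to the paper's elastic weight consolidation (EWC) penalty, which
% slows learning on weights important to previous tasks. As given in the paper,
% training on task B minimizes the task-B loss plus a quadratic penalty anchored
% at the task-A solution, weighted by the diagonal Fisher information F (λ sets
% the relative importance of the old task):

```latex
\mathcal{L}(\theta) \;=\; \mathcal{L}_{B}(\theta)
  \;+\; \sum_{i} \frac{\lambda}{2}\, F_{i} \,\bigl(\theta_{i} - \theta^{*}_{A,i}\bigr)^{2}
```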