• Mathematics, Computer Science, Medicine
  • Published in ICML 2017

Continual Learning Through Synaptic Intelligence

@article{Zenke2017ContinualLT,
  title={Continual Learning Through Synaptic Intelligence},
  author={Friedemann Zenke and Ben Poole and Surya Ganguli},
  journal={Proceedings of Machine Learning Research},
  year={2017},
  volume={70},
  pages={3987--3995}
}
While deep learning has led to remarkable advances across diverse applications, it struggles in domains where the data distribution changes over the course of learning. In stark contrast, biological neural networks continually adapt to changing domains, possibly by leveraging complex molecular machinery to solve many tasks simultaneously. In this study, we introduce intelligent synapses that bring some of this biological complexity into artificial neural networks. Each synapse accumulates task…
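The abstract's idea of synapses that accumulate task-relevant information can be sketched numerically. Below is a minimal, hedged sketch of the synaptic-intelligence (SI) importance estimate on toy quadratic losses: each parameter's running contribution to the loss decrease is integrated along the training trajectory, normalized by its total displacement, and then used to anchor important parameters on the next task. The variable names (omega, Omega, xi, c) follow the paper's notation, but the tasks, learning rate, and regularization strength here are illustrative assumptions, not the paper's experiments.

```python
import numpy as np

def train_task(theta, grad_fn, lr=0.1, steps=200):
    """SGD on one task, accumulating the per-parameter path integral omega."""
    omega = np.zeros_like(theta)
    theta_start = theta.copy()
    for _ in range(steps):
        g = grad_fn(theta)
        delta = -lr * g          # parameter update this step
        omega += -g * delta      # running contribution to the loss decrease
        theta = theta + delta
    return theta, omega, theta_start

xi = 1e-3  # damping term: keeps the importance finite when a parameter barely moves

# Task A (toy): loss (theta0 - 2)^2 + theta1^2, so only theta0 matters here.
theta = np.zeros(2)
grad_a = lambda t: 2 * (t - np.array([2.0, 0.0]))
theta, omega, theta_start = train_task(theta, grad_a)

# Consolidate: normalize omega by the squared total displacement to get the
# per-synapse importance Omega, then anchor the parameters at their task-A values.
Omega = omega / ((theta - theta_start) ** 2 + xi)
anchor = theta.copy()
c = 1.0  # regularization strength (illustrative)

# Task B (toy): loss pulls theta0 toward 0 and theta1 toward 1; the SI penalty
# c * Omega_k * (theta_k - anchor_k)^2 resists forgetting task A.
def grad_b(t):
    task_grad = 2 * (t - np.array([0.0, 1.0]))
    return task_grad + 2 * c * Omega * (t - anchor)

theta_b, _, _ = train_task(theta, grad_b)
```

After task B, `theta_b[0]` settles at a compromise between the two task optima (2 and 0) because its importance `Omega[0]` is large, while `theta_b[1]`, which was irrelevant to task A, moves freely to task B's optimum of 1. This separation of plastic and consolidated synapses is the behavior the abstract describes.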


Citations

Publications citing this paper.
SHOWING 1-10 OF 211 CITATIONS

  • Attention-Based Selective Plasticity (12 excerpts; cites background, results & methods; highly influenced)

  • Attention-Based Structural Plasticity (12 excerpts; cites background, results & methods; highly influenced)

  • Continual Learning with Tiny Episodic Memories (11 excerpts; cites background & methods; highly influenced)

  • Learning with Long-Term Remembering: Fol… (12 excerpts; cites methods & background; highly influenced)

  • Continual Learning with Hypernetworks (9 excerpts; cites methods, results & background; highly influenced)

  • Continual Learning via Online Leverage Score Sampling (13 excerpts; cites background & methods; highly influenced)


CITATION STATISTICS

  • 53 highly influenced citations

  • Averaged 69 citations per year from 2017 through 2019

  • 137% increase in citations per year in 2019 over 2018

References

Publications referenced by this paper.
SHOWING 1-10 OF 28 REFERENCES

An Empirical Investigation of Catastrophic Forgetting in Gradient-Based Neural Networks

Ian J. Goodfellow et al.
  • 2013
  • 12 excerpts; highly influential

Learning without Forgetting

  • IEEE Transactions on Pattern Analysis and Machine Intelligence
  • 2016

Deep learning

Geoffrey Hinton
  • 2015
