Understanding Synthetic Gradients and Decoupled Neural Interfaces
@inproceedings{Czarnecki2017UnderstandingSG,
  title     = {Understanding Synthetic Gradients and Decoupled Neural Interfaces},
  author    = {Wojciech Czarnecki and G. Swirszcz and Max Jaderberg and Simon Osindero and Oriol Vinyals and K. Kavukcuoglu},
  booktitle = {ICML},
  year      = {2017}
}
When training neural networks, Synthetic Gradients (SG) allow layers or modules to be trained without update locking, i.e., without waiting for a true error gradient to be backpropagated, resulting in Decoupled Neural Interfaces (DNIs). This ability to update parts of a neural network asynchronously and with only local information was demonstrated to work empirically in Jaderberg et al. (2016). However, there has been very little demonstration of what changes DNIs…
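To make the mechanism concrete, here is a minimal sketch (not from the paper; it assumes PyTorch, and the class and method names are illustrative): a layer is paired with a synthetic-gradient module that predicts the gradient of the loss with respect to the layer's activation, so the layer can update immediately, and the SG module is later regressed toward the true gradient once it becomes available.

```python
# Minimal sketch of a decoupled layer trained with synthetic gradients.
# Assumes PyTorch; all names here are illustrative, not from the paper.
import torch
import torch.nn as nn

class DecoupledLayer(nn.Module):
    """A layer plus a synthetic-gradient (SG) module that predicts
    dL/dh from the activation h, removing update locking."""

    def __init__(self, d_in, d_out, lr=1e-3):
        super().__init__()
        self.layer = nn.Linear(d_in, d_out)
        self.sg = nn.Linear(d_out, d_out)   # predicts dL/dh from h
        nn.init.zeros_(self.sg.weight)      # zero init: SG starts at zero,
        nn.init.zeros_(self.sg.bias)        # a common stabilising choice
        self.opt = torch.optim.SGD(self.layer.parameters(), lr=lr)
        self.sg_opt = torch.optim.SGD(self.sg.parameters(), lr=lr)

    def forward_and_update(self, x):
        """Forward pass, then update the layer from the *predicted*
        gradient, without waiting for backprop from the loss."""
        h = torch.relu(self.layer(x))
        self.opt.zero_grad()
        h.backward(self.sg(h).detach())     # backprop the synthetic gradient
        self.opt.step()
        return h.detach()                   # downstream sees a detached h

    def learn_sg(self, h, true_grad):
        """Once the true dL/dh arrives, regress the SG module toward it
        with an L2 loss."""
        self.sg_opt.zero_grad()
        loss = ((self.sg(h) - true_grad) ** 2).mean()
        loss.backward()
        self.sg_opt.step()

# Usage: the layer updates before the downstream loss is even computed.
x = torch.randn(8, 32)
block = DecoupledLayer(32, 16)
h = block.forward_and_update(x)           # layer already updated here
h.requires_grad_(True)                    # downstream part of the network
downstream_loss = (h ** 2).mean()         # stand-in for the rest of the net
true_grad, = torch.autograd.grad(downstream_loss, h)
block.learn_sg(h.detach(), true_grad)     # fit the SG to the true gradient
```

In the full method the SG module may also condition on labels, and every decoupled pair of modules communicates only through such an interface; this sketch shows a single decoupled update for one layer.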
35 Citations
- Benchmarking Decoupled Neural Interfaces with Synthetic Gradients. ArXiv, 2017.
- Learning Without Feedback: Fixed Random Learning Signals Allow for Feedforward Training of Deep Neural Networks. Frontiers in Neuroscience, 2021. (1 citation; Highly Influenced)
- Learning to solve the credit assignment problem. ICLR, 2020. (14 citations; Highly Influenced)
- Network Parameter Learning Using Nonlinear Transforms, Local Representation Goals and Local Propagation Constraints. ArXiv, 2019.
- Fast & Slow Learning: Incorporating Synthetic Gradients in Neural Memory Controllers. ArXiv, 2020.
- Decoupled Parallel Backpropagation with Convergence Guarantee. ICML, 2018. (34 citations)
References
- Toward an Integration of Deep Learning and Neuroscience. Frontiers in Computational Neuroscience, 2016. (327 citations)
- Learning in the machine: Random backpropagation and the deep learning channel. Artificial Intelligence, 2018. (34 citations)
- Direct Feedback Alignment Provides Learning in Deep Neural Networks. NIPS, 2016. (164 citations)
- Learning in the Machine: Random Backpropagation and the Learning Channel. ArXiv, 2016. (15 citations; Highly Influential)
- Random synaptic feedback weights support error backpropagation for deep learning. Nature Communications, 2016. (356 citations; Highly Influential)
- Understanding intermediate layers using linear classifier probes. ICLR, 2017. (226 citations)