Corpus ID: 15341500

Learning in the Machine: Random Backpropagation and the Learning Channel

@article{Baldi2016LearningIT,
  title={Learning in the Machine: Random Backpropagation and the Learning Channel},
  author={P. Baldi and Peter Sadowski and Z. Lu},
  journal={ArXiv},
  year={2016},
  volume={abs/1612.02734}
}
Abstract: Random backpropagation (RBP) is a variant of the backpropagation algorithm for training neural networks, in which the transposes of the forward matrices are replaced by fixed random matrices in the calculation of the weight updates. It is remarkable both for its effectiveness, in spite of using random matrices to communicate error information, and because it completely removes the taxing requirement of maintaining symmetric weights in a physical neural system. To better understand…
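The mechanism the abstract describes can be sketched in a few lines: in the backward pass, where standard backpropagation would propagate the output error through `W2.T`, RBP propagates it through a fixed random matrix `B` instead. The toy network, layer sizes, learning rate, and variable names below are illustrative assumptions, not the paper's exact experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-layer network: x -> h = tanh(W1 @ x) -> y = W2 @ h
n_in, n_hid, n_out = 4, 8, 2
W1 = rng.normal(0, 0.5, (n_hid, n_in))
W2 = rng.normal(0, 0.5, (n_out, n_hid))

# Fixed random feedback matrix: replaces W2.T in the backward pass.
# It is drawn once and never updated.
B = rng.normal(0, 0.5, (n_hid, n_out))

def rbp_step(x, target, lr=0.05):
    """One random-backpropagation update on a single example."""
    global W1, W2
    # Forward pass
    h = np.tanh(W1 @ x)
    y = W2 @ h
    e = y - target  # output error under squared-error loss
    # Backward pass: standard BP would use (W2.T @ e); RBP uses (B @ e).
    delta_hidden = (B @ e) * (1.0 - h**2)  # tanh derivative
    W2 -= lr * np.outer(e, h)
    W1 -= lr * np.outer(delta_hidden, x)
    return float(0.5 * e @ e)

# Train on one fixed input-target pair; the loss should decrease
# even though the hidden-layer error signal is sent through B.
x = rng.normal(size=n_in)
t = np.array([1.0, -1.0])
losses = [rbp_step(x, t) for _ in range(200)]
```

Note that the top-layer update (`W2`) is the same as in ordinary backpropagation; only the error signal reaching the hidden layer is routed through the random matrix, which is what removes the weight-symmetry requirement.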
15 Citations

  • Event-Driven Random Back-Propagation: Enabling Neuromorphic Deep Learning Machines (132 citations; highly influenced)
  • Deep Supervised Learning Using Local Errors (38 citations; highly influenced)
  • Event-driven random backpropagation: Enabling neuromorphic deep learning machines (46 citations; highly influenced)
  • A Learning Framework for Winner-Take-All Networks with Stochastic Synapses (8 citations)
  • Understanding Synthetic Gradients and Decoupled Neural Interfaces (35 citations; highly influenced)
  • Biologically plausible deep learning - but how far can we go with shallow networks? (28 citations)
  • SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks (165 citations; highly influenced)
  • The Neural Coding Framework for Learning Generative Models
  • Neural and Synaptic Array Transceiver: A Brain-Inspired Computing Framework for Embedded Learning (18 citations)
  • Feedback alignment in deep convolutional networks (25 citations)

References

Showing 1-10 of 27 references

  • A theory of local learning, the learning channel, and the optimality of backpropagation (51 citations)
  • Random feedback weights support learning in deep neural networks (115 citations)
  • Understanding the difficulty of training deep feedforward neural networks (9,685 citations; highly influential)
  • The dropout learning algorithm (202 citations)
  • Complex-Valued Autoencoders (51 citations)
  • How Important Is Weight Symmetry in Backpropagation? (86 citations)
  • Quantized Neural Networks: Training Neural Networks with Low Precision Weights and Activations (991 citations)
  • A direct adaptive method for faster backpropagation learning: the RPROP algorithm (4,238 citations)
  • Neural networks and principal component analysis: Learning from examples without local minima (1,198 citations)
  • Deep Compression: Compressing Deep Neural Network with Pruning, Trained Quantization and Huffman Coding (4,330 citations)