Corpus ID: 237532316

Associative Memories via Predictive Coding

Tommaso Salvatori, Yuhang Song, Yujian Hong, Simon Frieder, Lei Sha, Zhenghua Xu, Rafał Bogacz, Thomas Lukasiewicz
Associative memories in the brain receive and store patterns of activity registered by the sensory neurons, and are able to retrieve them when necessary. Due to their importance in human intelligence, computational models of associative memories have been developed for several decades. In this paper, we present a novel neural model for realizing associative memories, based on a hierarchical generative network that receives external stimuli via sensory neurons. It is trained using…
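The mechanism the abstract describes, a generative network that receives a (possibly corrupted) stimulus at its sensory layer and relaxes toward a stored pattern, can be illustrated with a minimal sketch. This is not the paper's implementation: the two-layer architecture, learning rates, and iteration counts below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Minimal two-layer predictive coding network: a latent layer generates a
# prediction of the sensory layer; inference relaxes the latent activities
# to reduce the prediction error. (Illustrative sketch, not the paper's model.)
n_latent, n_sensory = 8, 16
W = rng.normal(0.0, 0.1, (n_sensory, n_latent))  # generative weights

def infer(x, W, steps=100, lr=0.05):
    """Relax latent activities z to minimize 0.5*||x - W z||^2 + 0.5*||z||^2."""
    z = np.zeros(W.shape[1])
    for _ in range(steps):
        eps = x - W @ z            # prediction error at the sensory layer
        z += lr * (W.T @ eps - z)  # gradient descent on the energy
    return z

def learn(x, W, lr=0.05):
    """After inference, update weights with a local, Hebbian-like rule."""
    z = infer(x, W)
    eps = x - W @ z
    return W + lr * np.outer(eps, z)  # prediction error times presynaptic activity

# Store one pattern by repeated presentation, then retrieve from a partial cue.
pattern = rng.choice([-1.0, 1.0], n_sensory)
for _ in range(100):
    W = learn(pattern, W)
cue = pattern.copy()
cue[:4] = 0.0                       # occlude part of the pattern
reconstruction = W @ infer(cue, W)  # read out the network's prediction
```

Retrieval here is just inference: the corrupted cue drives the latent state toward the stored pattern's representation, and the generative pass fills in the occluded entries.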


Dense Associative Memory for Pattern Recognition
The proposed duality makes it possible to apply energy-based intuition from associative memory to analyze the computational properties of neural networks with unusual activation functions: the higher rectified polynomials, which until now have not been used in deep learning.
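A dense associative memory of this kind can be sketched in a few lines: the energy sums a rectified polynomial of each stored pattern's overlap with the current state. The network sizes, cubic interaction, and update schedule below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Dense associative memory with a higher rectified polynomial energy,
# E(s) = -sum_mu F(<m_mu, s>) with F(x) = max(x, 0)^n. (Illustrative sketch.)
n_neurons, n_patterns, power = 64, 20, 3
memories = rng.choice([-1, 1], size=(n_patterns, n_neurons))

def energy(state, memories, n=power):
    return -np.sum(np.maximum(memories @ state, 0.0) ** n)

def update(state, memories, n=power):
    """One sequential sweep: set each spin to whichever value lowers the energy."""
    new = state.copy()
    for i in range(len(new)):
        plus, minus = new.copy(), new.copy()
        plus[i], minus[i] = 1, -1
        new[i] = 1 if energy(plus, memories, n) <= energy(minus, memories, n) else -1
    return new

# Retrieve a stored pattern from a corrupted cue.
target = memories[0]
cue = target.copy()
flipped = rng.choice(n_neurons, size=10, replace=False)
cue[flipped] *= -1
state = cue
for _ in range(5):
    state = update(state, memories)
```

With the cubic interaction the target pattern's overlap dominates the energy, so even a heavily corrupted cue typically falls back into the correct basin; with the quadratic F of the standard Hopfield model, 20 patterns in 64 neurons would already exceed the classical ~0.14N capacity limit.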
Deep associative neural network for associative memory based on unsupervised representation learning
The DANN is able to solve many machine learning problems, including not only classification but also depicting the data given a label and recovering corrupted images, and is optimized by a modified contrastive divergence algorithm with a novel iterated sampling process.
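The contrastive divergence update this summary refers to is, in its basic CD-1 form, a local difference of Hebbian statistics between the data and a one-step reconstruction. The sketch below shows plain CD-1 on a tiny bias-free RBM; it is not the paper's modified algorithm, and all sizes and hyperparameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Tiny restricted Boltzmann machine (biases omitted for brevity).
n_visible, n_hidden = 6, 4
W = rng.normal(0.0, 0.1, (n_visible, n_hidden))

def cd1(v0, W, lr=0.1):
    """One CD-1 step: Hebbian statistics of the data minus those of a
    one-step reconstruction."""
    ph0 = sigmoid(v0 @ W)                            # hidden probs (positive phase)
    h0 = (rng.random(n_hidden) < ph0).astype(float)  # sampled hidden states
    v1 = sigmoid(W @ h0)                             # mean-field reconstruction
    ph1 = sigmoid(v1 @ W)                            # hidden probs (negative phase)
    return W + lr * (np.outer(v0, ph0) - np.outer(v1, ph1))

# Repeatedly presenting one binary pattern drives the reconstruction toward it.
v = rng.choice([0.0, 1.0], n_visible)
for _ in range(100):
    W = cd1(v, W)
```

Both terms of the update are outer products of unit activities, so the rule is local in the Hebbian sense: each weight change depends only on its own pre- and post-synaptic units.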
An Approximation of the Error Backpropagation Algorithm in a Predictive Coding Network with Local Hebbian Synaptic Plasticity
This work shows that a network developed in the predictive coding framework can efficiently perform supervised learning fully autonomously, employing only simple local Hebbian plasticity.
Predictive Coding Can Do Exact Backpropagation on Convolutional and Recurrent Neural Networks
This work is the first to show that a biologically plausible algorithm can exactly replicate the accuracy of BP on such complex architectures, bridging the existing gap between IL and BP and achieving unprecedented performance for PCNs, which can now be considered efficient alternatives to BP.
Predictive Coding Can Do Exact Backpropagation on Any Neural Network
This is the first biologically plausible algorithm shown to be equivalent to BP in how it updates parameters on any neural network, and it is thus a significant advance for the interdisciplinary research of neuroscience and deep learning.
On a Model of Associative Memory with Huge Storage Capacity
With this interaction, Krotov and Hopfield's generalized version of the well-known Hopfield model of associative memory has an exponential storage capacity in the number of neurons, yet the basins of attraction are almost as large as in the standard Hopfield model.
Why there are complementary learning systems in the hippocampus and neocortex: insights from the successes and failures of connectionist models of learning and memory.
The account presented here suggests that memories are first stored via synaptic changes in the hippocampal system, that these changes support reinstatement of recent memories in the neocortex, that neocortical synapses change a little on each reinstatement, and that remote memory is based on accumulated neocortical changes.
Unsupervised learning by competing hidden units
A learning algorithm is designed that utilizes global inhibition in the hidden layer and is capable of learning early feature detectors in a completely unsupervised way, and which is motivated by Hebb’s idea that change of the synapse strength should be local.
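A toy version of this idea, with winner-take-all competition standing in for global inhibition and a purely local Hebbian update for the winning unit, might look as follows. The prototype-based input distribution and all hyperparameters are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(2)

# Competitive Hebbian learning: hidden units compete, and only the winner's
# weights move toward the current input -- a purely local update.
# (Winner-take-all is a simple stand-in for the paper's global inhibition.)
n_inputs, n_hidden = 16, 4
W = rng.normal(0.0, 1.0, (n_hidden, n_inputs))
W /= np.linalg.norm(W, axis=1, keepdims=True)

def step(x, W, lr=0.05):
    winner = np.argmax(W @ x)               # strongest feedforward drive wins
    W[winner] += lr * (x - W[winner])       # Hebbian pull toward the input
    W[winner] /= np.linalg.norm(W[winner])  # keep the weight vector normalized
    return W

# Inputs are noisy copies of a few prototypes; units specialize without labels.
prototypes = rng.normal(0.0, 1.0, (n_hidden, n_inputs))
prototypes /= np.linalg.norm(prototypes, axis=1, keepdims=True)
for _ in range(2000):
    k = rng.integers(n_hidden)
    x = prototypes[k] + 0.1 * rng.normal(size=n_inputs)
    W = step(x, W)
```

After training, each prototype direction is well covered by at least one hidden unit's weight vector, even though no unit ever sees a label or a global error signal.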
Associative memory optimized method on deep neural networks for image classification
This work proposes a novel classification optimization method based on deep neural networks to improve image classifiers, introducing an LSTM network into an end-to-end deep learning framework to boost their performance.
Can the Brain Do Backpropagation? - Exact Implementation of Backpropagation in Predictive Coding Networks
A BL model is proposed that produces exactly the same updates of the neural weights as BP while employing only local plasticity, i.e., all neurons perform only local computations carried out simultaneously; the model is then modified into an alternative BL model that works fully autonomously.