Corpus ID: 14772035

A wake-sleep algorithm for recurrent, spiking neural networks

@article{Thiele2017AWA,
  title={A wake-sleep algorithm for recurrent, spiking neural networks},
  author={Johannes C. Thiele and Peter U. Diehl and Matthew Cook},
  journal={ArXiv},
  year={2017},
  volume={abs/1703.06290}
}
We investigate a recently proposed model for cortical computation which performs relational inference. It consists of several interconnected, structurally equivalent populations of leaky integrate-and-fire (LIF) neurons, which are trained in a self-organized fashion with spike-timing-dependent plasticity (STDP). Despite its robust learning dynamics, the model is susceptible to a problem typical for recurrent networks which use a correlation-based (Hebbian) learning rule: if trained with high…
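As a rough illustration of the two ingredients the abstract names, LIF dynamics and pairwise STDP, a minimal sketch can be written as below. This is not the paper's implementation; all parameter names and values are assumptions.

```python
import math

def simulate_lif(input_current, dt=1.0, tau_m=20.0, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0):
    """Return spike times of a leaky integrate-and-fire neuron driven
    by a sampled input current (illustrative parameter values)."""
    v = v_rest
    spikes = []
    for step, i_in in enumerate(input_current):
        # Leaky integration: dv/dt = (v_rest - v)/tau_m + I
        v += dt * ((v_rest - v) / tau_m + i_in)
        if v >= v_thresh:
            spikes.append(step * dt)
            v = v_reset
    return spikes

def stdp_dw(t_pre, t_post, a_plus=0.01, a_minus=0.012,
            tau_plus=20.0, tau_minus=20.0):
    """Pairwise (correlation-based) STDP weight change for one
    pre/post spike pair."""
    dt = t_post - t_pre
    if dt >= 0:   # pre before post -> potentiation
        return a_plus * math.exp(-dt / tau_plus)
    else:         # post before pre -> depression
        return -a_minus * math.exp(dt / tau_minus)
```

Because potentiation depends only on pre-before-post correlations, repeatedly co-active neurons keep strengthening their mutual weights; this is the runaway tendency of Hebbian rules in recurrent networks that the abstract alludes to.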

Papers citing this work

Learning to Generate Sequences with Combination of Hebbian and Non-hebbian Plasticity in Recurrent Spiking Neural Networks
TLDR: It is shown that including an adaptive decay of synaptic weights alongside standard STDP helps learn stable contextual dependencies between temporal sequences, while reducing the strong attractor states that emerge in recurrent models due to feedback loops.

A Spiking Network for Inference of Relations Trained with Neuromorphic Backpropagation
TLDR: The architecture is the first spiking neural network with on-chip learning capabilities that can perform relational inference on complex visual stimuli, which makes the system interesting for sensor-fusion applications and embedded learning in autonomous neuromorphic agents.

First-Spike-Based Visual Categorization Using Reward-Modulated STDP
TLDR: For the first time, it is shown that RL can be used efficiently to train a spiking neural network (SNN) to perform object recognition in natural images without using an external classifier.

Combining STDP and Reward-Modulated STDP in Deep Convolutional Spiking Neural Networks for Digit Recognition
TLDR: A deep convolutional spiking neural network (DCSNN) and a latency-coding scheme are used that are biologically plausible, hardware-friendly, and energy-efficient; it is demonstrated that R-STDP extracts features that are diagnostic for the task at hand and discards the others, whereas STDP extracts any feature that repeats.

Integration of sleep drive and navigation in Drosophila
TLDR: A ring-attractor model that maintains activity at a setpoint in the face of plasticity is developed, capturing features of the neural dynamics observed in flies and mice during wakefulness and sleep.

The overfitted brain: Dreams evolved to assist generalization

A wavelet-based neural network scheme for supervised and unsupervised learning
TLDR: A scheme for supervised and unsupervised learning based on successive decomposition of random inputs by means of a wavelet basis is introduced, and it is shown that when the estimated means and covariances converge, the sequence of Gaussian distributions of the scheme's inner representation of the world also converges.

Deep Belief Networks Based Toponym Recognition for Chinese Text
TLDR: An adapted toponym-recognition approach based on a deep belief network (DBN) is proposed, exploring two key issues, word representation and model interpretation; the DBN-based approach is argued to be a promising and powerful method for extracting geo-referenced information from text.

References

SHOWING 1-10 OF 17 REFERENCES
Self-Organizing Spiking Neural Model for Learning Fault-Tolerant Spatio-Motor Transformations
TLDR: A spiking neural model, a multilayered architecture of integrate-and-fire neurons whose synapses employ the spike-timing-dependent plasticity learning rule, learns spatio-motor transformations, laying the foundation for learning other complex functions and transformations in real-world scenarios.

Unsupervised Learning of Visual Features through Spike Timing Dependent Plasticity
TLDR: The results show that temporal codes may be a key to understanding the phenomenal processing speed achieved by the visual system and that STDP can lead to fast and selective responses.

Learning and Inferring Relations in Cortical Networks
TLDR: This work shows how uniform modules of excitatory and inhibitory neurons can be connected bidirectionally in a network that, when exposed to input in the form of population codes, learns the input encodings as well as the relationships between the inputs.

Triplets of Spikes in a Model of Spike Timing-Dependent Plasticity
TLDR: A triplet rule, which considers sets of three spikes, is examined; it can fit experimental data from visual cortical slices as well as from hippocampal cultures, and can be mapped to a Bienenstock–Cooper–Munro learning rule.

'Unlearning' has a stabilizing effect in collective memories
TLDR: Although the model was not motivated by higher nervous function, the system displays behaviours which are strikingly parallel to those needed for the hypothesized role of 'unlearning' in rapid eye movement (REM) sleep.
The "wake-sleep" algorithm for unsupervised neural networks.
An unsupervised learning algorithm for a multilayer network of stochastic neurons is described. Bottom-up "recognition" connections convert the input into representations in successive hidden layers…
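The wake-sleep scheme this reference describes (Hinton et al.) alternates a wake phase, which trains the top-down generative weights on real data, with a sleep phase, which trains the bottom-up recognition weights on generated "fantasies". A toy single-hidden-layer sketch follows; the layer sizes, learning rate, absence of biases, and the uniform dream prior are simplifying assumptions, not the original formulation.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sample(probs):
    # Draw binary states from independent Bernoulli probabilities.
    return [1.0 if random.random() < p else 0.0 for p in probs]

def matvec(W, v):
    return [sum(w * x for w, x in zip(row, v)) for row in W]

def wake_sleep_step(x, R, G, lr=0.1):
    """One wake-sleep update. R: recognition weights (hidden x visible),
    G: generative weights (visible x hidden). Toy version without biases."""
    # Wake phase: infer hidden states with R, then train G to
    # reconstruct the observed x from those states (delta rule).
    h = sample([sigmoid(z) for z in matvec(R, x)])
    x_gen = [sigmoid(z) for z in matvec(G, h)]
    for i in range(len(G)):
        for j in range(len(h)):
            G[i][j] += lr * (x[i] - x_gen[i]) * h[j]
    # Sleep phase: fantasize (h, x) from the generative side, then train
    # R to recover the fantasized hidden states from the fantasized x.
    h_dream = sample([0.5] * len(R))          # crude uniform "prior"
    x_dream = sample([sigmoid(z) for z in matvec(G, h_dream)])
    h_rec = [sigmoid(z) for z in matvec(R, x_dream)]
    for j in range(len(R)):
        for i in range(len(x)):
            R[j][i] += lr * (h_dream[j] - h_rec[j]) * x_dream[i]
```

The key design point is that each phase's weight updates are purely local delta rules, which is what makes the scheme attractive to adapt for spiking networks trained with local plasticity.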
Unsupervised features extraction from asynchronous silicon retina through Spike-Timing-Dependent Plasticity
TLDR: A spiking neural network capable of performing multilayer unsupervised learning through Spike-Timing-Dependent Plasticity is introduced, which shows exceptional performance at detecting cars passing on a freeway recorded with a dynamic vision sensor.

Learning Cross-Modal Spatial Transformations through Spike Timing-Dependent Plasticity
TLDR: Here it is shown that a network of spiking neurons can learn the coordinate transformation from one frame of reference to another, with connectivity that develops continuously in an unsupervised manner, based only on the correlations available in the environment and with a biologically realistic plasticity mechanism.

Brian: a simulator for spiking neural networks in Python
TLDR: A new simulator for spiking neural networks, written in Python, which will be especially valuable for working on non-standard neuron models not easily covered by existing software, and as an alternative to using Matlab or C for simulations.