Biologically Plausible Sequence Learning with Spiking Neural Networks

  • Zuozhu Liu, Thiparat Chotibut, Christopher J. Hillar, Shaowei Lin
Motivated by the celebrated discrete-time model of nervous activity outlined by McCulloch and Pitts in 1943, we propose a novel continuous-time model, the McCulloch-Pitts network (MPN), for sequence learning in spiking neural networks. Our model has a local learning rule, such that the synaptic weight updates depend only on the information directly accessible by the synapse. By exploiting asymmetry in the connections between binary neurons, we show that MPN can be trained to robustly memorize… 
Hidden Hypergraphs, Error-Correcting Codes, and Critical Learning in Hopfield Networks
This work explores minimum energy flow (MEF) as a scalable convex objective for determining network parameters, catalogs properties of MEF such as biological plausibility, and compares it to classical approaches in the theory of learning.


Matching Recall and Storage in Sequence Learning with Spiking Neural Networks
A generic learning rule is derived that is matched to the neural dynamics by minimizing an upper bound on the Kullback–Leibler divergence from the target distribution to the model distribution and is consistent with spike-timing dependent plasticity.
Learning Precisely Timed Spikes
Long short-term memory and Learning-to-learn in networks of spiking neurons
This work includes neurons in the RSNN model that reproduce a prominent dynamical process of biological neurons operating at the behaviourally relevant time scale of seconds, namely neuronal adaptation, and denotes these networks LSNNs because of their long short-term memory.
Bayesian Computation Emerges in Generic Cortical Microcircuits through Spike-Timing-Dependent Plasticity
The results suggest that the experimentally observed spontaneous activity and trial-to-trial variability of cortical neurons are essential features of their information processing capability, since their functional role is to represent probability distributions rather than static neural codes.
Learning Probabilistic Inference through Spike-Timing-Dependent Plasticity
It is shown that spike-timing-dependent plasticity, in combination with intrinsic plasticity, generates in ensembles of pyramidal cells with lateral inhibition a fundamental building block for probabilistic inference: associations between neurons that represent, through their firing, the current values of random variables.
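For readers unfamiliar with the plasticity rule this line of work builds on, a minimal pair-based STDP weight update can be sketched as follows. This is the standard textbook form, not the specific rule derived in the cited paper; the amplitudes and time constant are illustrative defaults.

```python
import numpy as np

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP weight change for a spike pair.

    dt = t_post - t_pre in milliseconds. A presynaptic spike shortly
    before a postsynaptic spike (dt > 0) potentiates the synapse;
    the reverse ordering depresses it. Both effects decay
    exponentially with the spike-timing interval.
    """
    if dt > 0:
        return a_plus * np.exp(-dt / tau)   # pre-before-post: LTP
    return -a_minus * np.exp(dt / tau)      # post-before-pre: LTD
```

In practice such a rule is applied to every nearby pre/post spike pair, often via exponentially decaying eligibility traces rather than explicit pair enumeration.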
Robust Exponential Memory in Hopfield Networks
This work discovers a new set of low-density error-correcting codes that achieve Shannon's noisy-channel bound and efficiently solve a variant of the hidden-clique problem in computer science, opening new avenues for real-world applications of computational models originating in biology.
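The Hopfield networks underlying this line of work store patterns as fixed points of a recurrent binary network. A minimal sketch of classical Hebbian storage and sign-threshold recall (the textbook construction, not the paper's exponential-capacity variant) looks like this:

```python
import numpy as np

def store(patterns):
    """Hebbian outer-product storage of +/-1 patterns; zero self-coupling."""
    X = np.array(patterns, dtype=float)   # rows are +/-1 pattern vectors
    W = X.T @ X / len(X)
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, x, steps=10):
    """Synchronous sign-threshold updates until a fixed point (or step cap)."""
    x = np.array(x, dtype=float)
    for _ in range(steps):
        nxt = np.sign(W @ x)
        nxt[nxt == 0] = 1.0               # break ties toward +1
        if np.array_equal(nxt, x):
            break                         # reached a fixed point
        x = nxt
    return x
```

Starting recall from a corrupted pattern (a few flipped bits) converges back to the stored pattern when the number of stored patterns is well below capacity, which is the error-correcting behavior the cited work pushes to exponential scale.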
Variational Learning for Recurrent Spiking Networks
A plausible learning rule for feedforward, feedback and lateral connections in a recurrent network of spiking neurons is derived in the context of a generative model for distributions of spike sequences, derived from variational inference principles.
A neuronal learning rule for sub-millisecond temporal coding
A modelling study based on computer simulations of a neuron in the laminar nucleus of the barn owl shows that the necessary degree of coherence in signal arrival times can be attained during ontogenetic development by virtue of an unsupervised Hebbian learning rule.
Towards Biologically Plausible Deep Learning
The theory of the probabilistic interpretation of auto-encoders is extended to justify improved sampling schemes based on the generative interpretation of denoising auto-encoders, and these ideas are validated on generative learning tasks.
Impact of synaptic unreliability on the information transmitted by spiking neurons.
  • A. Zador
  • Biology, Computer Science
    Journal of neurophysiology
  • 1998
This work considers a model in which a population of independent unreliable synapses provides the drive to an integrate-and-fire neuron, and considers two factors that govern the rate of information transfer: the synaptic reliability and the number of synapses connecting each presynaptic axon to its postsynaptic target.
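The model described in this snippet can be sketched compactly: a leaky integrate-and-fire neuron whose input synapses each release with some probability per presynaptic spike. The parameters below (release probability, weight, threshold, time constant) are illustrative assumptions, not values from the paper.

```python
import numpy as np

def lif_unreliable(pre_spikes, p_release=0.5, w=1.5, tau_m=20.0,
                   v_th=10.0, dt=1.0, seed=0):
    """Leaky integrate-and-fire neuron driven by unreliable synapses.

    Each synapse releases (transmits) with probability p_release on
    every presynaptic spike. pre_spikes is a (T, n_syn) binary array
    of presynaptic spike trains; returns the postsynaptic spike train.
    """
    rng = np.random.default_rng(seed)
    v, out = 0.0, []
    for spikes in pre_spikes:
        released = rng.random(len(spikes)) < p_release  # stochastic release
        drive = w * np.sum(spikes * released)           # total synaptic input
        v += dt * (-v / tau_m) + drive                  # leak + input
        if v >= v_th:
            out.append(1)                               # fire and reset
            v = 0.0
        else:
            out.append(0)
    return np.array(out)
```

Sweeping `p_release` and `n_syn` in such a simulation is one way to probe how synaptic reliability and the number of release sites per axon trade off in the transmitted information, which is the question the cited study quantifies.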