Fatiguing STDP: Learning from spike-timing codes in the presence of rate codes

@article{Moraitis2017Fatiguing,
  title={Fatiguing STDP: Learning from spike-timing codes in the presence of rate codes},
  author={Timoleon Moraitis and Abu Sebastian and Irem Boybat and Manuel Le Gallo and Tomas Tuma and Evangelos Eleftheriou},
  journal={2017 International Joint Conference on Neural Networks (IJCNN)},
  year={2017}
}
Spiking neural networks (SNNs) could play a key role in unsupervised machine learning applications, by virtue of strengths related to learning from the fine temporal structure of event-based signals. However, some spike-timing-related strengths of SNNs are hindered by the sensitivity of spike-timing-dependent plasticity (STDP) rules to input spike rates, as fine temporal correlations may be obstructed by coarser correlations between firing rates. In this article, we propose a spike-timing… 
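The truncated abstract points to an STDP variant in which plasticity "fatigues" under sustained presynaptic firing, so that fine spike-timing correlations are not swamped by coarse rate correlations. As a minimal illustrative sketch only (this is not the authors' published rule; the fatigue dynamics and all parameter names here are assumptions), one can gate a standard pair-based STDP update by a fatigue factor that depresses with each presynaptic spike and recovers exponentially between spikes:

```python
import math

def stdp_with_fatigue(pre_times, post_times, w=0.5,
                      a_plus=0.01, a_minus=0.012, tau=20.0,
                      fatigue_step=0.3, tau_fatigue=100.0):
    """Pair-based STDP whose updates are scaled by a presynaptic 'fatigue'
    factor (illustrative sketch; parameters are assumed, times in ms)."""
    f = 1.0               # fatigue factor: 1 = fully recovered, 0 = fully fatigued
    last_pre = None
    events = sorted([(t, 'pre') for t in pre_times] +
                    [(t, 'post') for t in post_times])
    for t, kind in events:
        if kind == 'pre':
            # recover fatigue since the previous presynaptic spike...
            if last_pre is not None:
                f = 1.0 - (1.0 - f) * math.exp(-(t - last_pre) / tau_fatigue)
            # ...then fatigue a little with this spike: high presynaptic
            # rates keep f small, suppressing rate-driven plasticity
            f *= (1.0 - fatigue_step)
            last_pre = t
            # depression from all earlier postsynaptic spikes (all-to-all pairing)
            for tp in post_times:
                if tp < t:
                    w -= f * a_minus * math.exp(-(t - tp) / tau)
        else:
            # potentiation from all earlier presynaptic spikes, gated by fatigue
            for tp in pre_times:
                if tp < t:
                    w += f * a_plus * math.exp(-(t - tp) / tau)
    return max(0.0, min(1.0, w))
```

With this gating, an isolated pre-before-post pair still potentiates, while a dense presynaptic burst contributes progressively less, which is the qualitative behavior the abstract describes.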


Spiking Neural Networks Enable Two-Dimensional Neurons and Unsupervised Multi-Timescale Learning

This work demonstrates that input neurons can be two-dimensional (2D), shows unsupervised learning from multiple timescales simultaneously, and suggests that through these unique features SNNs may increase the performance and broaden the applicability of ANNs.

Neurosymbolic Spike Concept Learner towards Neuromorphic General Intelligence

A technique allowing dynamic formation of synapses (connections) in spiking neural networks, the basis of structural plasticity, is proposed and called the Neurosymbolic Spike-Concept Learner (NS-SCL).

Deep Networks Incorporating Spiking Neural Dynamics

An alternative perspective on the spiking neuron as a particular ANN construct called the Spiking Neural Unit (SNU) is proposed, which provides a systematic methodology for implementing and training deep networks incorporating spiking dynamics that achieve accuracies as high as, or higher than, those of state-of-the-art ANNs.

Deep learning incorporating biologically inspired neural dynamics and in-memory computing

The biologically inspired dynamics of spiking neurons are incorporated into conventional recurrent neural network units and in-memory computing, and it is shown how this allows for accurate and energy-efficient deep learning.

Deep learning incorporating biologically-inspired neural dynamics

The new generation of neural units introduced in this paper incorporate biologically-inspired neural dynamics in deep learning and provide a systematic methodology for training neuromorphic computing hardware, which opens a new avenue for a widespread adoption of SNNs in practical applications.

Deep Spiking Neural Network model for time-variant signals classification: a real-time speech recognition approach

A novel spiking neural network model has been proposed to adapt the network that has been trained with static images to a non-static processing approach, making it possible to classify audio signals and time series in real time.

Unsupervised Learning of Phase-Change-Based Neuromorphic Systems

This dissertation proposes neuromorphic architectures that combine phase-change memristors with biologically inspired synaptic learning rules, and experimentally demonstrates their pattern- and feature-learning capabilities.

Surrogate Gradients Design

It is shown that complex tasks and networks are more sensitive to the choice of surrogate gradient (SG), and a theoretical solution is provided that reduces the need for extensive grid search by identifying SG shapes and initializations that result in improved accuracy.

Building Brain-Inspired Computing Systems: Examining the Role of Nanoscale Devices

Emulating the immense parallelism and event-driven computational architecture in systems with comparable complexity and power budget as the brain, and in real time, remains a formidable challenge.

The Role of Short-Term Plasticity in Neuromorphic Learning: Learning from the Timing of Rate-Varying Events with Fatiguing Spike-Timing-Dependent Plasticity

Neural networks (NNs) have been able to provide record-breaking performance in several machine-learning tasks, such as image and speech recognition, natural-language processing, and playing complex games.

STDP Allows Fast Rate-Modulated Coding with Poisson-Like Spike Trains

STDP provides an appealing mechanism for learning repeating rate-modulated patterns, which, beyond sensory processing, may also be involved in many cognitive tasks; this is shown to be feasible provided that significant covarying rate modulations occur within the typical timescale of STDP.

Learning Input Correlations through Nonlinear Temporally Asymmetric Hebbian Plasticity

It is demonstrated that by adjusting the weight dependence of the synaptic changes in TAH plasticity, it is possible to enhance the synaptic representation of temporal input correlations while maintaining the system in a stable learning regime, and the learning efficiency can be optimized.
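The weight dependence this entry refers to can be illustrated with the standard nonlinear temporally asymmetric Hebbian (TAH) update, in which potentiation scales as (1 - w)^mu and depression as alpha * w^mu, so that mu interpolates between additive (mu = 0) and multiplicative (mu = 1) STDP. A hedged sketch (parameter names follow the common convention for this rule; the values are assumptions):

```python
import math

def tah_update(w, dt, lam=0.005, alpha=1.05, mu=0.5, tau=20.0):
    """Nonlinear TAH update for a single spike pair.
    dt = t_post - t_pre in ms; w is a weight in [0, 1].
    mu = 0 recovers additive STDP, mu = 1 multiplicative STDP."""
    if dt > 0:      # pre before post: potentiation, scaled by (1 - w)^mu
        return w + lam * (1.0 - w) ** mu * math.exp(-dt / tau)
    elif dt < 0:    # post before pre: depression, scaled by alpha * w^mu
        return w - lam * alpha * w ** mu * math.exp(dt / tau)
    return w
```

The weight-dependent factors keep weights bounded in [0, 1] without hard clipping: potentiation vanishes as w approaches 1 and depression vanishes as w approaches 0, which is what stabilizes the learning regime mentioned above.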

Hebbian learning and spiking neurons

A correlation-based ("Hebbian") learning rule at the spike level with millisecond resolution is formulated, mathematically analyzed, and compared with learning in a firing-rate description.

The neural code between neocortical pyramidal neurons depends on neurotransmitter release probability.

  • M. Tsodyks, H. Markram
  • Biology
    Proceedings of the National Academy of Sciences of the United States of America
  • 1997
By setting the rate of synaptic depression, release probability is an important factor in determining the neural code, suggesting that the relative contribution of rate and temporal signals varies along a continuum.

Short Term Synaptic Depression Imposes a Frequency Dependent Filter on Synaptic Information Transfer

This study provides strong evidence that the stochastic nature of neurotransmitter vesicle dynamics must be considered when analyzing the information flow across a synapse.

Stochastic phase-change neurons.

This work shows that chalcogenide-based phase-change materials can be used to create an artificial neuron in which the membrane potential is represented by the phase configuration of the nanoscale phase-change device and shows that the temporal integration of postsynaptic potentials can be achieved on a nanosecond timescale.

Networks of Spiking Neurons: The Third Generation of Neural Network Models

  • W. Maass
  • Computer Science
    Electron. Colloquium Comput. Complex.
  • 1996

A reconfigurable on-line learning spiking neuromorphic processor comprising 256 neurons and 128K synapses

This paper presents a full-custom mixed-signal VLSI device with neuromorphic learning circuits that emulate the biophysics of real spiking neurons and dynamic synapses for exploring the properties of computational neuroscience models and for building brain-inspired computing systems.

Detecting Correlations Using Phase-Change Neurons and Synapses

It is demonstrated that the internal states of the neuron and of the synapses can be efficiently stored in nanoscale phase-change memory devices and show computations with collocated storage in an experimental setting.

Memristors with diffusive dynamics as synaptic emulators for neuromorphic computing.

The diffusive Ag-in-oxide memristor and its dynamics enable a direct emulation of both short- and long-term plasticity of biological synapses, representing an advance in hardware implementation of neuromorphic functionalities.