Publications
Event-driven contrastive divergence for spiking neuromorphic systems
TLDR: We present an event-driven variation of Contrastive Divergence to train an RBM constructed with Integrate & Fire (I&F) neurons, constrained by the limitations of existing and near-future neuromorphic hardware platforms.
173 citations (18 highly influential) · PDF available
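For reference, the classical Contrastive Divergence (CD-1) rule is what the paper adapts to event-driven spiking neurons. Below is a minimal NumPy sketch of CD-1 on a binary RBM, not the event-driven variant itself; all names and sizes are illustrative.

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def cd1_step(W, b_v, b_h, v0, lr=0.01):
        """One CD-1 update: positive phase on the data, negative
        phase on a one-step Gibbs reconstruction."""
        # Positive phase: hidden activations driven by the data.
        ph0 = sigmoid(v0 @ W + b_h)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        # Negative phase: reconstruct visibles, then resample hiddens.
        pv1 = sigmoid(h0 @ W.T + b_v)
        v1 = (rng.random(pv1.shape) < pv1).astype(float)
        ph1 = sigmoid(v1 @ W + b_h)
        # Contrastive Divergence: <v h>_data minus <v h>_recon.
        W += lr * (v0.T @ ph0 - v1.T @ ph1) / len(v0)
        b_v += lr * (v0 - v1).mean(axis=0)
        b_h += lr * (ph0 - ph1).mean(axis=0)

    # Toy usage: 6 visible units, 4 hidden units, random binary data.
    W = 0.01 * rng.standard_normal((6, 4))
    b_v, b_h = np.zeros(6), np.zeros(4)
    data = (rng.random((32, 6)) < 0.5).astype(float)
    for _ in range(100):
        cd1_step(W, b_v, b_h, data)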
Conversion of artificial recurrent neural networks to spiking neural networks for low-power neuromorphic hardware
TLDR: In recent years, the field of neuromorphic low-power systems has gained significant momentum, spurring brain-inspired hardware systems that operate on principles fundamentally different from those of standard digital computers and thereby consume orders of magnitude less power.
102 citations (6 highly influential) · PDF available
Surrogate Gradient Learning in Spiking Neural Networks
TLDR: Spiking neural networks are nature's versatile solution to fault-tolerant, energy-efficient signal processing.
77 citations (6 highly influential) · PDF available
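The idea named in the title: a spiking neuron's output is a hard threshold whose derivative is zero almost everywhere, so training substitutes a smooth surrogate derivative on the backward pass. A minimal NumPy sketch using a fast-sigmoid surrogate; the `beta` scale and the toy error term are illustrative assumptions.

    import numpy as np

    def spike_forward(v, threshold=1.0):
        """Forward pass: a hard threshold emits a binary spike."""
        return (v >= threshold).astype(float)

    def spike_surrogate_grad(v, threshold=1.0, beta=10.0):
        """Backward pass: the step's derivative is zero a.e., so we
        substitute the derivative of a fast sigmoid at the threshold."""
        return 1.0 / (beta * np.abs(v - threshold) + 1.0) ** 2

    # One-layer example: dL/dW through the spike nonlinearity.
    rng = np.random.default_rng(0)
    x = rng.random(5)                        # presynaptic activity
    W = rng.standard_normal((3, 5))
    v = W @ x                                # membrane potentials
    s = spike_forward(v)                     # spikes (non-differentiable)
    dL_ds = s - np.array([1., 0., 1.])       # some upstream error signal
    dL_dv = dL_ds * spike_surrogate_grad(v)  # surrogate replaces dH/dv
    dL_dW = np.outer(dL_dv, x)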
Event-Driven Random Back-Propagation: Enabling Neuromorphic Deep Learning Machines
TLDR: We demonstrate an event-driven random backpropagation (eRBP) rule that uses error-modulated synaptic plasticity for learning deep representations.
125 citations (5 highly influential) · PDF available
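A minimal sketch of the random-backpropagation idea behind eRBP: the output error reaches hidden synapses through a fixed random matrix rather than the transpose of the forward weights, and modulates a local update. This is a rate-level abstraction with illustrative names, not the paper's exact spiking rule.

    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_hid, n_out = 8, 16, 4
    W1 = 0.1 * rng.standard_normal((n_hid, n_in))
    W2 = 0.1 * rng.standard_normal((n_out, n_hid))
    B = rng.standard_normal((n_hid, n_out))  # fixed random feedback, never learned

    lr = 0.01
    x = rng.random(n_in)                     # input activity (rate abstraction)
    target = np.eye(n_out)[0]                # one-hot target

    for _ in range(200):
        h = np.tanh(W1 @ x)                  # hidden layer
        y = W2 @ h                           # output layer
        err = y - target                     # error neurons encode this difference
        # Key idea: the error is broadcast through the fixed random
        # matrix B, not through W2.T as in exact backpropagation.
        d_hid = (B @ err) * (1.0 - h ** 2)
        W2 -= lr * np.outer(err, h)          # local, error-modulated updates
        W1 -= lr * np.outer(d_hid, x)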
Synaptic Plasticity Dynamics for Deep Continuous Local Learning (DECOLLE)
TLDR: We introduce Deep Continuous Local Learning (DECOLLE), a spiking neural network equipped with local error functions for online learning with no memory overhead for computing gradients.
38 citations (5 highly influential) · PDF available
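The memory saving in DECOLLE comes from attaching a local readout and loss to every layer, so each layer trains from its own instantaneous error instead of a gradient propagated across layers or through time. A rate-level NumPy sketch of the layer-local loss idea; the fixed random readouts and all names are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    sizes = [10, 20, 20]          # input -> layer 1 -> layer 2
    n_out = 3
    Ws = [0.1 * rng.standard_normal((sizes[i + 1], sizes[i])) for i in range(2)]
    # Each layer gets a fixed random readout used only to form a local loss.
    Gs = [rng.standard_normal((n_out, sizes[i + 1])) for i in range(2)]

    lr = 0.01
    x = rng.random(sizes[0])
    target = np.eye(n_out)[1]

    for _ in range(200):
        a = x
        for W, G in zip(Ws, Gs):
            pre = a
            a = np.tanh(W @ pre)        # layer activity
            y_local = G @ a             # layer-local readout (G stays frozen)
            err = y_local - target      # layer-local error
            # Gradient of the local loss w.r.t. this layer's W only;
            # nothing is backpropagated across layers, so no stored
            # intermediate activations are needed.
            d = (G.T @ err) * (1.0 - a ** 2)
            W -= lr * np.outer(d, pre)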
A 65k-neuron 73-Mevents/s 22-pJ/event asynchronous micro-pipelined integrate-and-fire array transceiver
TLDR: We present a 65k-neuron integrate-and-fire array transceiver (IFAT) for spike-based neural computation with low-power, high-throughput connectivity.
60 citations (4 highly influential)
Contrastive Hebbian Learning with Random Feedback Weights
TLDR: Neural networks are commonly trained to make predictions through learning algorithms.
9 citations (3 highly influential) · PDF available
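For context, contrastive Hebbian learning updates weights from the difference between a free phase (output unclamped) and a clamped phase (output held at the target); per the title, the paper studies replacing symmetric feedback connections with fixed random ones. A minimal sketch of that two-phase update under this reading, with illustrative relaxation dynamics and constants.

    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_hid, n_out = 5, 8, 2
    W1 = 0.1 * rng.standard_normal((n_hid, n_in))
    W2 = 0.1 * rng.standard_normal((n_out, n_hid))
    R = 0.1 * rng.standard_normal((n_hid, n_out))  # fixed random feedback
                                                   # (symmetric CHL uses W2.T)

    def settle(x, clamp=None, steps=20, gamma=0.3):
        """Relax hidden/output activities toward a fixed point; in the
        clamped phase the output is held at the target."""
        h = np.zeros(n_hid)
        y = np.zeros(n_out)
        for _ in range(steps):
            y = clamp if clamp is not None else np.tanh(W2 @ h)
            h = np.tanh(W1 @ x + gamma * (R @ y))  # feedback enters via R
        return h, y

    lr = 0.05
    x = rng.random(n_in)
    target = np.array([1.0, -1.0])
    for _ in range(100):
        h_f, y_f = settle(x)                 # free phase
        h_c, y_c = settle(x, clamp=target)   # clamped phase
        # Contrastive Hebbian update: clamped minus free correlations.
        W2 += lr * (np.outer(y_c, h_c) - np.outer(y_f, h_f))
        W1 += lr * (np.outer(h_c, x) - np.outer(h_f, x))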
Stochastic Synapses Enable Efficient Brain-Inspired Learning Machines
TLDR: We introduce Synaptic Sampling Machines (S2Ms), a class of neural network models that uses synaptic stochasticity as a means of Monte Carlo sampling and unsupervised learning.
78 citations (2 highly influential) · PDF available
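The S2M idea in the abstract: each synapse transmits stochastically ("blank-out" noise), so repeated forward passes are Monte Carlo samples of the postsynaptic drive. A minimal sketch of such a stochastic synaptic pass; the transmission probability `p` and the mean-preserving rescaling are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    def stochastic_synapse_pass(W, x, p=0.5):
        """Propagate activity through stochastic blank-out synapses:
        each synapse independently transmits with probability p, so
        repeated passes sample the distribution of postsynaptic drive."""
        mask = rng.random(W.shape) < p     # per-synapse Bernoulli gate
        return (W * mask) @ x / p          # rescale to preserve the mean

    W = rng.standard_normal((4, 10))
    x = rng.random(10)
    samples = np.stack([stochastic_synapse_pass(W, x) for _ in range(1000)])
    print(samples.mean(axis=0))            # approaches the dense drive W @ x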
Event-driven random backpropagation: Enabling neuromorphic deep learning machines
TLDR: An event-driven random backpropagation (eRBP) rule that uses error-modulated synaptic plasticity to learn deep representations in neuromorphic computing hardware.
43 citations (2 highly influential) · PDF available
Forward table-based presynaptic event-triggered spike-timing-dependent plasticity
TLDR: We present a novel method for realizing both causal and acausal synaptic weight updates using only forward lookup access of the synaptic connectivity table, permitting memory-efficient implementation.
10 citations (2 highly influential) · PDF available
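The point of the method is that both STDP branches can be serviced from the forward (presynaptic) connectivity table alone, by deferring the causal update to the next presynaptic spike instead of walking a reverse table at postsynaptic spike time. A sketch under that reading; the table layout, constants, and one-spike history are illustrative simplifications.

    import numpy as np

    # Forward connectivity table only: pre neuron -> list of (post, weight index).
    fwd_table = {0: [(0, 0), (1, 1)], 1: [(1, 2)]}
    w = np.array([0.5, 0.5, 0.5])
    A_plus, A_minus, tau = 0.01, 0.012, 20.0   # illustrative STDP constants
    last_pre = np.full(2, -np.inf)    # last spike time of each pre neuron
    last_post = np.full(2, -np.inf)   # last spike time of each post neuron

    def on_pre_spike(i, t):
        """All plasticity is triggered here, via forward lookup only."""
        for post, wi in fwd_table[i]:
            # Causal branch (post fired after the previous pre spike):
            # deferred potentiation, applied now that pre spikes again.
            if last_pre[i] < last_post[post] < t:
                w[wi] += A_plus * np.exp(-(last_post[post] - last_pre[i]) / tau)
            # Acausal branch (post fired just before this pre spike): depression.
            w[wi] -= A_minus * np.exp(-(t - last_post[post]) / tau)
        last_pre[i] = t

    def on_post_spike(j, t):
        """Postsynaptic spikes only record their time; no table walk needed."""
        last_post[j] = t

    # Toy event stream: (time, kind, neuron index).
    for t, kind, idx in [(1., 'pre', 0), (5., 'post', 1), (30., 'pre', 0)]:
        on_pre_spike(idx, t) if kind == 'pre' else on_post_spike(idx, t)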