Learn More
The brain is highly efficient in how it processes information and tolerates faults. Arguably, the basic processing units are neurons and synapses that are interconnected in a complex pattern. Computer scientists and engineers aim to harness this efficiency and build artificial neural systems that can emulate the key information processing principles of the brain …
This paper presents a Spiking Neural Network (SNN) architecture for mobile robot navigation. The SNN contains four layers in which dynamic synapses route information to the appropriate neurons in each layer, and the neurons are modeled using the Leaky Integrate-and-Fire (LIF) model. The SNN learns by self-organizing its connectivity as new environmental conditions …
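As a point of reference for the LIF model named in that abstract, the sketch below shows the standard leaky integrate-and-fire update in Python. The function name and all parameter values (membrane time constant, resting/threshold/reset potentials, input resistance) are illustrative assumptions, not values taken from the paper.

```python
def lif_step(v, i_syn, dt=1.0, tau_m=20.0, v_rest=-65.0,
             v_thresh=-50.0, v_reset=-65.0, r_m=10.0):
    """One Euler step of a leaky integrate-and-fire neuron.

    The membrane potential decays toward v_rest, is driven by the synaptic
    current i_syn, and is reset after crossing the firing threshold.
    All constants here are illustrative, not from the cited paper.
    """
    dv = (-(v - v_rest) + r_m * i_syn) / tau_m
    v = v + dt * dv
    spiked = v >= v_thresh
    if spiked:
        v = v_reset
    return v, spiked

# Example: drive the neuron with a constant input current and count spikes.
v, spike_count = -65.0, 0
for _ in range(200):
    v, spiked = lif_step(v, i_syn=2.0)
    spike_count += spiked
```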
This paper proposes a supervised training algorithm for Spiking Neural Networks (SNNs) which modifies the Spike Timing Dependent Plasticity (STDP) learning rule to support both local and network-level training with multiple synaptic connections and axonal delays. The training algorithm applies the rule to two- and three-layer SNNs, and is benchmarked using …
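For context on the STDP rule that the training algorithm modifies, a common pair-based form scales potentiation and depression exponentially with the pre/post spike-time difference. The sketch below shows only that textbook rule; the paper's modified rule with multiple synaptic connections and axonal delays is not reproduced here, and the amplitudes and time constants are illustrative.

```python
import math

def stdp_delta_w(t_pre, t_post, a_plus=0.01, a_minus=0.012,
                 tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP: potentiate when the presynaptic spike precedes the
    postsynaptic one, depress when it follows. Constants are illustrative."""
    dt = t_post - t_pre
    if dt > 0:
        return a_plus * math.exp(-dt / tau_plus)    # pre before post: LTP
    if dt < 0:
        return -a_minus * math.exp(dt / tau_minus)  # post before pre: LTD
    return 0.0
```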
This paper presents a synaptic weight association training (SWAT) algorithm for spiking neural networks (SNNs). SWAT merges the Bienenstock-Cooper-Munro (BCM) learning rule with spike timing dependent plasticity (STDP). The STDP/BCM rule yields a unimodal weight distribution in which the height of the plasticity window associated with STDP is modulated, causing …
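To illustrate the idea of modulating the STDP window height with a BCM-style term, here is a rough sketch assuming the classic BCM activation function with a sliding threshold on postsynaptic activity. The exact SWAT formulation in the paper may differ, and every name and constant below is an assumption chosen for illustration.

```python
import math

def bcm_scaled_stdp(dt, post_rate, theta, a_max=0.01, tau=20.0):
    """Sketch: scale the STDP window height by a BCM term, so potentiation
    dominates when postsynaptic activity exceeds the sliding threshold theta
    and depression dominates below it. Not the exact SWAT rule."""
    bcm_factor = post_rate * (post_rate - theta)  # classic BCM activation term
    height = a_max * bcm_factor                   # modulated window height
    return height * math.exp(-abs(dt) / tau)      # exponential STDP window
```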
FPGA devices have emerged as a popular platform for the rapid prototyping of biological Spiking Neural Network (SNN) applications, offering the key requirement of reconfigurability. However, FPGAs do not efficiently realise the biologically plausible neuron and synaptic models of SNNs, and current FPGA routing structures …