Corpus ID: 10545728

Noisy Spiking Neurons with Temporal Coding have more Computational Power than Sigmoidal Neurons

@inproceedings{Maass1996NoisySN,
  title={Noisy Spiking Neurons with Temporal Coding have more Computational Power than Sigmoidal Neurons},
  author={Wolfgang Maass},
  booktitle={NIPS},
  year={1996}
}
  • W. Maass
  • Published in NIPS 1996
  • Computer Science
We exhibit a novel way of simulating sigmoidal neural nets by networks of noisy spiking neurons in temporal coding. Furthermore it is shown that networks of noisy spiking neurons with temporal coding have a strictly larger computational power than sigmoidal neural nets with the same number of units. 
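The claim rests on temporal coding: an analog value is represented by how early a neuron fires, and an integrate-to-threshold unit whose EPSPs rise roughly linearly then fires at a time that is an affine function of its weighted input times. The following Python toy is my own illustration of that encoding idea, not the construction from the paper, and it ignores noise; all constants and names are made up.

```python
# Toy illustration of temporal coding (a sketch, not the paper's construction):
# analog values in [0, 1] are encoded as spike times, and a noise-free
# integrate-to-threshold unit with linearly rising EPSPs fires at a time that
# is an affine function of the weighted input values.

T_REF = 10.0  # reference time: value 0.0 spikes at t = 10.0, value 1.0 at t = 9.0

def encode(x):
    """Temporal code: larger value -> earlier spike."""
    return T_REF - x

def firing_time(input_times, weights, threshold, dt=1e-3, t_max=30.0):
    """First time the summed, linearly rising EPSPs cross the threshold."""
    t = 0.0
    while t < t_max:
        potential = sum(w * max(0.0, t - ti) for w, ti in zip(weights, input_times))
        if potential >= threshold:
            return t
        t += dt
    return None  # no spike

# Once all inputs have arrived, the potential is sum_i w_i * (t - t_i), so the
# unit fires at t_out = (threshold + sum_i w_i * t_i) / sum_i w_i: an affine
# function of the input spike times, hence of the encoded values.
xs = [0.2, 0.7, 0.5]
ws = [1.0, 1.0, 1.0]
theta = 6.0
t_out = firing_time([encode(x) for x in xs], ws, theta)
decoded = T_REF + theta / sum(ws) - t_out  # recovers the normalized weighted sum
print(f"output spike at t = {t_out:.3f}, decoded value = {decoded:.3f}")
print(f"weighted average of inputs = {sum(w * x for w, x in zip(ws, xs)) / sum(ws):.3f}")
```

Decoding the output firing time against the same reference recovers a normalized weighted sum of the inputs, i.e. the kind of analog gate that a sigmoidal unit computes.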
Spiking neural network for control chart pattern recognition
Temporal coding spiking neural networks are receiving wider attention due to their computational power. The coincidence detection property of a spiking neuron, which has no counterpart in a sigmoidal …
Backpropagation for Population-Temporal Coded Spiking Neural Networks
A new learning rule for spiking neurons is presented that uses the general population-temporal coding model; inspired by learning rules for locally recurrent analog neural networks, it is able to operate on a wide class of decoding schemes.
Digital spiking neuron cells for real-time reconfigurable learning networks
Experimental results indicate that the proposed real-time data-flow learning network architecture supports over 2800 biophysically accurate neurons (depending on model complexity) in a single FPGA device.
Temporal pattern recognition using spiking neural networks for cortical neuronal spike train decoding
Spiking neural networks (SNNs), which propagate information by the timing of spikes, are used to analyze cortical neural spike trains directly, without loss of temporal information, demonstrating that temporal coding is a viable code for fast neural information processing.
Error-backpropagation in temporally encoded networks of spiking neurons
It is demonstrated that temporal coding requires significantly fewer neurons than instantaneous rate coding, and a supervised learning rule, SpikeProp, akin to traditional error backpropagation, is derived.
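For neurons constrained to fire once, the firing time is an implicit, differentiable function of the weights, which is what makes an error-backpropagation rule of this kind possible. The Python sketch below is my own single-neuron illustration of that idea via the implicit-function gradient of the firing time; it is not the exact SpikeProp rule from the paper, and all constants are arbitrary.

```python
import math

# Single-neuron sketch of timing-based gradient learning (inspired by, but not
# identical to, the SpikeProp rule in the cited paper; constants are arbitrary).

TAU = 3.0     # PSP time constant
THETA = 1.0   # firing threshold

def psp(s):
    """Alpha-shaped postsynaptic potential kernel (zero before the input spike)."""
    return (s / TAU) * math.exp(1.0 - s / TAU) if s > 0 else 0.0

def dpsp(s):
    """Time derivative of the PSP kernel."""
    return (1.0 / TAU) * math.exp(1.0 - s / TAU) * (1.0 - s / TAU) if s > 0 else 0.0

def firing_time(weights, in_times, dt=1e-3, t_max=50.0):
    """First time the summed PSPs cross the threshold (None if no spike)."""
    t = 0.0
    while t < t_max:
        v = sum(w * psp(t - ti) for w, ti in zip(weights, in_times))
        if v >= THETA:
            return t
        t += dt
    return None

def train(weights, in_times, t_target, lr=0.05, epochs=200):
    """Gradient descent on the squared timing error (t_out - t_target)^2."""
    for _ in range(epochs):
        t_out = firing_time(weights, in_times)
        if t_out is None:                       # no spike: nudge weights upward
            weights = [w + lr for w in weights]
            continue
        # Implicit-function gradient of the firing time w.r.t. each weight:
        #   dt_out/dw_i = -psp(t_out - t_i) / sum_j [w_j * dpsp(t_out - t_j)]
        dv_dt = sum(w * dpsp(t_out - ti) for w, ti in zip(weights, in_times))
        err = t_out - t_target
        weights = [w - lr * err * (-psp(t_out - ti) / dv_dt)
                   for w, ti in zip(weights, in_times)]
    return weights, firing_time(weights, in_times)

weights, t_out = train([0.5, 0.5, 0.5], in_times=[0.0, 1.0, 2.0], t_target=3.0)
print(f"learned firing time: {t_out:.3f} (target 3.0)")
```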
Investigating the computational power of spiking neurons with non-standard behaviors
  • S. Kampakis
  • Computer Science, Medicine
  • Neural Networks
  • 2013
The computational power of neurons with different behaviors is studied, building on previous analyses by Maass and Schmitt; the studied behaviors are rebound spiking, resonance, and bursting.
Improving SpikeProp: Enhancements to An Error-Backpropagation Rule for Spiking Neural Networks
These enhancements to the SpikeProp learning algorithm provide additional learning rules for the synaptic delays, time constants, and neurons' thresholds, resulting in smaller network topologies.
An extended model for a spiking neuron class
This paper proposes an extension to the model of a spiking neuron for information processing in artificial neural networks, developing a new approach for the dynamic threshold of the …
An STDP Training Algorithm for a Spiking Neural Network with Dynamic Threshold Neurons
A supervised training algorithm is proposed which modifies the Spike Timing Dependent Plasticity (STDP) learning rule to support both local and network-level training with multiple synaptic connections and axonal delays for spiking neural networks.
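For context, the pair-based STDP window that such training algorithms typically build on looks like the sketch below; this is the generic textbook form with illustrative constants, not the modified rule proposed in the paper.

```python
import math

# Generic pair-based STDP window: causal pairs (pre before post) potentiate,
# anti-causal pairs depress, both decaying exponentially with the time lag.

A_PLUS, A_MINUS = 0.01, 0.012     # illustrative amplitudes
TAU_PLUS, TAU_MINUS = 20.0, 20.0  # time constants in ms

def stdp_dw(t_pre, t_post):
    dt = t_post - t_pre
    if dt > 0:
        return A_PLUS * math.exp(-dt / TAU_PLUS)    # potentiation
    elif dt < 0:
        return -A_MINUS * math.exp(dt / TAU_MINUS)  # depression
    return 0.0

for lag in (-40, -10, -1, 1, 10, 40):  # t_post - t_pre in ms
    print(f"dt = {lag:+4d} ms -> dw = {stdp_dw(0.0, lag):+.5f}")
```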
A gradient descent rule for spiking neurons emitting multiple spikes
A supervised learning rule for Spiking Neural Networks (SNNs) is presented that can cope with neurons that spike multiple times and is successfully tested on a classification task of Poisson spike trains.

References

Showing 1-10 of 12 references
Reliability and information transmission in spiking neurons
The results of these studies show that neural coding and computation in several systems approach fundamental physical and information-theoretic limits to performance.
Fast Sigmoidal Networks via Spiking Neurons
  • W. Maass
  • Computer Science, Medicine
  • Neural Computation
  • 1997
It is shown that networks of relatively realistic mathematical models of biological neurons can in principle simulate arbitrary feedforward sigmoidal neural nets in a way that has not previously been considered, and that they are universal approximators in the sense that, with regard to temporal coding, they can approximate any given continuous function of several variables.
Networks of spiking neurons: the third generation of neural network models
It is shown that networks of spiking neurons are, with regard to the number of neurons that are needed, computationally more powerful than other neural network models based on McCulloch-Pitts neurons and sigmoidal gates.
On the Computational Power of Noisy Spiking Neurons
  • W. Maass
  • Mathematics, Computer Science
  • NIPS
  • 1995
These constructions provide a possible explanation for the fact that biological neural systems can carry out quite complex computations within 100 msec, and it turns out that the assumptions these constructions require about the shape of the EPSPs and the behaviour of the noise are surprisingly weak.
Networks of Spiking Neurons: The Third Generation of Neural Network Models
  • W. Maass
  • Computer Science
  • Electron. Colloquium Comput. Complex.
  • 1996
It is shown that networks of spiking neurons are, with regard to the number of neurons that are needed, computationally more powerful than other neural network models based on McCulloch-Pitts neurons or sigmoidal gates.
Temporal Precision of Spike Trains in Extrastriate Cortex of the Behaving Macaque Monkey
Single neurons recorded in a previous study from cortical area MT in the behaving monkey respond to dynamic and unpredictable motion stimuli with a markedly reproducible temporal modulation that is precise to a few milliseconds.
Pattern recognition computation using action potential timing for stimulus representation
A computational model is described in which the sizes of variables are represented by the explicit times at which action potentials occur, rather than by the more usual 'firing rate' of neurons. The …
Multilayer Feedforward Networks with a Non-Polynomial Activation Function Can Approximate Any Function
It is shown that a standard multilayer feedforward network can approximate any continuous function to any degree of accuracy if and only if the network's activation functions are not polynomial.
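As a small numerical illustration of that statement (my own, not from the paper): a one-hidden-layer network with a non-polynomial activation such as tanh, randomly chosen hidden weights, and a least-squares fit of only the output layer typically drives the approximation error of a nonsmooth target down as the hidden layer grows.

```python
import numpy as np

# Random-feature illustration of universal approximation with a non-polynomial
# activation (tanh): only the output weights are fit, by least squares.

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 400)[:, None]
y = np.abs(x).ravel()                               # target function f(x) = |x|

for n_hidden in (5, 20, 100):
    W = rng.normal(scale=5.0, size=(1, n_hidden))   # random input weights
    b = rng.uniform(-5.0, 5.0, size=n_hidden)       # random biases
    H = np.tanh(x @ W + b)                          # hidden activations
    coef, *_ = np.linalg.lstsq(H, y, rcond=None)    # fit output weights
    err = np.max(np.abs(H @ coef - y))
    print(f"{n_hidden:4d} hidden units -> max |error| = {err:.4f}")
```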
VC dimension in circuit complexity
  • P. Koiran
  • Computer Science, Mathematics
  • Proceedings of Computational Complexity (Formerly Structure in Complexity Theory)
  • 1996
This gives the first lower bound for the computation model of sigmoidal circuits with unbounded weights, together with upper and lower bounds for the same function in a few other computation models: circuits of AND/OR gates, threshold circuits, and circuits of piecewise-rational gates.
Real-time performance of a movement-sensitive neuron in the blowfly visual system