
Noisy Softplus: an activation function that enables SNNs to be trained as ANNs

@article{Liu2017NoisySA,
  title={Noisy Softplus: an activation function that enables SNNs to be trained as ANNs},
  author={Qian Liu and Yunhua Chen and Stephen B. Furber},
  journal={ArXiv},
  year={2017},
  volume={abs/1706.03609}
}
We extend earlier work on the proposed activation function, Noisy Softplus, to the training of multi-layer spiking neural networks (SNNs). As a result, any ANN that employs Noisy Softplus neurons, even one with a deep architecture, can be trained with a traditional algorithm such as back-propagation (BP), and the trained weights can be used directly in the spiking version of the same network without any conversion. Furthermore, the training method can be generalised to other activation units, for…
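The claim that trained weights transfer to the spiking network without conversion rests on the activation closely matching the firing-rate response of a leaky integrate-and-fire (LIF) neuron driven by a noisy input current. As a rough illustration (not the authors' reference code), below is a minimal sketch assuming the commonly cited Noisy Softplus form f(x, sigma) = k * sigma * log(1 + exp(x / (k * sigma))); the function name, default k, and example values are our own placeholders.

    import numpy as np

    def noisy_softplus(x, sigma, k=0.2):
        """Noisy Softplus: f(x, sigma) = k * sigma * log(1 + exp(x / (k * sigma))).

        x     : mean of the neuron's input current
        sigma : noise level (standard deviation) of the input current; assumed > 0
        k     : scaling parameter fitted to the LIF response curve
                (0.2 is a placeholder, not a value taken from the paper)
        """
        s = k * sigma
        # np.logaddexp(0, z) computes log(1 + exp(z)) without overflow for large z
        return s * np.logaddexp(0.0, x / s)

    # Example: as sigma shrinks the curve sharpens toward ReLU-like behaviour,
    # while larger sigma gives a smoother, more linear response
    currents = np.linspace(-2.0, 2.0, 5)
    print(noisy_softplus(currents, sigma=0.5))
    print(noisy_softplus(currents, sigma=2.0))

In the spiking setting, sigma corresponds to the noise of the neuron's synaptic input current, so in practice it would be estimated from input statistics rather than fixed by hand; we note this as an assumption consistent with the firing-rate interpretation above, not as a detail confirmed by the abstract.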