Evolving recurrent neural networks are super-Turing

@article{cabessa2011evolving,
  title={Evolving recurrent neural networks are super-Turing},
  author={J{\'e}r{\'e}mie Cabessa and Hava T. Siegelmann},
  journal={The 2011 International Joint Conference on Neural Networks},
  year={2011}
}
The computational power of recurrent neural networks is intimately related to the nature of their synaptic weights. In particular, neural networks with static rational weights are known to be Turing equivalent, and recurrent networks with static real weights were proved to be super-Turing. Here, we study the computational power of a more biologically-oriented model where the synaptic weights can evolve rather than stay static. We prove that such evolving networks gain a super-Turing…
This paper has 31 citations.


Publications citing this paper.
Showing 1-10 of 17 extracted citations

Unconventional Computation and Natural Computation

Lecture Notes in Computer Science • 2015

Emulation of finite state automata with networks of synfire rings

2017 International Joint Conference on Neural Networks (IJCNN) • 2017

Quantum information processing with superconducting circuits: a review.

Reports on Progress in Physics • 2017

Algorithms simulating natural phenomena and hypercomputation

2016 IEEE Students' Conference on Electrical, Electronics and Computer Science (SCEECS) • 2016


Publications referenced by this paper.
Showing 1-8 of 8 references

On the computational power of neural nets

J. Comput. Syst. Sci., vol. 50, no. 1, pp. 132–150 • 1995

Synaptic plasticity: taming the beast

Nature Neuroscience • 2000

Introduction to Complexity Theory: Lecture notes

O. Goldreich
Unpublished lecture notes • 1999

Neural Networks and Analog Computation

Progress in Theoretical Computer Science • 1999
