The Computational Power of Interactive Recurrent Neural Networks

  • Jérémie Cabessa, Hava T. Siegelmann
  • Neural Computation
In classical computation, rational- and real-weighted recurrent neural networks were shown to be respectively equivalent to and strictly more powerful than the standard Turing machine model. It follows from these results that interactive real-weighted neural networks can perform uncountably many more translations of information than interactive Turing machines, and are hence capable of super-Turing computation.

Recurrent Neural Networks and Super-Turing Interactive Computation

The results show that the computational powers of neural nets involved in a classical or in an interactive computational framework follow similar patterns of characterization, and suggest that some intrinsic computational capabilities of the brain might lie beyond the scope of Turing-equivalent models of computation.

The Super-Turing Computational Power of Interactive Evolving Recurrent Neural Networks

It is proved that the so-called interactive evolving recurrent neural networks are computationally equivalent to interactive Turing machines with advice, and hence capable of super-Turing potentialities; a precise characterisation of the ω-translations realised by these networks is also provided.

Interactive Evolving Recurrent Neural Networks Are Super-turing

This work shows that interactive evolving recurrent neural networks are not only super-Turing, but also capable of simulating any other possible interactive deterministic system, irrespective of whether their synaptic weights are rational or real.

Recurrent Neural Networks - A Natural Model of Computation beyond the Turing Limits

It is shown that recurrent neural networks provide a suitable and natural model of computation beyond the Turing limits; no hasty conclusion is drawn about the controversial issue of a possible predominance of biological intelligence over the potentialities of artificial intelligence.

On Super-Turing Neural Computation

A historical survey of the most significant results concerning the computational power of neural models is presented, together with the recent results by Cabessa, Siegelmann, and Villa revealing the super-Turing computational potentialities of interactive and evolving recurrent neural networks.

Computational capabilities of recurrent neural networks based on their attractor dynamics

It is shown that rational-weighted neural networks are computationally equivalent to deterministic Muller Turing machines, whereas all other models of real-weighted or evolving neural networks are equivalent to each other, and strictly more powerful than deterministic Muller Turing machines.

Expressive power of first-order recurrent neural networks determined by their attractor dynamics

The Super-Turing Computational Power of plastic Recurrent Neural Networks

The results support the claim that the general mechanism of plasticity is crucially involved in the computational and dynamical capabilities of biological neural networks and show that the super-Turing level of computation reflects in a suitable way the capabilities of brain-like models of computation.

Expressive Power of Non-deterministic Evolving Recurrent Neural Networks in Terms of Their Attractor Dynamics

It is proved that the two models of rational-weighted and real-weighted nondeterministic hybrid neural networks are computationally equivalent, and recognize precisely the set of all analytic neural ω-languages.


Analog computation via neural networks

The authors pursue a particular approach to analog computation, based on dynamical systems of the type used in neural networks research, which exhibit at least some robustness with respect to noise and implementation errors.

On the Computational Power of Neural Nets

It is proved that one may simulate all Turing Machines by rational nets in linear time, and there is a net made up of about 1,000 processors which computes a universal partial-recursive function.
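The flavour of such a rational-weight simulation can be sketched in a few lines. This is an illustrative toy, not the authors' actual construction with roughly 1,000 processors; the names (`sigma`, `step`) and the stack-encoding weights are chosen here purely for illustration. The key ingredients are exact rational arithmetic and the saturated-linear activation used in this line of work:

```python
# Hedged sketch: one synchronous update of a rational-weighted recurrent net
# with the saturated-linear activation used in Turing-simulation results.
# (Illustrative only; not the paper's universal-net construction.)
from fractions import Fraction

def sigma(x):
    """Saturated-linear activation: clip to [0, 1], exact over rationals."""
    return max(Fraction(0), min(Fraction(1), x))

def step(state, inputs, W, U, b):
    """One update x' = sigma(W x + U u + b), all weights rational."""
    n = len(state)
    return [
        sigma(sum(W[i][j] * state[j] for j in range(n))
              + sum(U[i][k] * inputs[k] for k in range(len(inputs)))
              + b[i])
        for i in range(n)
    ]

# Toy example: a single neuron that pushes an input bit b onto a base-4
# "stack" encoding, a standard trick in such simulations: s' = s/4 + (2b+1)/4.
W = [[Fraction(1, 4)]]
U = [[Fraction(2, 4)]]
b = [Fraction(1, 4)]
s = [Fraction(0)]
s = step(s, [Fraction(1)], W, U, b)   # push bit 1 -> state 3/4
s = step(s, [Fraction(0)], W, U, b)   # push bit 0 -> state 7/16
```

Because every weight and state is a `Fraction`, the encoded stack is stored exactly, which is what lets such nets simulate tape contents without loss.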

The Dynamic Universality of Sigmoidal Neural Networks

The techniques can be applied to a much more general class of “sigmoidal-like” activation functions, suggesting that Turing universality is a relatively common property of recurrent neural network models.

Turing on Super-Turing and adaptivity.

  • H. Siegelmann
  • Biology, Computer Science
    Progress in biophysics and molecular biology
  • 2013

Interactive Foundations of Computing

  • P. Wegner
  • Computer Science
    Theor. Comput. Sci.
  • 1998

Beyond the Turing Limit: Evolving Interactive Systems

It is argued that ITMs with advice can serve as an adequate reference model for capturing the essence of computations by evolving interactive systems, showing that 'in theory' the latter are provably more powerful than classical systems.

How We Think of Computing Today

Two models inspired by key mechanisms of current systems in both artificial and natural environments are proposed: evolving automata and interactive Turing machines with advice. The two models are shown to be equivalent, and both are provably more computationally powerful than the models covered by the old computing paradigm.

Computation Beyond the Turing Limit

A simply described but highly chaotic dynamical system called the analog shift map is presented here, which has computational power beyond the Turing limit (super-Turing); it computes exactly like neural networks and analog machines.
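The chaotic base dynamics underlying the analog shift map can be sketched as follows. This is a minimal illustration of the plain binary shift only: the full analog shift map also substitutes a finite window of digits before shifting, which this toy omits, and the function names are chosen here for illustration:

```python
# Hedged sketch: the binary shift map that the "analog shift map" builds on.
# (The full analog shift also rewrites a finite dotted window before
# shifting; this minimal version only shifts.)
def shift(x, k=1):
    """Left-shift map on [0, 1): x -> 2^k * x mod 1, i.e. drop k leading bits."""
    return (x * 2 ** k) % 1.0

def bits(x, n):
    """First n binary digits of x in [0, 1)."""
    out = []
    for _ in range(n):
        x *= 2
        d = int(x)
        out.append(d)
        x -= d
    return out

# Example: 0.6875 = 0.1011 in binary; one shift drops the leading bit.
x = 0.6875
y = shift(x)          # 0.375 = 0.011 in binary
```

The shift discards one digit of the binary expansion per step, so tiny differences in initial condition surface after finitely many iterations; this sensitivity is the "highly chaotic" behaviour the abstract refers to.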

Persistent Turing Machines as a Model of Interactive Computation

The methods and tools for formalizing PTM computation developed in this paper can serve as a basis for a more comprehensive theory of interactive computation.

Turing Machines, Transition Systems, and Interaction