Corpus ID: 7830328

Turing Computation with Recurrent Artificial Neural Networks

@article{Carmantini2015TuringCW,
  title={Turing Computation with Recurrent Artificial Neural Networks},
  author={Giovanni Sirio Carmantini and Peter beim Graben and Mathieu Desroches and Serafim Rodrigues},
  journal={ArXiv},
  year={2015},
  volume={abs/1511.01427}
}
We improve the results by Siegelmann & Sontag (1995) by providing a novel and parsimonious constructive mapping between Turing Machines and Recurrent Artificial Neural Networks, based on recent developments of Nonlinear Dynamical Automata. The architecture of the resulting R-ANNs is simple and elegant, stemming from its transparent relation with the underlying NDAs. These characteristics yield promise for developments in machine learning methods and symbolic computation with continuous time… 
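As a rough illustration of the Nonlinear Dynamical Automata idea the abstract refers to (not the authors' exact construction), a Turing-machine configuration can be Gödel-encoded as a point in the unit square: one coordinate encodes the reversed left half of the tape, the other encodes the symbol under the head plus the right half, and a head move becomes a piecewise-affine map on that square. A minimal sketch in Python, with a purely hypothetical three-symbol alphabet:

```python
def godel_encode(symbols, base):
    """Map a one-sided symbol sequence (s_1, s_2, ...) over {0, ..., base-1}
    to the real number sum_i s_i * base**-i in [0, 1)."""
    return sum(s * base ** -(i + 1) for i, s in enumerate(symbols))

# A configuration ... l2 l1 [head] r1 r2 ... becomes a point (x, y) in the
# unit square: x encodes the left tape read right-to-left, y the right tape
# with the head symbol first.  The alphabet {0, 1, 2} is illustrative only.
left = [1, 0, 1]    # reversed left tape
right = [2, 0, 1]   # symbol under the head, then right tape
x, y = godel_encode(left, base=3), godel_encode(right, base=3)

# Shifting the head one cell to the right is an affine map on the encodings:
# pop the leading digit off y and push it onto x.
head = int(3 * y)        # leading base-3 digit of y (the head symbol)
x_new = (x + head) / 3   # prepend the head symbol to the left-tape encoding
y_new = 3 * y - head     # drop the head symbol from the right-tape encoding
```

Here `x_new` and `y_new` encode the shifted configuration exactly; a full NDA additionally branches on the machine state and the head symbol, giving a piecewise-affine map with one affine branch per transition-table entry.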

Citations

Dynamical systems theory for transparent symbolic computation in neuronal networks
TLDR
This thesis presents a new dynamical framework for computation in neuronal networks based on the slow-fast dynamics paradigm, and discusses the consequences of the results for future work, specifically for what concerns the fields of interactive computation and Artificial Intelligence.
Can Biological Quantum Networks Solve NP‐Hard Problems?
  • G. Wendin
  • Biology
    Advanced Quantum Technologies
  • 2019
TLDR
The conclusion is that biological quantum networks can only approximately solve small instances of NP-hard problems, and that artificial intelligence and machine learning implemented in complex dynamical systems based on genuine quantum networks can be expected to show enhanced performance and quantum advantage compared with classical networks.
A Survey on Analog Models of Computation
TLDR
A survey of analog models of computation that considers both approaches, often intertwined, from a point of view mostly oriented toward computability theory.

References

SHOWING 1-10 OF 20 REFERENCES
Neural Turing Machines
TLDR
The combined system is analogous to a Turing Machine or Von Neumann architecture but is differentiable end-to-end, allowing it to be efficiently trained with gradient descent.
Universal neural field computation
TLDR
This chapter implements universal Turing computation in a neural field environment, using the canonical symbologram representation of a Turing machine obtained from a Gödel encoding of its symbolic repertoire and generalized shifts to implement a nonlinear dynamical automaton.
Language Processing by Dynamical Systems
TLDR
It is argued that ERP components are indicators of these bifurcations, and an ERP-like measure of the parsing model is proposed; parsing can be modeled by a switching of the control parameter, in analogy to phase transitions observed in brain dynamics.
Interactive Foundations of Computing
  • P. Wegner
  • Computer Science
    Theor. Comput. Sci.
  • 1998
Unpredictability and undecidability in dynamical systems.
  • Moore
  • Physics
    Physical review letters
  • 1990
TLDR
It is shown that motion with as few as three degrees of freedom can be equivalent to a Turing machine, and so be capable of universal computation.
On computable numbers, with an application to the Entscheidungsproblem
  • A. Turing
  • Computer Science
    Proc. London Math. Soc.
  • 1937
TLDR
This chapter discusses the application of the diagonal process and the universal computing machine, which automates the calculation of circular and circle-free numbers.
Transient Cognitive Dynamics, Metastability, and Decision Making
TLDR
An effective solution for the problem of sequential decision making, represented as a fixed-time game: a player takes sequential actions in a changing, noisy environment so as to maximize a cumulative reward.
Canards, Folded Nodes, and Mixed-Mode Oscillations in Piecewise-Linear Slow-Fast Systems
TLDR
This paper analyzes canonical PWL systems that display folded singularities, primary and secondary canards, with a similar control of the maximal winding number as in the smooth case, and shows that the singular phase portraits are compatible in both frameworks.
Inflection, canards and excitability threshold in neuronal models
TLDR
The concept of inflection set gives a good approximation of the threshold in both the so-called resonator and integrator neuronal cases, and the topological changes that inflection sets undergo upon parameter variation are investigated.
Computability with polynomial differential equations