• Corpus ID: 7830328

Turing Computation with Recurrent Artificial Neural Networks

  • Giovanni Sirio Carmantini, Peter beim Graben, Mathieu Desroches, Serafim Rodrigues
We improve on the results of Siegelmann & Sontag (1995) by providing a novel and parsimonious constructive mapping between Turing Machines and Recurrent Artificial Neural Networks (R-ANNs), based on recent developments of Nonlinear Dynamical Automata (NDAs). The architecture of the resulting R-ANNs is simple and elegant, stemming from its transparent relation with the underlying NDAs. These characteristics yield promise for developments in machine learning methods and symbolic computation with continuous time…
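The NDA construction underlying such mappings rests on a Gödel encoding of the machine's tape into real coordinates. A minimal sketch of that encoding step, assuming a finite tape alphabet (illustrative only; the function name and alphabet are mine, not taken from the paper):

```python
# Sketch (not the paper's code): Goedel-encoding one side of a tape
# into [0, 1), as used in Nonlinear Dynamical Automata. Symbols from
# an alphabet of size N become digits of a base-N expansion, so a
# machine configuration maps to a point in the unit square.

def goedel_encode(symbols, alphabet):
    """Encode a symbol sequence as a base-N fraction in [0, 1)."""
    n = len(alphabet)
    index = {s: i for i, s in enumerate(alphabet)}
    x = 0.0
    for k, s in enumerate(symbols, start=1):
        x += index[s] * n ** (-k)
    return x

alphabet = ["0", "1", "b"]                    # tape symbols, "b" = blank
x = goedel_encode(["1", "0", "1"], alphabet)  # 1/3 + 0/9 + 1/27 = 10/27
print(x)  # ≈ 0.37037
```

A machine step then acts on such coordinates as a piecewise-affine map, which is what makes the relation between automaton and network transparent.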


Dynamical systems theory for transparent symbolic computation in neuronal networks
This thesis presents a new dynamical framework for computation in neuronal networks based on the slow-fast dynamics paradigm, and discusses the consequences of the results for future work, specifically for what concerns the fields of interactive computation and Artificial Intelligence.
Can Biological Quantum Networks Solve NP‐Hard Problems?
  • G. Wendin
  • Biology
    Advanced Quantum Technologies
  • 2019
The conclusion is that biological quantum networks can only approximately solve small instances of NP-hard problems, and artificial intelligence and machine learning implemented in complex dynamical systems based on genuine quantum networks can be expected to show enhanced performance and quantum advantage compared with classical networks.
A Survey on Analog Models of Computation
A survey of analog models of computation that considers both approaches, often intertwined, from a point of view mostly oriented by computability theory.


On the Computational Power of Neural Nets
It is proved that one may simulate all Turing Machines by rational nets in linear time, and there is a net made up of about 1,000 processors which computes a universal partial-recursive function.
Neural Turing Machines
The combined system is analogous to a Turing Machine or Von Neumann architecture but is differentiable end-to-end, allowing it to be trained efficiently with gradient descent.
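The differentiability comes from replacing discrete memory access with a soft, content-based read. A hedged sketch of that idea (an illustration of the general mechanism only; function and variable names are mine, not from the paper):

```python
import numpy as np

# Sketch of content-based addressing in the spirit of Neural Turing
# Machines: a read weighting is a softmax over cosine similarities
# between a key vector and each memory row, so the "read" is a convex
# combination of rows and stays differentiable end-to-end.

def content_weights(memory, key, beta=10.0):
    """Softmax attention over memory rows; beta sharpens the focus."""
    sims = memory @ key / (
        np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    )
    w = np.exp(beta * sims)
    return w / w.sum()

M = np.array([[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]])
w = content_weights(M, np.array([1.0, 0.0]))  # focuses on the first row
read = w @ M  # differentiable read: weighted mix of memory rows
```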
Universal neural field computation
This chapter implements universal Turing computation in a neural field environment using the canonical symbologram representation of a Turing machine obtained from a Gödel encoding of its symbolic repertoire and generalized shifts to implement a nonlinear dynamical automaton.
Quantum Representation Theory for Nonlinear Dynamical Automata
An extension of NDAs is outlined that is able to encode only a “working memory” by a set of initial conditions in the system’s phase space, while incoming new material then acts like “quantum operators” upon the phase space, thus mapping a set of initial conditions onto another set.
Language Processing by Dynamical Systems
It is argued that ERP components are indicators of these bifurcations, and an ERP-like measure of the parsing model is proposed; the process can be modeled by a switching of the control parameter, in analogy to phase transitions observed in brain dynamics.
Turing computability with neural nets
Generalized shifts: unpredictability and undecidability in dynamical systems. Nonlinearity 4, 199–230.
A class of shift-like dynamical systems is presented that displays a wide variety of behaviours, including periodic points, basins of attraction, and time series, and it is shown that they can be embedded in smooth maps in R2, or smooth flows in R3.
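The link between shift-like symbolic dynamics and maps of the interval can be illustrated with the simplest case, the N-adic shift; a sketch under my own notation, not code from the cited work:

```python
# Sketch: the left shift on base-N digit sequences is conjugate to the
# piecewise-linear map x -> N*x mod 1 on [0, 1) -- the simplest
# instance of the shift-like dynamics discussed above.

def shift_map(x, n=2):
    """One step of the N-adic shift: drop the leading base-N digit."""
    return (n * x) % 1.0

def leading_digit(x, n=2):
    """The symbol read off before shifting."""
    return int(n * x)

x = 0.625  # binary 0.101
print(leading_digit(x), shift_map(x))  # 1 0.25
```

Generalized shifts enrich this picture by letting the substituted block depend on the symbols near the read position, which is where Turing-machine behaviour, and hence undecidability, enters.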
Interactive Foundations of Computing
  • P. Wegner
  • Computer Science
    Theor. Comput. Sci.
  • 1998
Toward an interpretation of dynamic neural activity in terms of chaotic dynamical systems.
  • I. Tsuda
  • Biology
    The Behavioral and brain sciences
  • 2001
A new coding scheme of information in chaos-driven contracting systems, which the authors refer to as Cantor coding, is proposed; it is found in the hippocampal formation and also in the olfactory system, and should be of biological significance.
Inverse problems in dynamic cognitive modeling.
A Tikhonov-Hebbian learning method is suggested as regularization technique and its validity and robustness are demonstrated for basic examples of cognitive computations.