On the computational power of neural nets

@inproceedings{Siegelmann1992OnTC,
  title={On the computational power of neural nets},
  author={Hava T. Siegelmann and Eduardo Sontag},
  booktitle={COLT '92},
  year={1992}
}
This paper deals with finite networks which consist of interconnections of synchronously evolving processors. Each processor updates its state by applying a “sigmoidal” scalar nonlinearity to a linear combination of the previous states of all units. We prove that one may simulate all Turing Machines by rational nets. In particular, one can do this in linear time, and there is a net made up of about 1,000 processors which computes a universal partial-recursive function. Products (high order nets… 
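To make the model concrete, here is a minimal sketch of the update rule the abstract describes: every processor synchronously applies a scalar nonlinearity to a rational linear combination of the previous states of all units. The saturated-linear activation below matches the "sigmoidal" function used in this line of work, but the toy weights are illustrative assumptions, not the paper's universal-net construction.

```python
from fractions import Fraction

def sigma(x):
    # Saturated-linear "sigmoidal" nonlinearity: clip to [0, 1].
    return min(max(x, Fraction(0)), Fraction(1))

def step(W, b, x):
    # Synchronous update: every processor applies sigma to a linear
    # combination of the previous states of all units.
    return [sigma(sum(w * xj for w, xj in zip(row, x)) + bi)
            for row, bi in zip(W, b)]

# Toy 2-processor rational net; the weights are arbitrary illustrative values.
W = [[Fraction(1, 2), Fraction(1, 4)],
     [Fraction(-1, 3), Fraction(1, 1)]]
b = [Fraction(1, 8), Fraction(0)]
x = [Fraction(0), Fraction(1)]
for _ in range(3):
    x = step(W, b, x)
print(x)  # exact rational states after three synchronous updates
```

Using exact fractions keeps the simulation over the rationals, which is the setting in which the Turing-universality result holds.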
Recurrent Neural Networks and Finite Automata
TLDR
Finite-size networks consisting of interconnections of synchronously evolving processors are studied, proving that any activation function whose left and right limits exist can be applied to the neurons to yield a network that is at least as strong computationally as a finite automaton.
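As a toy instance of the automaton simulation this summary refers to, the sketch below (an illustrative assumption, not the cited construction) uses a single saturating neuron to simulate the two-state DFA that accepts exactly the binary strings containing at least one 1.

```python
def sigma(x):
    # Saturated-linear activation: 0 below 0, identity on [0, 1], 1 above 1.
    return min(max(x, 0.0), 1.0)

def run(bits):
    # One saturating neuron simulating the two-state DFA that accepts
    # any string containing at least one 1: s' = sigma(s + u).
    s = 0.0                  # DFA start state: "no 1 seen yet"
    for u in bits:
        s = sigma(s + u)     # saturation keeps the state in {0, 1}
    return s == 1.0          # accepting state: "a 1 was seen"

assert run([0, 0, 1, 0]) is True
assert run([0, 0, 0]) is False
```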
Analog computation via neural networks
TLDR
The authors pursue a particular approach to analog computation, based on dynamical systems of the type used in neural networks research, which exhibit at least some robustness with respect to noise and implementation errors.
Turing's analysis of computation and artificial neural networks
TLDR
The proposed simulation is in agreement with the correct interpretation of Turing's analysis of computation; compatible with the current approaches to analyze cognition as an interactive agent-environment process; and physically realizable since it does not use connection weights with unbounded precision.
Computational Power of Neuroidal Nets
TLDR
It is shown that the computational power of neuroidal nets crucially depends on the size of allowable weights, and that the former nets are computationally equivalent to standard, non-programmable discrete neural nets, while, quite surprisingly, the latter nets are computationally equivalent to a certain kind of analog neural net.
Some structural complexity aspects of neural computation
TLDR
Connections to space-bounded classes, simulation of parallel computational models such as Vector Machines, and a discussion of the characterizations of various nonuniform classes in terms of Kolmogorov complexity are presented.
The Power of Extra Analog Neuron
TLDR
A finite automaton with a register is introduced and shown to be computationally equivalent to a hybrid binary-state network with an extra analog unit, and a sufficient condition for a language accepted by this automaton to be regular is found.
Neural networks between integer and rational weights
  • Jirí Síma
  • Computer Science
    2017 International Joint Conference on Neural Networks (IJCNN)
  • 2017
TLDR
An intermediate model of binary-state neural networks with integer weights, corresponding to finite automata, is studied and extended with an extra analog unit with rational weights, since two additional analog units already suffice for Turing universality.
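The power of a single analog unit with rational weights comes from packing unbounded discrete structure, such as a stack, into one rational state value. The sketch below uses the Cantor-style base-4 encoding familiar from the Siegelmann–Sontag line of work; the specific affine maps are one standard choice, not necessarily the weights of the cited paper.

```python
from fractions import Fraction

def sigma(x):
    # Saturated-linear activation, exact over the rationals.
    return min(max(x, Fraction(0)), Fraction(1))

def push(q, bit):
    # Encode stack a1 a2 ... as q = sum_i (2*a_i + 1) / 4**i  (Cantor-style).
    return q / 4 + Fraction(2 * bit + 1, 4)

def top(q):
    # sigma(4q - 2) is 1 iff the top bit is 1 (q in [3/4, 1)), else 0.
    return sigma(4 * q - 2)

def pop(q):
    # Remove the top symbol by inverting the push affine map.
    return 4 * q - (2 * top(q) + 1)

q = Fraction(0)          # empty stack
for b in (1, 0, 1):      # push 1, then 0, then 1
    q = push(q, b)
assert top(q) == 1
q = pop(q)
assert top(q) == 0
```

Every operation here is an affine map followed by the saturating nonlinearity, which is exactly the kind of step one analog neuron can perform.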
Computational power of neural networks: a Kolmogorov complexity characterization
TLDR
This work characterizes the computational power of neural networks with real weights in terms of the Kolmogorov complexity of those weights, and shows that such networks can be classified into an infinite hierarchy of different computing capabilities.
Foundations of recurrent neural networks
TLDR
This dissertation focuses on the "recurrent network" model, in which the underlying graph is not subject to any constraints, and establishes a precise correspondence between the mathematical and computing choices.
...

References

Showing 1-10 of 62 references
Analog computation via neural networks
TLDR
The authors pursue a particular approach to analog computation, based on dynamical systems of the type used in neural networks research, which exhibit at least some robustness with respect to noise and implementation errors.
Turing equivalence of neural networks with second order connection weights
  • G. Sun, H. Chen, Y. Lee
  • Computer Science
    IJCNN-91-Seattle International Joint Conference on Neural Networks
  • 1991
TLDR
It is proven that for any given Turing machine there exists a recurrent neural network with local, second-order, and uniformly connected weights which can simulate it.
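For orientation, "second-order" connection weights multiply a state unit by an input unit before the weighted sum, so the update has the form x_i(t+1) = sigma(sum over j, k of W[i][j][k] * x_j(t) * u_k(t)). A minimal sketch, with arbitrary illustrative weights:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def second_order_step(W, x, u):
    # Second-order update: each new state applies a nonlinearity to a
    # weighted sum of *products* of a state unit and an input unit.
    return [sigmoid(sum(W[i][j][k] * x[j] * u[k]
                        for j in range(len(x))
                        for k in range(len(u))))
            for i in range(len(x))]

# Tiny example: 2 state units, 2 input lines, arbitrary illustrative weights.
W = [[[0.5, -1.0], [1.5, 0.0]],
     [[-0.5, 2.0], [0.0, 1.0]]]
x = [0.1, 0.9]
u = [1.0, 0.0]   # one-hot encoding of the current input symbol
print(second_order_step(W, x, u))
```

With a one-hot input, the multiplication selects one weight matrix per symbol, which is why this architecture maps so naturally onto automaton transition tables.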
Neural computability. II
TLDR
The authors show that neural networks are at least as powerful as cellular automata and that the converse is true for finite networks, and suggest that the full classes are probably identical.
Finite State Automata and Simple Recurrent Networks
TLDR
Studies a network architecture introduced by Elman (1988) for predicting successive elements of a sequence, and shows that long-distance sequential contingencies can be encoded by the network even if only subtle statistical properties of embedded strings depend on the early information.
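For reference, an Elman-style simple recurrent network feeds a copy of the previous hidden activations back in alongside the current input and predicts the next element of the sequence. A minimal sketch; the dimensions and random weights here are illustrative assumptions:

```python
import math, random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def srn_step(Wxh, Whh, Why, x, h):
    # Hidden layer sees the current input plus a copy ("context layer")
    # of its own previous activations; the output predicts the next item.
    h_new = [sigmoid(sum(Wxh[i][j] * x[j] for j in range(len(x)))
                     + sum(Whh[i][k] * h[k] for k in range(len(h))))
             for i in range(len(h))]
    y = [sigmoid(sum(Why[o][i] * h_new[i] for i in range(len(h_new))))
         for o in range(len(Why))]
    return h_new, y

random.seed(0)
Wxh = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(3)]
Whh = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(3)]
Why = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
h = [0.0, 0.0, 0.0]
for x in ([1.0, 0.0], [0.0, 1.0]):   # a two-symbol one-hot sequence
    h, y = srn_step(Wxh, Whh, Why, x, h)
print(y)  # untrained next-element prediction
```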
Representation of Events in Nerve Nets and Finite Automata
TLDR
This memorandum is devoted to an elementary exposition of the problems and of results obtained on the McCulloch-Pitts nerve net during investigations in August 1951.
Second-order recurrent neural networks for grammatical inference
TLDR
It is shown that a recurrent, second-order neural network using a real-time, feedforward training algorithm readily learns to infer regular grammars from positive and negative string training samples and many of the neural net state machines are dynamically stable and correctly classify long unseen strings.
Learning and Extracting Finite State Automata with Second-Order Recurrent Neural Networks
TLDR
It is shown that a recurrent, second-order neural network using a real-time, forward training algorithm readily learns to infer small regular grammars from positive and negative string training samples, and many of the neural net state machines are dynamically stable, that is, they correctly classify many long unseen strings.
An introduction to computing with neural nets
TLDR
This paper provides an introduction to the field of artificial neural nets by reviewing six important neural net models that can be used for pattern classification and exploring how some existing classification and clustering algorithms can be performed using simple neuron-like components.
A logical calculus of the ideas immanent in nervous activity
TLDR
It is shown that many particular choices among possible neurophysiological assumptions are equivalent, in the sense that for every net behaving under one assumption, there exists another net which behaves under the other and gives the same results, although perhaps not in the same time.
Artificial neural network on a SIMD architecture
TLDR
The proof-of-concept neural network is a multilayered perceptron model that uses the back-propagation learning paradigm, has fewer than 100 nodes in three layers, and is trained to recognize letters of the alphabet.
...