On the computational power of neural nets
@inproceedings{Siegelmann1992OnTC,
  title     = {On the computational power of neural nets},
  author    = {Hava T. Siegelmann and Eduardo Sontag},
  booktitle = {COLT '92},
  year      = {1992}
}
This paper deals with finite networks which consist of interconnections of synchronously evolving processors. Each processor updates its state by applying a “sigmoidal” scalar nonlinearity to a linear combination of the previous states of all units. We prove that one may simulate all Turing Machines by rational nets. In particular, one can do this in linear time, and there is a net made up of about 1,000 processors which computes a universal partial-recursive function. Products (high order nets…
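The update rule described in the abstract can be written down directly. Below is a minimal sketch of one synchronous step of a rational net, assuming the saturated-linear "sigmoid" used in the paper; the names `step`, `A`, `B`, and `c` are illustrative, not the paper's notation.

```python
from fractions import Fraction  # exact rational arithmetic, as in "rational nets"

def sigma(x):
    """Saturated-linear 'sigmoid': 0 below 0, identity on [0, 1], 1 above 1."""
    return min(Fraction(1), max(Fraction(0), x))

def step(x, u, A, B, c):
    """One synchronous update of all n processors:
    x_i(t+1) = sigma( sum_j A[i][j]*x_j(t) + sum_k B[i][k]*u_k(t) + c[i] )."""
    n = len(x)
    return [
        sigma(
            sum(A[i][j] * x[j] for j in range(n))
            + sum(B[i][k] * u[k] for k in range(len(u)))
            + c[i]
        )
        for i in range(n)
    ]
```

Exact rational arithmetic is the point of `Fraction` here: the Turing-machine simulation stores unbounded tape contents in the rational value of a single neuron's state, which floating point could not represent.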
114 Citations
Recurrent Neural Networks and Finite Automata
- Computer Science · Comput. Intell.
- 1996
Finite-size networks consisting of interconnections of synchronously evolving processors are studied; it is proved that any activation function whose left and right limits exist and differ can be applied to the neurons to yield a network that is at least as strong computationally as a finite automaton.
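To make the limit condition concrete, here is a toy illustration (my own, not from the paper): tanh has distinct limits at minus and plus infinity, and at high gain a single tanh neuron latches a bit, which already gives a two-state automaton.

```python
import math

GAIN = 10.0  # large gain drives tanh toward its saturation values -1 and +1

def sigma(x):
    # any activation whose limits at -infinity and +infinity exist and differ
    # can play this role; tanh (limits -1 and +1) is one concrete choice
    return math.tanh(GAIN * x)

def contains_a_one(bits):
    """A single saturating neuron acting as a 2-state automaton that
    latches once the input stream has produced a 1."""
    s = -1.0                    # state: 'no 1 seen yet'
    for u in bits:              # u is 0 or 1
        s = sigma(2.0 * u + s)  # illustrative hand-picked weights
    return s > 0.0              # accept iff the latch flipped

assert contains_a_one([0, 0, 1, 0]) and not contains_a_one([0, 0, 0])
```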
Analog computation via neural networks
- Computer Science · [1993] The 2nd Israel Symposium on Theory and Computing Systems
- 1993
The authors pursue a particular approach to analog computation, based on dynamical systems of the type used in neural networks research, which exhibit at least some robustness with respect to noise and implementation errors.
Turing's analysis of computation and artificial neural networks
- Computer Science · J. Intell. Fuzzy Syst.
- 2002
The proposed simulation is in agreement with the correct interpretation of Turing's analysis of computation; compatible with the current approaches to analyze cognition as an interactive agent-environment process; and physically realizable since it does not use connection weights with unbounded precision.
Computational Power of Neuroidal Nets
- Computer Science · SOFSEM
- 1999
It is shown that the computational power of neuroidal nets crucially depends on the size of the allowable weights, and that the former nets are computationally equivalent to standard, non-programmable discrete neural nets, while, quite surprisingly, the latter nets are computationally equivalent to a certain kind of analog neural net.
Some structural complexity aspects of neural computation
- Computer Science · [1993] Proceedings of the Eighth Annual Structure in Complexity Theory Conference
- 1993
Connections to space-bounded classes, simulation of parallel computational models such as Vector Machines, and a discussion of the characterizations of various nonuniform classes in terms of Kolmogorov complexity are presented.
The Power of Extra Analog Neuron
- Computer Science · TPNC
- 2014
A finite automaton with a register is introduced and shown to be computationally equivalent to a hybrid binary-state network with an extra analog unit, and a sufficient condition is found for a language accepted by this automaton to be regular.
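A rough sketch of that kind of machine, under my reading of the model (the single threshold 1/2 and all names are illustrative assumptions): a DFA whose transitions may test one rational register against a threshold and update it by an affine map.

```python
from fractions import Fraction

def run_register_automaton(word, delta, q0, accepting):
    """Finite automaton with one rational register.
    delta maps (state, symbol, register_above_half) to (next_state, a, b),
    where the register then updates as r <- a*r + b."""
    q, r = q0, Fraction(0)
    for sym in word:
        q, a, b = delta[(q, sym, r > Fraction(1, 2))]
        r = a * r + b
    return q in accepting
```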
Neural networks between integer and rational weights
- Computer Science · 2017 International Joint Conference on Neural Networks (IJCNN)
- 2017
An intermediate model of binary-state neural networks with integer weights, corresponding to finite automata, is studied and extended with an extra analog unit with rational weights, since two additional analog units already suffice for Turing universality.
Computational power of neural networks: a Kolmogorov complexity characterization
- Computer Science
- 1993
This work characterizes the computational power of neural networks in terms of the Kolmogorov complexity of their real weights, and shows that neural networks can be classified into an infinite hierarchy of different computing capabilities.
On the Computational Power of Recurrent Neural Networks for Structures
- Computer Science · Neural Networks
- 1997
Foundations of recurrent neural networks
- Computer Science
- 1993
This dissertation focuses on the "recurrent network" model, in which the underlying graph is not subject to any constraints, and establishes a precise correspondence between the class of weights allowed (integer, rational, or real) and the resulting computational power.
References
SHOWING 1-10 OF 62 REFERENCES
Analog computation via neural networks
- Computer Science · [1993] The 2nd Israel Symposium on Theory and Computing Systems
- 1993
The authors pursue a particular approach to analog computation, based on dynamical systems of the type used in neural networks research, which exhibit at least some robustness with respect to noise and implementation errors.
Turing equivalence of neural networks with second order connection weights
- Computer Science · IJCNN-91-Seattle International Joint Conference on Neural Networks
- 1991
It is proven that for any given Turing machine there exists a recurrent neural network with local, second-order, and uniformly connected weights which can simulate it.
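"Second-order" here means the weights multiply products of a state unit and an input unit, rather than single units. A minimal sketch of the update, with illustrative names:

```python
import math

def second_order_step(s, u, W):
    """x_i(t+1) = sigmoid( sum_{j,k} W[i][j][k] * s_j(t) * u_k(t) ).
    The products s_j * u_k are the 'second-order' terms."""
    n, m = len(s), len(u)
    return [
        1.0 / (1.0 + math.exp(-sum(W[i][j][k] * s[j] * u[k]
                                   for j in range(n) for k in range(m))))
        for i in range(n)
    ]
```

With a one-hot input u, each slice of W indexed by the active symbol acts as a separate transition matrix, which is why these nets map so naturally onto automaton transition tables.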
Neural computability. II
- Computer Science · International 1989 Joint Conference on Neural Networks
- 1989
The authors show that neural networks are at least as powerful as cellular automata and that the converse is true for finite networks, and suggest that the full classes are probably identical.
Finite State Automata and Simple Recurrent Networks
- Computer Science · Neural Computation
- 1989
A network architecture introduced by Elman (1988) for predicting successive elements of a sequence is studied, and it is shown that long-distance sequential contingencies can be encoded by the network even if only subtle statistical properties of embedded strings depend on the early information.
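For reference, Elman's simple recurrent network feeds the previous hidden state back as "context". A minimal numpy sketch (weight names are my own):

```python
import numpy as np

def elman_step(x, h_prev, Wxh, Whh, Who, bh, bo):
    """One step of a simple recurrent (Elman) network:
    h(t) = tanh(Wxh @ x(t) + Whh @ h(t-1) + bh)   # context = previous hidden state
    y(t) = softmax(Who @ h(t) + bo)               # prediction of the next element
    """
    h = np.tanh(Wxh @ x + Whh @ h_prev + bh)
    z = Who @ h + bo
    y = np.exp(z - z.max())          # numerically stable softmax
    return h, y / y.sum()
```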
Representation of Events in Nerve Nets and Finite Automata
- Computer Science
- 1951
This memorandum is devoted to an elementary exposition of the problems and of results obtained on the McCulloch-Pitts nerve net during investigations in August 1951.
Second-order recurrent neural networks for grammatical inference
- Computer Science · IJCNN-91-Seattle International Joint Conference on Neural Networks
- 1991
It is shown that a recurrent, second-order neural network using a real-time, feedforward training algorithm readily learns to infer regular grammars from positive and negative string training samples, and many of the neural net state machines are dynamically stable and correctly classify long unseen strings.
Learning and Extracting Finite State Automata with Second-Order Recurrent Neural Networks
- Computer Science · Neural Computation
- 1992
It is shown that a recurrent, second-order neural network using a real-time, forward training algorithm readily learns to infer small regular grammars from positive and negative string training samples, and many of the neural net state machines are dynamically stable, that is, they correctly classify many long unseen strings.
An introduction to computing with neural nets
- Computer Science · IEEE ASSP Magazine
- 1987
This paper provides an introduction to the field of artificial neural nets by reviewing six important neural net models that can be used for pattern classification and exploring how some existing classification and clustering algorithms can be performed using simple neuron-like components.
A logical calculus of the ideas immanent in nervous activity
- Psychology · The Philosophy of Artificial Intelligence
- 1990
It is shown that many particular choices among possible neurophysiological assumptions are equivalent, in the sense that for every net behaving under one assumption, there exists another net which behaves under the other and gives the same results, although perhaps not in the same time.
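The calculus is built from threshold units. A two-function illustration (mine, not from the paper) of how basic logic falls out of one threshold choice:

```python
def mp_neuron(inputs, weights, threshold):
    """McCulloch-Pitts unit: fire (1) iff the weighted sum reaches the threshold."""
    return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)

AND = lambda a, b: mp_neuron([a, b], [1, 1], 2)  # fires only when both inputs fire
OR  = lambda a, b: mp_neuron([a, b], [1, 1], 1)  # fires when at least one input fires
```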
Artificial neural network on a SIMD architecture
- Computer Science · Proceedings, 2nd Symposium on the Frontiers of Massively Parallel Computation
- 1988
The proof-of-concept neural network is a multilayered perceptron that uses the back-propagation learning paradigm, has fewer than 100 nodes in three layers, and is trained to recognize letters of the alphabet.
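As a reminder of the model being ported to SIMD hardware, here is a minimal sketch of a three-layer perceptron forward pass with logistic units (names and shapes are illustrative; back-propagation differentiates exactly this computation):

```python
import numpy as np

def mlp_forward(x, W1, b1, W2, b2):
    """Three layers (input, hidden, output), logistic units throughout."""
    h = 1.0 / (1.0 + np.exp(-(W1 @ x + b1)))  # hidden layer
    y = 1.0 / (1.0 + np.exp(-(W2 @ h + b2)))  # output layer, e.g. one unit per letter
    return h, y
```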