An evolutionary algorithm that constructs recurrent neural networks

Abstract

Standard methods for simultaneously inducing the structure and weights of recurrent neural networks limit every task to an assumed class of architectures. Such a simplification is necessary since the interactions between network structure and function are not well understood. Evolutionary computations, which include genetic algorithms and evolutionary programming, are population-based search methods that have shown promise in many similarly complex tasks. This paper argues that genetic algorithms are inappropriate for network acquisition and describes an evolutionary program, called GNARL, that simultaneously acquires both the structure and weights for recurrent networks. GNARL's empirical acquisition method allows for the emergence of complex behaviors and topologies that are potentially excluded by the artificial architectural constraints imposed in standard network induction methods.
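The abstract describes GNARL as an evolutionary program that mutates both the topology and the weights of recurrent networks rather than relying on crossover over a fixed architecture. The sketch below illustrates that general evolutionary-programming loop only; the network representation, the temperature-scaled mutation severity, the specific operators, and all names and parameters are assumptions for illustration and are not the authors' implementation.

```python
# Hedged sketch of a GNARL-style evolutionary-programming loop.
# Assumptions: non-negative fitness, illustrative mutation operators and rates.
import random
import copy

class RecurrentNet:
    """A recurrent network as a set of hidden nodes plus weighted links."""
    def __init__(self, n_in, n_out):
        self.n_in, self.n_out = n_in, n_out
        self.hidden = []                 # ids of hidden nodes
        self.weights = {}                # (src, dst) -> weight; self-links allowed
        self.next_id = n_in + n_out

    def add_node(self):
        self.hidden.append(self.next_id)
        self.next_id += 1

    def remove_node(self):
        if self.hidden:
            node = self.hidden.pop(random.randrange(len(self.hidden)))
            # drop every link touching the removed node
            self.weights = {k: w for k, w in self.weights.items() if node not in k}

    def add_link(self):
        nodes = (list(range(self.n_in)) + self.hidden
                 + list(range(self.n_in, self.n_in + self.n_out)))
        src, dst = random.choice(nodes), random.choice(nodes)
        self.weights.setdefault((src, dst), random.uniform(-1.0, 1.0))

    def remove_link(self):
        if self.weights:
            del self.weights[random.choice(list(self.weights))]

def mutate(net, temperature):
    """Parametric and structural mutation; severity grows with temperature,
    so poorly performing parents are perturbed more strongly."""
    child = copy.deepcopy(net)
    for key in child.weights:                      # parametric: jitter weights
        child.weights[key] += random.gauss(0.0, temperature)
    n_structural = 1 + int(temperature * 3)        # structural: add/remove parts
    for _ in range(n_structural):
        random.choice([child.add_node, child.remove_node,
                       child.add_link, child.remove_link])()
    return child

def evolve(fitness, n_in, n_out, pop_size=50, generations=100):
    """Keep the fitter half each generation; fill the rest with mutated copies."""
    population = [RecurrentNet(n_in, n_out) for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        best_f = fitness(ranked[0]) or 1e-9
        parents = ranked[:pop_size // 2]
        children = [mutate(p, 1.0 - fitness(p) / best_f) for p in parents]
        population = parents + children
    return max(population, key=fitness)
```

A caller would supply a task-specific `fitness` function that simulates each candidate network on the problem of interest; everything else above is a generic mutation-plus-truncation-selection loop, not the paper's algorithm verbatim.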

DOI: 10.1109/72.265960

Cite this paper

@article{Angeline1994AnEA,
  title   = {An evolutionary algorithm that constructs recurrent neural networks},
  author  = {Peter J. Angeline and Gregory M. Saunders and Jordan B. Pollack},
  journal = {IEEE Transactions on Neural Networks},
  year    = {1994},
  volume  = {5},
  number  = {1},
  pages   = {54--65}
}