In classical computation, rational- and real-weighted recurrent neural networks have been shown to be, respectively, equivalent to and strictly more powerful than the standard Turing machine model. Here, we…

We study the computational capabilities of a biologically inspired neural model where the synaptic weights, the connectivity pattern, and the number of neurons can evolve over time rather than stay…

The 2011 International Joint Conference on Neural…

The computational power of recurrent neural networks is intimately related to the nature of their synaptic weights. In particular, neural networks with static rational weights are known to be Turing…
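The Turing-equivalence result for rational-weight networks rests on encoding an unbounded binary stack into a single rational-valued neuron: bits are stored as base-4 digits (a Cantor-like encoding), and push, pop, and top are realized by affine updates passed through a saturated-linear activation. A minimal sketch in Python — the function names `push`, `pop`, and `top` are illustrative, not taken from any of the papers above:

```python
from fractions import Fraction

def sigma(x):
    # saturated-linear activation: clips x to [0, 1]
    return min(max(x, Fraction(0)), Fraction(1))

def push(q, bit):
    # prepend a bit as a base-4 digit: 1 encodes bit 0, 3 encodes bit 1
    return sigma(q / 4 + Fraction(2 * bit + 1, 4))

def top(q):
    # read the topmost bit with a single saturated-linear unit:
    # top digit 3 -> 4q - 2 >= 1 -> output 1; top digit 1 -> output 0
    return sigma(4 * q - 2)

def pop(q):
    # shift out the top digit, leaving the encoding of the rest
    return sigma(4 * q - 2 * top(q) - 1)

# encode the stack [1, 0] by pushing 1 then 0 (0 ends up on top)
q = Fraction(0)
q = push(q, 1)   # q = 3/4
q = push(q, 0)   # q = 7/16
```

Since every operation is an affine map followed by the saturation, each can be wired as one neuron of the network, and the stack contents stay exactly representable as a rational state value.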

We consider analog recurrent neural networks working on infinite input streams, provide a complete topological characterization of their expressive power, and compare it to the expressive power of…

We consider a model of evolving recurrent neural networks where the synaptic weights can change over time, and we study the computational power of such networks in a basic context of interactive…

We provide a novel refined attractor-based complexity measurement for Boolean recurrent neural networks that represents an assessment of their computational power in terms of the significance of…

We consider a model of so-called hybrid recurrent neural networks composed of Boolean input and output cells as well as sigmoid internal cells. When subjected to some infinite binary input stream,…

According to the Church-Turing Thesis, the classical Turing machine model is capable of capturing all possible aspects of algorithmic computation. However, in neural computation, several basic neural…