Long short-term memory (LSTM; Hochreiter & Schmidhuber, 1997) can solve numerous tasks not solvable by previous learning algorithms for recurrent neural networks (RNNs). We identify a weakness of…
Previous work on learning regular languages from exemplary training sequences showed that long short-term memory (LSTM) outperforms traditional recurrent neural networks (RNNs). We demonstrate LSTMs…
The temporal distance between events conveys information essential for numerous sequential tasks such as motor control and rhythm detection. While Hidden Markov Models tend to ignore this…
This paper presents some simplifications to our recently introduced "CoDi-model", which we use to evolve Cellular Automata based neural network modules for ATR's artificial brain project "CAM-Brain"…
The long short-term memory (LSTM) network trained by gradient descent solves difficult problems which traditional recurrent neural networks in general cannot. We have recently observed that the…
This paper reports on recent progress made in ATR's attempt to build an artificial brain of 10,000 evolved neural net modules to control the behaviors of a life-sized robot kitten.
Long Short-Term Memory (LSTM) is able to solve many time series tasks unsolvable by feed-forward networks using fixed-size time windows. Here we find that LSTM's superiority does not carry over to…
We report results on benchmarking Open Information Extraction (OIE) systems using RelVis, a toolkit for benchmarking such systems. Our comprehensive benchmark contains three…
In response to Rodriguez's recent article (2001), we compare the performance of simple recurrent nets and long short-term memory recurrent nets on context-free and context-sensitive languages.