Learning to Forget: Continual Prediction with LSTM

@article{Gers2000LearningTF,
  title={Learning to Forget: Continual Prediction with LSTM},
  author={Felix A. Gers and J{\"u}rgen Schmidhuber and Fred A. Cummins},
  journal={Neural Computation},
  year={2000},
  volume={12},
  pages={2451--2471}
}
Long short-term memory (LSTM; Hochreiter & Schmidhuber, 1997) can solve numerous tasks not solvable by previous learning algorithms for recurrent neural networks (RNNs). We identify a weakness of LSTM networks processing continual input streams that are not a priori segmented into subsequences with explicitly marked ends at which the network's internal state could be reset. Without resets, the state may grow indefinitely and eventually cause the network to break down. Our remedy is a novel…
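The failure mode the abstract describes can be made concrete with a minimal sketch (not the paper's implementation): a single 1997-style LSTM memory cell, driven by a continual, never-segmented input stream. With no reset mechanism, the cell state is a pure accumulator, c_t = c_{t-1} + i_t · g_t, and drifts without bound. All weights and the constant input stream below are illustrative assumptions.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def run_cell(stream, w_in=1.0, w_gate=1.0):
    """Run one memory cell over `stream`; weights are illustrative."""
    c = 0.0
    history = []
    for x in stream:
        i = sigmoid(w_gate * x)   # input gate activation
        g = math.tanh(w_in * x)   # candidate cell input
        c = c + i * g             # no reset/forgetting: pure accumulation
        history.append(c)
    return history

# An unsegmented stream with no marked subsequence ends.
states = run_cell([1.0] * 1000)
print(f"|c| after 10 steps:   {abs(states[9]):.1f}")
print(f"|c| after 1000 steps: {abs(states[-1]):.1f}")
```

On a constant positive stream the state grows linearly with stream length, so squashing functions downstream of the cell saturate and the cell stops behaving like memory — the breakdown the abstract refers to, and the motivation for the remedy the paper proposes.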
Highly Influential: this paper has highly influenced 108 other papers.
Highly Cited: this paper has 1,510 citations.


Citations

Selected publications citing this paper (of 900 extracted citations):

Multimodal Content Analysis for Effective Advertisements on YouTube. 2017 IEEE International Conference on Data Mining (ICDM), 2017. (Highly influenced)

Unsupervised Video Understanding by Reconciliation of Posture Similarities. 2017 IEEE International Conference on Computer Vision (ICCV), 2017. (Highly influenced)

Cells in Multidimensional Recurrent Neural Networks. Journal of Machine Learning Research, 2016. (Highly influenced)

Alternative time representation in dopamine models. Journal of Computational Neuroscience, 2009. (Highly influenced)

[Figure: Citations per Year, '99–'18.] Semantic Scholar estimates that this publication has 1,510 citations based on the available data.

