On the Practical Computational Power of Finite Precision RNNs for Language Recognition

@article{Weiss2018OnTP,
  title={On the Practical Computational Power of Finite Precision RNNs for Language Recognition},
  author={Gail Weiss and Yoav Goldberg and Eran Yahav},
  journal={ArXiv},
  year={2018},
  volume={abs/1805.04908}
}
  • Abstract: While Recurrent Neural Networks (RNNs) are famously known to be Turing complete, this relies on infinite precision in the states and unbounded computation time. [...] Key result: We show empirically that the LSTM does indeed learn to effectively use the counting mechanism.
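  The counting mechanism referred to above can be illustrated with a hand-constructed counter in the spirit of the paper's claim that LSTM cell states can implement counting. The sketch below is illustrative only (it is not one of the paper's trained models): a single integer plays the role of a memory cell that is incremented on `a` and decremented on `b`, recognizing the counter language a^n b^n.

```python
def recognize_anbn(seq: str) -> bool:
    """Recognize a^n b^n (n >= 0) with a single integer 'cell state'.

    The integer acts like an LSTM memory cell used as a counter:
    +1 on 'a', -1 on 'b'. The string is accepted iff the counter
    returns to zero, never goes negative, and no 'a' follows a 'b'.
    """
    count = 0
    seen_b = False
    for ch in seq:
        if ch == "a":
            if seen_b:          # an 'a' after a 'b' breaks the a^n b^n shape
                return False
            count += 1          # add 1 to the cell state
        elif ch == "b":
            seen_b = True
            count -= 1          # subtract 1 from the cell state
            if count < 0:       # more b's than a's so far: reject
                return False
        else:
            return False        # alphabet is {a, b}
    return count == 0           # accept iff the counter is balanced
```

  For example, `recognize_anbn("aaabbb")` accepts while `recognize_anbn("aabbb")` rejects. A finite-state machine cannot track the unbounded count; this is the capability the paper shows LSTMs (but not GRUs or squashed-state RNNs at finite precision) acquire in practice.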
