On the Practical Computational Power of Finite Precision RNNs for Language Recognition

@inproceedings{Weiss2018OnTP,
  title={On the Practical Computational Power of Finite Precision RNNs for Language Recognition},
  author={Gail Weiss and Yoav Goldberg and Eran Yahav},
  booktitle={ACL},
  year={2018}
}
While Recurrent Neural Networks (RNNs) are famously known to be Turing complete, this relies on infinite precision in the states and unbounded computation time. We consider the case of RNNs with finite precision whose computation time is linear in the input length. Under these limitations, we show that different RNN variants have different computational power. In particular, we show that the LSTM and the Elman-RNN with ReLU activation are strictly stronger than the RNN with a squashing activation and the GRU. This is achieved because LSTMs and ReLU-RNNs can easily implement counting behavior. We show empirically that the LSTM does indeed learn to effectively use the counting mechanism.
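
To make the counting claim concrete, here is a minimal illustrative sketch (not the paper's construction; the single-cell setup, the saturated gates, and the ±1 candidate values are assumptions made purely for illustration) of how the LSTM's additive cell-state update c_t = f_t · c_{t-1} + i_t · c̃_t can realize an unbounded counter:

```python
# Minimal sketch (assumed, not the paper's exact construction): one LSTM-style
# cell whose gates are saturated open, so the cell state simply accumulates
# +1 for 'a' and -1 for 'b'. The final state is 0 exactly on strings with
# equal numbers of a's and b's, e.g. a^n b^n.
def counting_cell(seq):
    c = 0.0
    for x in seq:
        f, i = 1.0, 1.0                       # forget/input gates ~ sigmoid saturated at 1
        c_tilde = 1.0 if x == 'a' else -1.0   # candidate value driven by the input weights
        c = f * c + i * c_tilde               # additive update: the cell state is unbounded
    return c

print(counting_cell("aaabbb"))  # 0.0 -> balanced string, as in a^n b^n
print(counting_cell("aaabb"))   # 1.0 -> one unmatched 'a'
```

By contrast, a GRU or a simple tanh-RNN keeps its state squashed into (-1, 1), so with finite precision it can only distinguish a bounded number of count values; this is the intuition behind the separation the abstract describes.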

Citations

Publications citing this paper (28 citations in total; partial list):

  • Modern Neural Network Architectures
  • Sequential neural networks as automata
  • State-Regularized Recurrent Neural Networks
  • LSTM Networks Can Perform Dynamic Counting
  • Theoretical Limitations of Self-Attention in Neural Sequence Models (ArXiv, 2019)
