Recurrent Neural Networks as Weighted Language Recognizers

@inproceedings{Chen2018RecurrentNN,
  title={Recurrent Neural Networks as Weighted Language Recognizers},
  author={Yining Chen and Sorcha Gilroy and A. Maletti and Jonathan May and Kevin Knight},
  booktitle={NAACL-HLT},
  year={2018}
}
We investigate the computational complexity of various problems for simple recurrent neural networks (RNNs) as formal models for recognizing weighted languages. We focus on single-layer, ReLU-activation, rational-weight RNNs with softmax, which are commonly used in natural language processing applications. We show that most problems for such RNNs are undecidable, including consistency, equivalence, minimization, and the determination of the highest-weighted string. However, for consistent…
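To make the model class concrete, the following is a minimal NumPy sketch (not the authors' code) of a single-layer, ReLU-activation RNN with a softmax output layer used as a weighted string recognizer: the weight assigned to a string is the product of the per-step softmax probabilities of its symbols, followed by an end-of-string symbol. The toy vocabulary, dimensions, and variable names are illustrative assumptions, and floating-point values stand in for the rational weights of the formal model.

```python
import numpy as np

# Sketch of a single-layer ReLU RNN with softmax output acting as a
# weighted language recognizer. All parameters are random placeholders.

rng = np.random.default_rng(0)

VOCAB = ["a", "b", "<eos>"]        # <eos> marks end of string (assumed convention)
V, H = len(VOCAB), 4               # vocabulary size, hidden size

W_hh = rng.normal(size=(H, H))     # hidden-to-hidden weights
W_xh = rng.normal(size=(H, V))     # input-to-hidden weights
b_h  = rng.normal(size=H)          # hidden bias
W_hy = rng.normal(size=(V, H))     # hidden-to-output (pre-softmax) weights

def one_hot(i, n):
    v = np.zeros(n)
    v[i] = 1.0
    return v

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def string_weight(s):
    """Weight of string s: product of the softmax probabilities of each
    symbol in turn, then of <eos>, computed from the current hidden state."""
    h = np.zeros(H)
    weight = 1.0
    for sym in list(s) + ["<eos>"]:
        probs = softmax(W_hy @ h)                 # distribution over next symbol
        weight *= probs[VOCAB.index(sym)]
        x = one_hot(VOCAB.index(sym), V)
        h = np.maximum(0.0, W_hh @ h + W_xh @ x + b_h)   # ReLU state update
    return weight

print(string_weight("ab"))   # some value in (0, 1) under this random initialization
```

Under this view, finding the highest-weighted string amounts to searching over all strings for the one maximizing string_weight, which is one of the problems the paper shows to be undecidable in general.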
