
State-Regularized Recurrent Neural Networks

@inproceedings{Wang2019StateRegularizedRN,
  title={State-Regularized Recurrent Neural Networks},
  author={Cheng Wang and Mathias Niepert},
  booktitle={ICML},
  year={2019}
}
Abstract

Recurrent neural networks are a widely used class of neural architectures. They have, however, two shortcomings. First, it is difficult to understand what exactly they learn. Second, they tend to work poorly on sequences requiring long-term memorization, despite having this capacity in principle. We aim to address both shortcomings with a class of recurrent networks that use a stochastic state transition mechanism between cell applications. This mechanism, which we term state-regularization…
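The abstract above is truncated, but it describes the core mechanism: between applications of an ordinary recurrent cell, the hidden state is mapped onto a small set of learnable centroid states via a (possibly stochastic) transition distribution, constraining the state space and making the learned dynamics easier to inspect. Below is a minimal PyTorch sketch of one plausible soft variant of that idea; it is not the authors' implementation, and the class name, the number of centroids, and the temperature parameter are illustrative assumptions rather than details from the paper.

    # Sketch of a state-regularized recurrent cell (soft variant), assuming
    # the hidden state is replaced by a convex combination of learnable
    # centroid states after each ordinary cell update. Names and
    # hyperparameters are our own, not from the paper.
    import torch
    import torch.nn as nn

    class StateRegularizedGRUCell(nn.Module):
        def __init__(self, input_size, hidden_size, num_centroids=10, temperature=1.0):
            super().__init__()
            self.cell = nn.GRUCell(input_size, hidden_size)
            # Learnable centroid states; the regularized hidden state is a
            # mixture of these rows.
            self.centroids = nn.Parameter(torch.randn(num_centroids, hidden_size))
            self.temperature = temperature

        def forward(self, x_t, h_prev):
            u_t = self.cell(x_t, h_prev)                        # ordinary cell update
            scores = u_t @ self.centroids.t() / self.temperature  # (batch, k)
            alpha = torch.softmax(scores, dim=-1)               # transition distribution
            # Soft variant: probabilistic mixture of centroids. A stochastic
            # variant would instead sample a centroid index from alpha.
            h_t = alpha @ self.centroids                        # (batch, hidden)
            return h_t

    # Usage: unroll over a sequence of length 12 with batch size 4.
    cell = StateRegularizedGRUCell(input_size=8, hidden_size=16, num_centroids=5)
    h = torch.zeros(4, 16)
    for x_t in torch.randn(12, 4, 8):
        h = cell(x_t, h)

In this reading, lowering the temperature pushes the transition distribution toward one-hot assignments, so the network increasingly moves between a finite set of discrete states, which plausibly connects to the interpretability goal the abstract states.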

