Corpus ID: 219531782

Do RNN and LSTM have Long Memory?

@inproceedings{Zhao2020DoRA,
  title={Do RNN and LSTM have Long Memory?},
  author={Jingyu Zhao and Feiqing Huang and J. Lv and Y. Duan and Zhen Qin and Guodong Li and Guangjian Tian},
  booktitle={ICML},
  year={2020}
}
Abstract: The LSTM network was proposed to overcome the difficulty of learning long-term dependence, and it has made significant advances in applications. With its successes and drawbacks in mind, this paper raises the question: do RNN and LSTM have long memory? We answer it partially by proving that RNN and LSTM do not have long memory from a statistical perspective. A new definition of long-memory networks is then introduced, which requires the model weights to decay at a polynomial rate. To…
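The distinction the abstract draws can be illustrated with a toy calculation (this is not the paper's code, just a sketch): in a linear recurrence h_t = w·h_{t-1} + x_t, the influence of an input k steps in the past is w^k, which decays geometrically for |w| < 1 — the short-memory regime. The long-memory definition instead asks for weights on past inputs that decay only polynomially, like k^(−d). The function names below are illustrative, not from the paper.

```python
import numpy as np

def rnn_impulse_response(w, k_max):
    """Weight a linear RNN h_t = w*h_{t-1} + x_t places on the
    input k steps in the past: w**k (geometric decay for |w| < 1)."""
    return np.array([w**k for k in range(1, k_max + 1)])

def polynomial_weights(d, k_max):
    """Polynomially decaying weights k**(-d), the long-memory regime."""
    return np.array([float(k)**(-d) for k in range(1, k_max + 1)])

geometric = rnn_impulse_response(0.9, 200)
polynomial = polynomial_weights(0.9, 200)

# Geometric decay becomes negligible quickly; polynomial decay does not:
print(geometric[-1])   # 0.9**200, below 1e-9
print(polynomial[-1])  # 200**(-0.9), still around 8.5e-3
```

The gap between the two tail weights is the statistical intuition behind the paper's negative result: a standard RNN/LSTM forgets distant inputs at a geometric rate, so it cannot match a long-memory process whose dependence decays polynomially.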
    5 Citations

    • Echo Memory-Augmented Network for time series classification
    • Noisy Recurrent Neural Networks
    • Deep Inertial Odometry with Accurate IMU Preintegration
    • Cross-Positional Attention for Debiasing Clicks

    References

    Showing 1–10 of 49 references:

    • Analyzing and Exploiting NARX Recurrent Neural Networks for Long-Term Dependencies
    • The Statistical Recurrent Unit
    • Long Short-Term Memory-Networks for Machine Reading
    • Dilated Recurrent Neural Networks
    • A Statistical Investigation of Long Memory in Language and Music
    • Long Short-Term Memory
    • Long-term Forecasting using Tensor-Train RNNs
    • Learning long-term dependencies with gradient descent is difficult
    • Hierarchical Recurrent Neural Networks for Long-Term Dependencies