Learning to update Auto-associative Memory in Recurrent Neural Networks for Improving Sequence Memorization
@article{Zhang2017LearningTU,
  title   = {Learning to update Auto-associative Memory in Recurrent Neural Networks for Improving Sequence Memorization},
  author  = {Wei Zhang and Bowen Zhou},
  journal = {ArXiv},
  year    = {2017},
  volume  = {abs/1709.06493}
}
Learning to remember long sequences remains a challenging task for recurrent neural networks. Register memory and attention mechanisms have both been proposed to address the issue, but they either incur high computational cost to keep the memory differentiable, or bias the RNN's representation learning toward encoding short local contexts rather than long sequences. Associative memory, which studies how multiple patterns can be compressed into a fixed-size memory, has rarely been considered in…
8 Citations
Rotational Unit of Memory: A Novel Representation Unit for RNNs with Scalable Applications
- Computer Science
- Transactions of the Association for Computational Linguistics
- 2019
MIDS: End-to-End Personalized Response Generation in Untrimmed Multi-Role Dialogue*
- Computer Science
- 2019 International Joint Conference on Neural Networks (IJCNN)
- 2019