State gradients for analyzing memory in LSTM language models
@article{Verwimp2020StateGF,
  title   = {State gradients for analyzing memory in {LSTM} language models},
  author  = {Lyan Verwimp and Hugo Van hamme and Patrick Wambacq},
  journal = {Computer Speech \& Language},
  year    = {2020},
  volume  = {61},
  pages   = {101034}
}
Abstract
Gradients can be used to train neural networks, but they can also be used to interpret them. We investigate how well the inputs of RNNs are remembered by their state by calculating 'state gradients' and applying singular value decomposition (SVD) to the gradient matrix, which reveals which directions in embedding space are remembered and to what extent. Our method can be applied to any RNN and reveals which properties of the embedding space influence the state space, without the need to know and label those properties…
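The analysis described in the abstract comes down to a few tensor operations. Below is a minimal sketch in PyTorch, not the authors' code: it computes a batch-averaged gradient of the hidden state at one time step with respect to the input embedding a fixed delay earlier, then inspects the SVD of that matrix. The names (embed_dim, state_dim, delay) and the toy LSTM with random inputs are illustrative assumptions.

```python
import torch

embed_dim, state_dim, seq_len, delay = 16, 32, 20, 5
lstm = torch.nn.LSTM(embed_dim, state_dim, batch_first=True)

# Random stand-in for a batch of embedded word sequences.
x = torch.randn(8, seq_len, embed_dim, requires_grad=True)
h, _ = lstm(x)  # h: (batch, seq_len, state_dim)

# Jacobian of the state at time t w.r.t. the embedding at time t - delay,
# averaged over the batch: one row per state dimension.
t = seq_len - 1
grad_matrix = torch.zeros(state_dim, embed_dim)
for i in range(state_dim):
    g, = torch.autograd.grad(h[:, t, i].sum(), x, retain_graph=True)
    grad_matrix[i] = g[:, t - delay].mean(dim=0)

# SVD of the average gradient matrix: the right singular vectors are the
# embedding-space directions that are remembered; the singular values
# indicate to what extent.
U, S, Vt = torch.linalg.svd(grad_matrix)
print(S)       # magnitude of memory per direction
print(Vt[:3])  # top remembered directions in embedding space
```

The paper additionally averages gradient matrices over many time steps and sequences; the single-position, single-delay average here is a simplification to keep the sketch short.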
One Citation
Classification of Live/Lifeless Assets with Laser Beams in Different Humidity Environments. Environmental Science, Computer Science. 2020 8th International Symposium on Digital Forensics and Security (ISDFS), 2020.