State gradients for analyzing memory in LSTM language models

@article{Verwimp2020StateGF,
  title={State gradients for analyzing memory in LSTM language models},
  author={Lyan Verwimp and Hugo Van hamme and Patrick Wambacq},
  journal={Comput. Speech Lang.},
  year={2020},
  volume={61},
  pages={101034}
}
Abstract

Gradients can be used to train neural networks, but they can also be used to interpret them. We investigate how well the inputs of RNNs are remembered by their state by calculating ‘state gradients’ and applying singular value decomposition (SVD) to the gradient matrix, which reveals which directions in embedding space are remembered and to what extent. Our method can be applied to any RNN and reveals which properties of the embedding space influence the state space, without the need to know and label those properties…
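The analysis the abstract describes can be made concrete with a small autograd sketch: compute the gradient of a hidden state at time t with respect to the input embedding at time t − τ, then inspect the singular values of that gradient matrix. The following is an illustrative PyTorch sketch, not the authors' code; the model sizes, the random token sequence, and the delay τ are all assumptions made for the example.

```python
import torch

# Illustrative sizes, not the paper's configuration.
emb_dim, hid_dim, vocab = 16, 32, 100
embedding = torch.nn.Embedding(vocab, emb_dim)
lstm = torch.nn.LSTM(emb_dim, hid_dim, batch_first=True)

tokens = torch.randint(vocab, (1, 10))   # one random sequence of 10 tokens
embs = embedding(tokens)                 # (1, 10, emb_dim)
embs.retain_grad()                       # keep gradients on this non-leaf tensor

out, (h_n, c_n) = lstm(embs)             # out: (1, 10, hid_dim)

# 'State gradient' matrix: d h_t / d x_{t-tau}, one row per hidden-state
# component, one column per embedding dimension.
tau = 5
t = tokens.size(1) - 1
grad_matrix = torch.zeros(hid_dim, emb_dim)
for i in range(hid_dim):
    if embs.grad is not None:
        embs.grad.zero_()
    out[0, t, i].backward(retain_graph=True)
    grad_matrix[i] = embs.grad[0, t - tau]

# SVD of the gradient matrix: the rows of Vh are directions in embedding
# space; the singular values S indicate how strongly each direction is
# still remembered by the state after tau time steps.
U, S, Vh = torch.linalg.svd(grad_matrix)
print(S)
```

Repeating this for increasing τ and watching how the singular values decay gives a picture of how long different embedding-space directions persist in the state, without needing labels for the linguistic properties involved.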
1 Citation

Classification of Live/Lifeless Assets with Laser Beams in Different Humidity Environments
  • Nevzat Olgun, Ibrahim Türkoglu
  • Environmental Science, Computer Science
  • 2020 8th International Symposium on Digital Forensics and Security (ISDFS), 2020
