Seq2seq-Vis: A Visual Debugging Tool for Sequence-to-Sequence Models
@article{Strobelt2019Seq2seqVisAV,
  title   = {Seq2seq-Vis: A Visual Debugging Tool for Sequence-to-Sequence Models},
  author  = {Hendrik Strobelt and Sebastian Gehrmann and Michael Behrisch and Adam Perer and H. Pfister and Alexander M. Rush},
  journal = {IEEE Transactions on Visualization and Computer Graphics},
  year    = {2019},
  volume  = {25},
  pages   = {353--363}
}
Neural sequence-to-sequence models have proven to be accurate and robust for many sequence prediction tasks, and have become the standard approach for automatic translation of text. The models work with a five-stage blackbox pipeline that begins with encoding a source sequence to a vector space and then decoding out to a new target sequence. This process is now standard, but like many deep learning methods remains quite difficult to understand or debug. In this work, we present a visual…
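The encode-attend-decode pipeline the abstract refers to can be illustrated with a minimal sketch. This is not the paper's implementation, only a toy NumPy version under assumed dimensions: a simple RNN encoder maps source tokens to one state per position, dot-product attention forms a context vector for each decoder step, and the decoder predicts the next target token from its state plus context. All names (`encode`, `attend`, `decode_step`) and the start-symbol convention are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (assumed, not from the paper).
vocab, d = 10, 8
E = rng.normal(size=(vocab, d))                 # shared embedding table
W_enc = rng.normal(size=(d, d)) * 0.1           # encoder recurrence weights
W_dec = rng.normal(size=(d, d)) * 0.1           # decoder recurrence weights
W_out = rng.normal(size=(2 * d, vocab)) * 0.1   # output projection

def encode(src):
    """Embed source tokens and run a simple RNN encoder."""
    h = np.zeros(d)
    states = []
    for tok in src:
        h = np.tanh(E[tok] + W_enc @ h)
        states.append(h)
    return np.stack(states)                     # one state per source position

def attend(query, states):
    """Dot-product attention: softmax over encoder states."""
    scores = states @ query
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights, weights @ states            # attention dist., context vector

def decode_step(prev_tok, h, states):
    """Advance decoder state, attend over the source, predict next token."""
    h = np.tanh(E[prev_tok] + W_dec @ h)
    weights, ctx = attend(h, states)
    logits = np.concatenate([h, ctx]) @ W_out
    return h, weights, int(logits.argmax())

src = [1, 4, 2, 7]
states = encode(src)
h, tok = np.zeros(d), 0                         # token 0 as start symbol (assumption)
out = []
for _ in range(5):
    h, weights, tok = decode_step(tok, h, states)
    out.append(tok)
```

The per-step attention weights are exactly what a tool like Seq2seq-Vis visualizes: one distribution over source positions for every generated target token.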
Supplemental Code
GitHub repo (via Papers with Code): "Visualization for Sequential Neural Networks with Attention"
89 Citations
A Gray Box Interpretable Visual Debugging Approach for Deep Sequence Learning Model. 2019 IEEE Region 10 Symposium (TENSYMP), 2019.
ProtoSteer: Steering Deep Sequence Model with Prototypes. IEEE Transactions on Visualization and Computer Graphics, 2020.
AttViz: Online exploration of self-attention for transparent neural language modeling. ArXiv, 2020.
Ablate, Variate, and Contemplate: Visual Analytics for Discovering Neural Architectures. IEEE Transactions on Visualization and Computer Graphics, 2020.
Dimension Reduction Approach for Interpretability of Sequence to Sequence Recurrent Neural Networks. ArXiv, 2019.
Understanding and Improving Hidden Representations for Neural Machine Translation. NAACL-HLT, 2019.
exBERT: A Visual Analysis Tool to Explore Learned Representations in Transformer Models. ACL, 2020.
An Analysis of Encoder Representations in Transformer-Based Machine Translation. BlackboxNLP@EMNLP, 2018.