Attention Meets Perturbations: Robust and Interpretable Attention with Adversarial Training

@article{Kitada2020AttentionMP,
  title={Attention Meets Perturbations: Robust and Interpretable Attention with Adversarial Training},
  author={Shunsuke Kitada and H. Iyatomi},
  journal={ArXiv},
  year={2020},
  volume={abs/2009.12064}
}

In recent years, deep learning research has placed increasing emphasis on the interpretability and robustness of models. The attention mechanism is an important technique that contributes to both properties and is widely used, especially in the natural language processing (NLP) field. Adversarial training (AT) is a powerful regularization technique for enhancing the robustness of neural networks and has been successful in many applications. The application of AT to the attention mechanism is expected…
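
The abstract stops short of describing the mechanism concretely. As a rough illustration of the general idea — adversarial training applied to an attention mechanism rather than to input embeddings — the following is a minimal PyTorch sketch. The toy additive-attention classifier, the FGSM-style L2-normalized perturbation, the epsilon value, and the combined clean-plus-adversarial loss are all illustrative assumptions, not the authors' actual method.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionClassifier(nn.Module):
    """Toy classifier: embedding layer + additive attention + linear head."""
    def __init__(self, vocab_size, embed_dim, num_classes):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.score = nn.Linear(embed_dim, 1)   # one pre-softmax score per token
        self.head = nn.Linear(embed_dim, num_classes)

    def forward(self, tokens, attn_perturbation=None):
        h = self.embed(tokens)                   # (batch, seq, dim)
        scores = self.score(h).squeeze(-1)       # (batch, seq)
        if attn_perturbation is not None:
            # Inject an adversarial perturbation into the attention scores.
            scores = scores + attn_perturbation
        attn = F.softmax(scores, dim=-1)
        context = (attn.unsqueeze(-1) * h).sum(dim=1)  # attention-weighted sum
        return self.head(context), scores

def adversarial_attention_step(model, optimizer, tokens, labels, epsilon=1.0):
    """One training step with an FGSM-style perturbation on attention scores.

    epsilon and the clean+adversarial loss combination are illustrative
    hyperparameter choices, not values from the paper.
    """
    optimizer.zero_grad()

    # Clean forward pass; retain the graph so the clean loss can be reused.
    logits, scores = model(tokens)
    clean_loss = F.cross_entropy(logits, labels)
    grad, = torch.autograd.grad(clean_loss, scores, retain_graph=True)

    # Worst-case direction: epsilon times the L2-normalized gradient.
    r_adv = epsilon * grad / (grad.norm(dim=-1, keepdim=True) + 1e-12)

    # Adversarial forward pass with perturbed pre-softmax attention scores.
    adv_logits, _ = model(tokens, attn_perturbation=r_adv.detach())
    adv_loss = F.cross_entropy(adv_logits, labels)

    # Train on the sum of the clean and adversarial losses.
    loss = clean_loss + adv_loss
    loss.backward()
    optimizer.step()
    return loss.item()

# Usage with random toy data.
model = AttentionClassifier(vocab_size=1000, embed_dim=64, num_classes=2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
tokens = torch.randint(0, 1000, (8, 20))   # batch of 8 sequences, 20 tokens each
labels = torch.randint(0, 2, (8,))
print(adversarial_attention_step(model, optimizer, tokens, labels))
```

Note the design choice in this sketch: perturbing the pre-softmax scores (rather than the post-softmax weights) keeps the adversarial attention a valid probability distribution, since the softmax renormalizes after the perturbation is added.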
