Contextualized Word Representations for Self-Attention Network

@inproceedings{Essam2018ContextualizedWR,
  title={Contextualized Word Representations for Self-Attention Network},
  author={M. Essam and S. Eldawlatly and Hazem M. Abbas},
  booktitle={2018 13th International Conference on Computer Engineering and Systems (ICCES)},
  year={2018},
  pages={116--121}
}
Transfer learning is one approach that can be used to better train deep neural networks. [...] In this paper, we demonstrate that an RNN/CNN-free self-attention model used for sentiment analysis can be improved by 2.53% by using contextualized word representations learned in a language modeling task.
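The abstract's core idea is to feed token vectors produced by a pretrained language model into a self-attention network instead of static word embeddings. A minimal sketch of the self-attention step, using NumPy and random vectors as a stand-in for the LM-derived contextualized representations (the shapes, function names, and scoring scheme here are illustrative assumptions, not the paper's exact architecture):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    # X: (seq_len, d) matrix of contextualized token vectors
    # (in the paper's setting these would come from a pretrained LM).
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)        # pairwise token affinities
    weights = softmax(scores, axis=-1)   # each row is a distribution over tokens
    return weights @ X                   # context-mixed token representations

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 8))  # 5 tokens, 8-dim vectors (stand-in for LM output)
H = self_attention(X)
print(H.shape)  # (5, 8): one mixed representation per token
```

The mixed representations `H` would then be pooled and passed to a sentiment classifier; swapping the random `X` for LM-derived vectors is what the paper reports as yielding the 2.53% improvement.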
