Corpus ID: 6771196

Fine-grained Analysis of Sentence Embeddings Using Auxiliary Prediction Tasks

@article{Adi2017FinegrainedAO,
  title   = {Fine-grained Analysis of Sentence Embeddings Using Auxiliary Prediction Tasks},
  author  = {Yossi Adi and Einat Kermany and Yonatan Belinkov and Ofir Lavi and Yoav Goldberg},
  journal = {ArXiv},
  volume  = {abs/1608.04207},
  year    = {2017}
}
There is a lot of research interest in encoding variable-length sentences into fixed-length vectors in a way that preserves the sentence meanings. Two common methods are representations based on averaging word vectors, and representations based on the hidden states of recurrent neural networks such as LSTMs. The sentence vectors are used as features for subsequent machine learning tasks or for pre-training in the context of deep learning. However, not much is known about the properties…
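The two encoding strategies the abstract contrasts can be sketched in a few lines of numpy. This is a minimal illustration only: the toy vocabulary, the random word vectors, and the untrained, randomly initialized LSTM weights are all assumptions for demonstration, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary: map each word to a random d-dimensional vector.
# (A real system would use trained embeddings, e.g. word2vec.)
d = 8
sentence = "the cat sat on the mat".split()
vocab = {w: rng.standard_normal(d) for w in set(sentence)}
word_vecs = np.stack([vocab[w] for w in sentence])  # shape (len, d)

# Method 1: average the word vectors -> fixed-length sentence vector.
avg_embedding = word_vecs.mean(axis=0)  # shape (d,)

# Method 2: run an LSTM over the words and take the final hidden state.
# Weights here are random (untrained); only the shapes/mechanics matter.
def lstm_encode(xs, h_dim, rng):
    in_dim = xs.shape[1]
    Wx = rng.standard_normal((4 * h_dim, in_dim)) * 0.1
    Wh = rng.standard_normal((4 * h_dim, h_dim)) * 0.1
    b = np.zeros(4 * h_dim)
    h = np.zeros(h_dim)
    c = np.zeros(h_dim)
    sigm = lambda z: 1.0 / (1.0 + np.exp(-z))
    for x in xs:
        z = Wx @ x + Wh @ h + b
        i, f, o, g = np.split(z, 4)           # input/forget/output gates + candidate
        i, f, o, g = sigm(i), sigm(f), sigm(o), np.tanh(g)
        c = f * c + i * g                     # update cell state
        h = o * np.tanh(c)                    # update hidden state
    return h  # fixed-length sentence vector regardless of sentence length

lstm_embedding = lstm_encode(word_vecs, h_dim=8, rng=rng)
print(avg_embedding.shape, lstm_embedding.shape)  # (8,) (8,)
```

Both encoders map a variable-length sentence to a fixed-length vector; the paper's auxiliary prediction tasks probe what information (e.g. word order, word content) each kind of vector actually retains.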
258 Citations (selected)

  • Capturing Word Order in Averaging Based Sentence Embeddings
  • Probing Linguistic Features of Sentence-Level Representations in Neural Relation Extraction
  • On the Use of Word Embeddings Alone to Represent Natural Language Sequences
  • Probing Linguistic Features of Sentence-Level Representations in Relation Extraction
  • How LSTM Encodes Syntax: Exploring Context Vectors and Semi-Quantization on Natural Text
  • Learning Universal Representations from Word to Sentence
  • A Systematic Study of Inner-Attention-Based Sentence Representations in Multilingual Neural Machine Translation
  • An Analysis of Encoder Representations in Transformer-Based Machine Translation
