Revisiting Recurrent Networks for Paraphrastic Sentence Embeddings

@inproceedings{Wieting2017RevisitingRN,
  title={Revisiting Recurrent Networks for Paraphrastic Sentence Embeddings},
  author={J. Wieting and Kevin Gimpel},
  booktitle={ACL},
  year={2017}
}
We consider the problem of learning general-purpose, paraphrastic sentence embeddings, revisiting the setting of Wieting et al. (2016b). While they found LSTM recurrent networks to underperform word averaging, we present several developments that together produce the opposite conclusion. These include training on sentence pairs rather than phrase pairs, averaging states to represent sequences, and regularizing aggressively. These improve LSTMs in both transfer learning and supervised settings…
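One of the developments named in the abstract, averaging the LSTM's hidden states over time instead of taking only the final state, can be sketched in NumPy. This is an illustrative sketch, not the authors' implementation; the dimensions, random initialization, and function name are arbitrary choices for demonstration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_avg_embedding(inputs, W, U, b, hidden_dim):
    """Run a single-layer LSTM over `inputs` (seq_len x input_dim) and
    return the average of its hidden states, the state-averaging idea
    from the abstract, rather than only the final state."""
    h = np.zeros(hidden_dim)
    c = np.zeros(hidden_dim)
    states = []
    for x in inputs:
        z = W @ x + U @ h + b              # stacked gate pre-activations
        i, f, o, g = np.split(z, 4)        # input, forget, output, candidate
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)
        h = o * np.tanh(c)
        states.append(h)
    return np.mean(states, axis=0)         # average over time steps

# Toy usage with random weights and stand-in word vectors.
rng = np.random.default_rng(0)
input_dim, hidden_dim, seq_len = 8, 16, 5
W = rng.normal(0, 0.1, (4 * hidden_dim, input_dim))
U = rng.normal(0, 0.1, (4 * hidden_dim, hidden_dim))
b = np.zeros(4 * hidden_dim)
sentence = rng.normal(size=(seq_len, input_dim))
emb = lstm_avg_embedding(sentence, W, U, b, hidden_dim)
print(emb.shape)  # (16,)
```

Because the embedding is a mean over all time steps, every token's state contributes to the sentence vector, which is what brings the model closer in spirit to the word-averaging baseline it is compared against.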
    62 Citations

    • Learning Paraphrastic Sentence Embeddings from Back-Translated Bitext
    • Convolutional Neural Network for Universal Sentence Embeddings
    • Sequential Network Transfer: Adapting Sentence Embeddings to Human Activities and Beyond
    • Pushing the Limits of Paraphrastic Sentence Embeddings with Millions of Machine Translations
    • Advancing Seq2seq with Joint Paraphrase Learning
    • Direct Network Transfer: Transfer Learning of Sentence Embeddings for Semantic Similarity
    • Paraphrase Detection on Noisy Subtitles in Six Languages

    References

    Showing 1–10 of 50 references

    • Towards Universal Paraphrastic Sentence Embeddings
    • From Paraphrase Database to Compositional Paraphrase Model and Back
    • Skip-Thought Vectors
    • Deep Unordered Composition Rivals Syntactic Methods for Text Classification
    • Dynamic Pooling and Unfolding Recursive Autoencoders for Paraphrase Detection
    • A Convolutional Neural Network for Modelling Sentences
    • A Simple but Tough-to-Beat Baseline for Sentence Embeddings
    • Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks
    • Charagram: Embedding Words and Sentences via Character n-grams
    • Unsupervised Learning of Sentence Embeddings using Compositional n-Gram Features