Corpus ID: 5882977

Towards Universal Paraphrastic Sentence Embeddings

@article{Wieting2016TowardsUP,
  title={Towards Universal Paraphrastic Sentence Embeddings},
  author={J. Wieting and Mohit Bansal and Kevin Gimpel and Karen Livescu},
  journal={CoRR},
  year={2016},
  volume={abs/1511.08198}
}
  • J. Wieting, Mohit Bansal, Kevin Gimpel, Karen Livescu
  • Published 2016
  • Computer Science
  • CoRR
  • We consider the problem of learning general-purpose, paraphrastic sentence embeddings based on supervision from the Paraphrase Database (Ganitkevitch et al., 2013). We compare six compositional architectures, evaluating them on annotated textual similarity datasets drawn both from the same distribution as the training data and from a wide range of other domains. We find that the most complex architectures, such as long short-term memory (LSTM) recurrent neural networks, perform best on the in…
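The simplest of the compositional architectures the paper compares is plain word averaging: a sentence embedding is the mean of its word vectors, and sentence pairs are scored by cosine similarity. A minimal sketch of that composition, with made-up three-dimensional vectors standing in for the trained PARAGRAM vectors (the vocabulary and values here are hypothetical, not from the paper):

```python
import math

# Hypothetical toy word vectors; the paper instead trains word vectors
# on PPDB paraphrase pairs before composing them.
VECS = {
    "the": [0.1, 0.3, -0.2],
    "cat": [0.7, -0.1, 0.4],
    "dog": [0.6, 0.0, 0.5],
    "sat": [-0.2, 0.5, 0.1],
}

def embed(sentence):
    """Word-averaging composition: the sentence embedding is the
    component-wise mean of its known word vectors."""
    rows = [VECS[w] for w in sentence.lower().split() if w in VECS]
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def cosine(a, b):
    """Cosine similarity used to score a sentence pair."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

sim = cosine(embed("the cat sat"), embed("the dog sat"))
```

Despite having no parameters beyond the word vectors themselves, this averaging model is what the paper finds to transfer best across domains, while heavier architectures such as LSTMs win mainly in-domain.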
    418 Citations
    • Revisiting Recurrent Networks for Paraphrastic Sentence Embeddings (62 citations)
    • Pushing the Limits of Paraphrastic Sentence Embeddings with Millions of Machine Translations (136 citations)
    • Learning Paraphrastic Sentence Embeddings from Back-Translated Bitext (51 citations)
    • Towards Generalizable Sentence Embeddings (17 citations)
    • The Importance of Subword Embeddings in Sentence Pair Modeling (2 citations)
    • DCU-SEManiacs at SemEval-2016 Task 1: Synthetic Paragram Embeddings for Semantic Textual Similarity (1 citation)
    • Character-Based Neural Networks for Sentence Pair Modeling (5 citations)
    • A Simple but Tough-to-Beat Baseline for Sentence Embeddings (highly influenced)
    • Convolutional Neural Network for Universal Sentence Embeddings (6 citations, highly influenced)

    References

    Showing 1–10 of 98 references
    • From Paraphrase Database to Compositional Paraphrase Model and Back (227 citations)
    • Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank (4,163 citations, highly influential)
    • Skip-Thought Vectors (1,627 citations, highly influential)
    • Deep Unordered Composition Rivals Syntactic Methods for Text Classification (542 citations, highly influential)
    • Sequence to Sequence Learning with Neural Networks (11,664 citations)
    • A Convolutional Neural Network for Modelling Sentences (2,548 citations, highly influential)
    • A Hierarchical Neural Autoencoder for Paragraphs and Documents (482 citations)
    • Dynamic Pooling and Unfolding Recursive Autoencoders for Paraphrase Detection (820 citations, highly influential)
    • Tailoring Continuous Word Representations for Dependency Parsing (275 citations)
    • Word Representations: A Simple and General Method for Semi-Supervised Learning (2,008 citations)