Learning Composition Models for Phrase Embeddings

@article{Yu2015LearningCM,
  title={Learning Composition Models for Phrase Embeddings},
  author={Mo Yu and Mark Dredze},
  journal={TACL},
  year={2015},
  volume={3},
  pages={227-242}
}
Lexical embeddings can serve as useful representations for words for a variety of NLP tasks, but learning embeddings for phrases can be challenging. While separate embeddings are learned for each word, this is infeasible for every phrase. We construct phrase embeddings by learning how to compose word embeddings using features that capture phrase structure and context. We propose efficient unsupervised and task-specific learning objectives that scale our model to large datasets. We demonstrate…
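The composition idea in the abstract can be sketched in a few lines: each word vector is scaled per-dimension by weights computed from that word's features (phrase structure and context), and the scaled vectors are summed into a phrase embedding. The sketch below is a minimal illustration under assumed shapes, with hypothetical features and random placeholder parameters; it is not the authors' exact model, feature set, or training objective.

```python
import numpy as np

# Minimal sketch of feature-driven composition of word embeddings into a
# phrase embedding. Dimensions, features, and parameters are illustrative
# assumptions, not the paper's actual configuration.

rng = np.random.default_rng(0)
DIM = 50          # embedding dimensionality (assumed)
NUM_FEATS = 4     # e.g. POS tag, head-word flag, position, context cue (hypothetical)

# Pretrained word embeddings for a two-word phrase, e.g. "heavy rain".
word_vecs = [rng.normal(size=DIM), rng.normal(size=DIM)]

# Binary feature vectors extracted for each word (hypothetical values).
word_feats = [np.array([1, 0, 1, 0], dtype=float),
              np.array([0, 1, 0, 1], dtype=float)]

# Parameters that would be learned with unsupervised or task-specific
# objectives; here they are random placeholders.
feat_weights = rng.normal(scale=0.1, size=(NUM_FEATS, DIM))
bias = np.ones(DIM)

def compose_phrase(vecs, feats):
    """Weighted sum of word vectors, where each word's per-dimension
    weight vector is a linear function of its features."""
    phrase = np.zeros(DIM)
    for v, f in zip(vecs, feats):
        lam = f @ feat_weights + bias   # per-dimension weights for this word
        phrase += lam * v               # element-wise scaling, then sum
    return phrase

phrase_vec = compose_phrase(word_vecs, word_feats)
print(phrase_vec.shape)  # (50,)
```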

11 Figures & Tables

Statistics

Citations per Year (2013–2018)

61 Citations

Semantic Scholar estimates that this publication has 61 citations based on the available data.
