context2vec: Learning Generic Context Embedding with Bidirectional LSTM

@inproceedings{Melamud2016context2vecLG,
  title={context2vec: Learning Generic Context Embedding with Bidirectional LSTM},
  author={Oren Melamud and Jacob Goldberger and Ido Dagan},
  booktitle={CoNLL},
  year={2016}
}
Abstract: Context representations are central to various NLP tasks, such as word sense disambiguation, named entity recognition, coreference resolution, and many more. In this work we present a neural model for efficiently learning a generic context embedding function from large corpora, using bidirectional LSTM. With a very simple application of our context representations, we manage to surpass or nearly reach state-of-the-art results on sentence completion, lexical substitution and word sense…
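The architecture the abstract describes — a forward LSTM over the left context, a backward LSTM over the right context, their final states concatenated and fed through an MLP to produce a context vector — can be sketched minimally in NumPy. Everything below (dimensions, gate ordering, the ReLU hidden layer, the toy vocabulary) is illustrative and the weights are random and untrained, so this shows the data flow of a context2vec-style encoder, not the trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step; stacked gate order: input, forget, output, cell."""
    H = h.shape[0]
    z = W @ x + U @ h + b
    i, f, o = sigmoid(z[:H]), sigmoid(z[H:2*H]), sigmoid(z[2*H:3*H])
    g = np.tanh(z[3*H:])
    c_new = f * c + i * g
    return o * np.tanh(c_new), c_new

def make_lstm(in_dim, hid, rng):
    """Random LSTM parameters (W, U, b); shapes are illustrative."""
    return (rng.normal(0, 0.1, (4 * hid, in_dim)),
            rng.normal(0, 0.1, (4 * hid, hid)),
            np.zeros(4 * hid))

def run_lstm(seq, params, hid):
    """Run an LSTM over a list of input vectors, return the final hidden state."""
    h, c = np.zeros(hid), np.zeros(hid)
    for x in seq:
        h, c = lstm_step(x, h, c, *params)
    return h

def context_embedding(sent_ids, t, emb, fwd, bwd, W1, b1, W2, b2, hid):
    """context2vec-style encoding of the context around position t:
    forward LSTM reads the left context, backward LSTM reads the right
    context (reversed); the concatenated states pass through a 2-layer MLP."""
    left = [emb[w] for w in sent_ids[:t]]                # words before the target
    right = [emb[w] for w in reversed(sent_ids[t+1:])]   # words after, reversed
    bi = np.concatenate([run_lstm(left, fwd, hid), run_lstm(right, bwd, hid)])
    hidden = np.maximum(0.0, W1 @ bi + b1)               # ReLU hidden layer
    return W2 @ hidden + b2                              # context vector

# Toy setup: tiny vocabulary and random embeddings (hypothetical sizes).
hid, d = 8, 6
vocab = {"the": 0, "cat": 1, "sat": 2, "on": 3, "mat": 4}
emb = rng.normal(0, 0.1, (len(vocab), d))
fwd, bwd = make_lstm(d, hid, rng), make_lstm(d, hid, rng)
W1, b1 = rng.normal(0, 0.1, (2 * hid, 2 * hid)), np.zeros(2 * hid)
W2, b2 = rng.normal(0, 0.1, (d, 2 * hid)), np.zeros(d)

sent = [vocab[w] for w in ["the", "cat", "sat", "on", "the", "mat"]]
c_vec = context_embedding(sent, 2, emb, fwd, bwd, W1, b1, W2, b2, hid)
# Candidate target words are scored by dot product with the context vector,
# as in the paper's word2vec-style objective (weights here are untrained).
scores = emb @ c_vec
```

In the paper, the context and target-word embeddings are trained jointly with a negative-sampling objective so that a word's embedding scores highly against the contexts it appears in; tasks like lexical substitution then reduce to ranking candidates by this dot product.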
