context2vec: Learning Generic Context Embedding with Bidirectional LSTM

@inproceedings{Melamud2016context2vecLG,
  title={context2vec: Learning Generic Context Embedding with Bidirectional LSTM},
  author={Oren Melamud and Jacob Goldberger and Ido Dagan},
  booktitle={CoNLL},
  year={2016}
}
Context representations are central to various NLP tasks, such as word sense disambiguation, named entity recognition, coreference resolution, and many more. In this work we present a neural model for efficiently learning a generic context embedding function from large corpora, using bidirectional LSTM. With a very simple application of our context representations, we manage to surpass or nearly reach state-of-the-art results on sentence completion, lexical substitution and word sense…
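The core idea can be sketched in a few lines of NumPy. This is my own illustration of the approach the abstract describes, not the authors' released code: one LSTM reads the words to the left of a target position, a second LSTM reads the words to its right in reverse, and the two final hidden states are concatenated and passed through a small MLP to produce a single context embedding. All dimensions and the random weight initialisation are illustrative assumptions.

```python
# Minimal sketch of the context2vec-style context encoder (illustrative only):
# a forward LSTM over the left context, a backward LSTM over the right context,
# and an MLP on the concatenated final states. Weights are random, untrained.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """A plain LSTM cell with randomly initialised weights."""
    def __init__(self, input_dim, hidden_dim):
        self.hidden_dim = hidden_dim
        # One stacked weight matrix for the input, forget, output and cell gates.
        self.W = rng.normal(0, 0.1, (4 * hidden_dim, input_dim + hidden_dim))
        self.b = np.zeros(4 * hidden_dim)

    def step(self, x, h, c):
        z = self.W @ np.concatenate([x, h]) + self.b
        i, f, o, g = np.split(z, 4)
        c_new = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
        h_new = sigmoid(o) * np.tanh(c_new)
        return h_new, c_new

def run_lstm(cell, inputs):
    """Run the cell over a sequence of vectors; return the final hidden state."""
    h = np.zeros(cell.hidden_dim)
    c = np.zeros(cell.hidden_dim)
    for x in inputs:
        h, c = cell.step(x, h, c)
    return h

def context_embedding(left_ctx, right_ctx, fwd, bwd, W1, W2):
    """Embed the sentential context of a target word (the word itself excluded)."""
    h_left = run_lstm(fwd, left_ctx)          # left-to-right over left context
    h_right = run_lstm(bwd, right_ctx[::-1])  # right-to-left over right context
    bi = np.concatenate([h_left, h_right])
    return W2 @ np.maximum(0.0, W1 @ bi)      # small MLP with ReLU

# Toy usage: a 5-word sentence, target at position 2, 50-dim word vectors.
emb_dim, hid_dim, out_dim = 50, 32, 50
sentence = [rng.normal(size=emb_dim) for _ in range(5)]
fwd = LSTMCell(emb_dim, hid_dim)
bwd = LSTMCell(emb_dim, hid_dim)
W1 = rng.normal(0, 0.1, (hid_dim, 2 * hid_dim))
W2 = rng.normal(0, 0.1, (out_dim, hid_dim))
ctx = context_embedding(sentence[:2], sentence[3:], fwd, bwd, W1, W2)
print(ctx.shape)  # (50,)
```

In the paper this context embedding is trained jointly with target-word embeddings so that a word and the contexts it appears in are close in the same vector space; the sketch above only shows the forward pass of the context side.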

Citations

Publications citing this paper (showing 1-10 of 124 citations).

  • A Deep Dive into Word Sense Disambiguation with LSTM (highly influenced; cites methods, background & results)

  • Knowledge-based Supervision for Domain-adaptive Semantic Role Labeling (highly influenced; cites methods & background)

  • Deep contextualized word representations (highly influenced; cites methods & background)

  • Word RNN as a Baseline for Sentence Completion (2018 IEEE 5th International Congress on Information Science and Technology (CiSt), 2018; highly influenced; cites background & results)

CITATION STATISTICS

  • 19 Highly Influenced Citations

  • Averaged 40 Citations per year from 2017 through 2019
