Word Sense Induction with Neural biLM and Symmetric Patterns

@inproceedings{Amrami2018WordSI,
  title={Word Sense Induction with Neural biLM and Symmetric Patterns},
  author={Asaf Amrami and Yoav Goldberg},
  booktitle={EMNLP},
  year={2018}
}
An established method for Word Sense Induction (WSI) uses a language model to predict probable substitutes for target words, and induces senses by clustering these resulting substitute vectors. We replace the ngram-based language model (LM) with a recurrent one. Beyond being more accurate, the use of the recurrent LM allows us to effectively query it in a creative way, using what we call dynamic symmetric patterns. The combination of the RNN-LM and the dynamic symmetric patterns results in…
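
The abstract describes a substitution-based WSI pipeline: query a bidirectional LM for likely in-context substitutes of each occurrence of the target word (using a dynamic symmetric pattern such as "target and _"), then cluster the resulting substitute vectors into induced senses. The sketch below illustrates that idea only and is not the authors' implementation: it stands in a HuggingFace masked LM (bert-base-uncased) for the paper's ELMo biLM and scikit-learn agglomerative clustering for the paper's clustering step; function names and parameters are illustrative assumptions.

# Minimal sketch of substitution-based WSI with a dynamic symmetric pattern.
# Assumptions: a masked LM as a stand-in for the biLM; sklearn clustering.
import numpy as np
import torch
from sklearn.cluster import AgglomerativeClustering
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def substitute_vector(sentence, target, top_k=20):
    """Return a bag-of-substitutes vector for one occurrence of `target`.

    Dynamic symmetric pattern: instead of masking the target directly, the LM
    is queried with "<target> and [MASK]", so the conjunction biases the
    predicted substitutes toward words that share the target's sense.
    """
    patterned = sentence.replace(target, f"{target} and {tokenizer.mask_token}", 1)
    inputs = tokenizer(patterned, return_tensors="pt")
    mask_pos = (inputs.input_ids[0] == tokenizer.mask_token_id).nonzero()[0].item()
    with torch.no_grad():
        logits = model(**inputs).logits[0, mask_pos]
    probs = torch.softmax(logits, dim=-1)
    top = torch.topk(probs, top_k)
    vec = np.zeros(probs.shape[-1])          # sparse vector over the LM vocabulary
    vec[top.indices.numpy()] = top.values.numpy()
    return vec

def induce_senses(sentences, target, n_senses=3):
    """Cluster the substitute vectors of all occurrences into induced senses."""
    vectors = np.stack([substitute_vector(s, target) for s in sentences])
    return AgglomerativeClustering(n_clusters=n_senses).fit_predict(vectors)

# Usage: occurrences of the ambiguous word "bank" grouped by induced sense.
sents = ["He sat on the bank of the river .",
         "She deposited money at the bank .",
         "The bank raised its interest rates ."]
print(induce_senses(sents, "bank"))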