Language Modelling Makes Sense: Propagating Representations through WordNet for Full-Coverage Word Sense Disambiguation

@inproceedings{Loureiro2019LanguageMM,
  title={Language Modelling Makes Sense: Propagating Representations through WordNet for Full-Coverage Word Sense Disambiguation},
  author={Daniel Loureiro and Al{\'i}pio M{\'a}rio Jorge},
  booktitle={ACL},
  year={2019}
}
Contextual embeddings represent a new generation of semantic representations learned from Neural Language Modelling (NLM) that addresses the issue of meaning conflation hampering traditional word embeddings. In this work, we show that contextual embeddings can be used to achieve unprecedented gains in Word Sense Disambiguation (WSD) tasks. Our approach focuses on creating sense-level embeddings with full-coverage of WordNet, and without recourse to explicit knowledge of sense distributions or…
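
The recipe the title and abstract describe can be summarised as: average contextual embeddings over sense-annotated examples, propagate those averages through WordNet's structure (synset, then hypernym, then lexname) so that every sense receives a vector, and disambiguate with a simple nearest-neighbour match. Below is a minimal Python sketch of that idea, not the authors' released implementation: contextual_embed is a hypothetical stand-in for an NLM encoder (the paper uses BERT), and annotated is assumed to map WordNet sense keys to lists of contextual vectors collected from a sense-annotated corpus such as SemCor.

import numpy as np
from nltk.corpus import wordnet as wn

def build_sense_vectors(annotated):
    """Average annotated contexts per sense key, then propagate through
    WordNet (synset -> hypernym -> lexname) until every sense is covered."""
    sense_vec = {key: np.mean(vecs, axis=0) for key, vecs in annotated.items()}

    # Synset-level vectors: average of the synset's known sense vectors.
    synset_vec = {}
    for synset in wn.all_synsets():
        vecs = [sense_vec[l.key()] for l in synset.lemmas() if l.key() in sense_vec]
        if vecs:
            synset_vec[synset.name()] = np.mean(vecs, axis=0)

    # Lexname (supersense) vectors: average over covered synsets sharing one.
    by_lexname = {}
    for name, vec in synset_vec.items():
        by_lexname.setdefault(wn.synset(name).lexname(), []).append(vec)
    lexname_vec = {k: np.mean(v, axis=0) for k, v in by_lexname.items()}

    # Fill gaps: fall back to synset, then a covered hypernym, then lexname.
    for synset in wn.all_synsets():
        for lemma in synset.lemmas():
            if lemma.key() in sense_vec:
                continue
            if synset.name() in synset_vec:
                sense_vec[lemma.key()] = synset_vec[synset.name()]
            else:
                hyper = next((h for h in synset.hypernyms()
                              if h.name() in synset_vec), None)
                if hyper is not None:
                    sense_vec[lemma.key()] = synset_vec[hyper.name()]
                elif synset.lexname() in lexname_vec:
                    sense_vec[lemma.key()] = lexname_vec[synset.lexname()]
    return sense_vec

def disambiguate(tokens, i, lemma, pos, sense_vec, contextual_embed):
    """1-NN WSD: return the candidate sense key whose vector is closest
    (by cosine similarity) to the contextual embedding of tokens[i]."""
    query = contextual_embed(tokens, i)  # hypothetical NLM encoder call
    candidates = [l.key()
                  for s in wn.synsets(lemma, pos) for l in s.lemmas()
                  if l.name().lower() == lemma.lower() and l.key() in sense_vec]
    cos = lambda a, b: float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)
    return max(candidates, key=lambda k: cos(query, sense_vec[k]), default=None)

Once the propagated sense_vec dictionary is built, the nearest-neighbour step needs no task-specific training, which is the point the abstract emphasises.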