Corpus ID: 12909464

Breaking Sticks and Ambiguities with Adaptive Skip-gram

@article{Bartunov2016BreakingSA,
  title={Breaking Sticks and Ambiguities with Adaptive Skip-gram},
  author={Sergey Bartunov and D. Kondrashkin and A. Osokin and D. Vetrov},
  journal={ArXiv},
  year={2016},
  volume={abs/1502.07257}
}
The recently proposed Skip-gram model is a powerful method for learning high-dimensional word representations that capture rich semantic relationships between words. However, Skip-gram, like most prior work on learning word representations, does not account for word ambiguity and maintains only a single representation per word. Although a number of Skip-gram modifications have been proposed to overcome this limitation and learn multi-prototype word representations, they either require a known…
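The "sticks" in the title refer to the stick-breaking construction of the Dirichlet process, which Adaptive Skip-gram uses as a prior over word senses so that the number of prototypes per word can grow with the data. As a minimal, hypothetical sketch (the parameter names, truncation level, and concentration value are illustrative, not taken from the paper), truncated stick-breaking weights can be sampled like this:

```python
import random

def stick_breaking_weights(alpha, n_sticks, seed=0):
    """Truncated stick-breaking construction: repeatedly snap off a
    Beta(1, alpha)-distributed fraction of the remaining stick, yielding
    mixture weights that are positive and sum to less than 1."""
    rng = random.Random(seed)
    weights = []
    remaining = 1.0  # length of the unbroken part of the stick
    for _ in range(n_sticks):
        b = rng.betavariate(1.0, alpha)  # fraction of what remains
        weights.append(remaining * b)
        remaining *= (1.0 - b)
    return weights

# With a small concentration alpha, most mass lands on the first few
# components, i.e. most words get few active senses.
w = stick_breaking_weights(alpha=0.1, n_sticks=5)
```

In a model like AdaGram, such weights act as prior sense probabilities for a word; components with negligible weight are effectively pruned, so the number of active senses adapts to the corpus rather than being fixed in advance.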
Citations

Disambiguated skip-gram model
MUSE: Modularizing Unsupervised Sense Embeddings
Adaptive Probabilistic Word Embedding
Combining Neural Language Models for Word Sense Induction
Context-Dependent Sense Embedding
A Simple Approach to Learn Polysemous Word Embeddings
Integrating Weakly Supervised Word Sense Disambiguation into Neural Machine Translation
Word Sense Disambiguation for 158 Languages using Word Embeddings Only
Infinite Dimensional Word Embeddings

References

Showing 1-10 of 40 references
Efficient Non-parametric Estimation of Multiple Embeddings per Word in Vector Space
A Unified Model for Word Sense Representation and Disambiguation
A Scalable Hierarchical Distributed Language Model
Efficient Estimation of Word Representations in Vector Space