Corpus ID: 16447573

Distributed Representations of Words and Phrases and their Compositionality

@inproceedings{Mikolov2013DistributedRO,
  title={Distributed Representations of Words and Phrases and their Compositionality},
  author={Tomas Mikolov and Ilya Sutskever and Kai Chen and Greg S. Corrado and Jeffrey Dean},
  booktitle={NIPS},
  year={2013}
}
The recently introduced continuous Skip-gram model is an efficient method for learning high-quality distributed vector representations that capture a large number of precise syntactic and semantic word relationships. In this paper we present several extensions that improve both the quality of the vectors and the training speed. By subsampling of the frequent words we obtain significant speedup and also learn more regular word representations. We also describe a simple alternative to the hierarchical softmax called negative sampling. An inherent limitation of word representations is their indifference to word order and their inability to represent idiomatic phrases. For example, the meanings of "Canada" and "Air" cannot be easily combined to obtain "Air Canada". Motivated by this example, we present a simple method for finding phrases in text, and show that learning good vector representations for millions of phrases is possible.
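
The abstract names two concrete heuristics: subsampling of frequent words and a data-driven phrase-finding step. As a minimal sketch, the Python below implements the two formulas as stated in the paper (an occurrence of word w_i is discarded with probability 1 - sqrt(t / f(w_i)); a bigram is scored as (count(ab) - delta) / (count(a) * count(b))); the function names, default values, and toy counts are illustrative assumptions, not the authors' reference implementation.

```python
import math

def subsample_keep_prob(word_freq: float, t: float = 1e-5) -> float:
    """Probability of keeping one occurrence of a word while training.

    The paper discards each occurrence of word w_i with probability
    P(w_i) = 1 - sqrt(t / f(w_i)), where f(w_i) is the word's relative
    corpus frequency and t is a threshold (around 1e-5 in the paper).
    """
    return min(1.0, math.sqrt(t / word_freq))

def phrase_score(bigram_count: int, count_a: int, count_b: int,
                 delta: float = 5.0) -> float:
    """Score for promoting the bigram "a b" to a single phrase token.

    score(a, b) = (count(ab) - delta) / (count(a) * count(b)), where
    delta is a discounting coefficient that prevents phrases made of
    very infrequent words; bigrams above a chosen threshold are merged.
    The default delta here is an assumption, not the paper's setting.
    """
    return (bigram_count - delta) / (count_a * count_b)

if __name__ == "__main__":
    # A very frequent word (f ~ 0.05, e.g. "the") is kept ~1.4% of the
    # time, while a word rarer than the threshold t is always kept.
    print(subsample_keep_prob(0.05))     # ~0.014
    print(subsample_keep_prob(1e-6))     # 1.0
    # Hypothetical counts: the bigram "new york" seen 500 times.
    print(phrase_score(500, 2000, 800))  # ~3.1e-4
```

In the paper, phrase scoring is run in 2-4 passes over the training data with a decreasing threshold, so that longer multi-word phrases can form; subsampling is applied independently to each word occurrence as the corpus streams by.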
"The Sum of Its Parts": Joint Learning of Word and Phrase Representations with Autoencoders
Rehabilitation of Count-Based Models for Word Vector Representations
Towards Learning Word Representation
Enriching Word Vectors with Subword Information
Glove: Global Vectors for Word Representation
Improved Word Embeddings with Implicit Structure Information
Supervised and unsupervised methods for learning representations of linguistic units
Distributed Representations of Sentences and Documents
A New Method for the Construction of Evolving Embedded Representations of Words
...
1
2
3
4
5
...
