Corpus ID: 9266291

How exactly does word2vec work?

@inproceedings{Meyer2016HowED,
  title={How exactly does word2vec work?},
  author={David N. Meyer},
  year={2016}
}
Perhaps the most amazing property of these word embeddings is that these vector encodings somehow capture the semantic meanings of the words. The question one might ask is how or why? The answer is that the vectors adhere surprisingly well to our intuition. For instance, words that we know to be synonyms tend to have similar vectors in terms of cosine similarity, and antonyms tend to have dissimilar vectors. Even more surprisingly, word vectors tend to obey the laws of…
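
A minimal sketch of the cosine-similarity claim above, not taken from the paper: the embedding values and the cosine_similarity helper are made-up assumptions for illustration (real word2vec vectors are learned from a corpus and typically have 100-300 dimensions).

# Illustrative sketch only (not from the paper): cosine similarity
# between word vectors, showing the synonym/antonym intuition.
import numpy as np

def cosine_similarity(u, v):
    # cos(theta) = (u . v) / (|u| * |v|)
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Tiny made-up 3-dimensional "embeddings" for demonstration.
vectors = {
    "happy": np.array([0.9, 0.1, 0.3]),
    "glad":  np.array([0.8, 0.2, 0.4]),    # near-synonym of "happy"
    "sad":   np.array([-0.8, 0.1, -0.3]),  # antonym of "happy"
}

print(cosine_similarity(vectors["happy"], vectors["glad"]))  # close to 1
print(cosine_similarity(vectors["happy"], vectors["sad"]))   # close to -1

With vectors like these, synonyms point in nearly the same direction (cosine near 1) while antonyms point in roughly opposite directions (cosine near -1), which is the intuition the abstract describes.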
    3 Citations


    Multi-Lingual Information Retrieval Using Deep Learning (2 citations)
    Sentiment Analysis of Product Reviews using Deep Learning (3 citations)
