Word2vec

Word2vec is a group of related models that are used to produce word embeddings. These models are shallow, two-layer neural networks that are trained…
Wikipedia
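
The summary above describes the basic recipe: a shallow network is trained over a corpus so that each word ends up with a dense vector. As a minimal, hedged sketch of that workflow, assuming the gensim library (version 4.x API; the toy corpus and hyperparameters below are illustrative and not taken from any of the papers listed here):

    # Minimal word2vec training sketch using gensim (assumed installed: pip install gensim).
    # Illustrative only; the corpus and hyperparameters are placeholders.
    from gensim.models import Word2Vec

    # A toy corpus: each "sentence" is a list of tokens.
    corpus = [
        ["the", "king", "rules", "the", "kingdom"],
        ["the", "queen", "rules", "the", "kingdom"],
        ["the", "man", "walks", "in", "the", "city"],
        ["the", "woman", "walks", "in", "the", "city"],
    ]

    model = Word2Vec(
        sentences=corpus,
        vector_size=50,   # dimensionality of the embeddings (gensim >= 4.0 parameter name)
        window=2,         # context window on each side of the target word
        min_count=1,      # keep every token in this tiny corpus
        sg=1,             # 1 = skip-gram, 0 = CBOW
        epochs=200,       # many passes, since the corpus is tiny
        seed=1,
    )

    vector = model.wv["king"]               # the learned 50-dimensional vector
    print(model.wv.most_similar("king"))    # nearest neighbours by cosine similarity

Setting sg=1 selects the skip-gram architecture; sg=0 selects CBOW, the other standard word2vec variant.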

[Chart: topic mentions per year, 2004–2018]

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2017
We present a semantic vector space model for capturing complex polyphonic musical context. A word2vec model based on a skip-gram…

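The skip-gram model mentioned in this entry slides a window over a token sequence (here, a sequence of musical "words") and pairs each target token with its neighbours. A rough illustration of that pairing step in plain Python, offered as an assumption rather than the paper's actual pipeline:

    # Hypothetical helper: generate skip-gram (target, context) pairs from a token sequence.
    # The sequence and window size are made up for illustration.
    def skipgram_pairs(tokens, window=2):
        pairs = []
        for i, target in enumerate(tokens):
            lo = max(0, i - window)
            hi = min(len(tokens), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    pairs.append((target, tokens[j]))
        return pairs

    print(skipgram_pairs(["C", "E", "G", "C"], window=1))
    # [('C', 'E'), ('E', 'C'), ('E', 'G'), ('G', 'E'), ('G', 'C'), ('C', 'G')]
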
2016
Word2Vec is a widely used algorithm for extracting low-dimensional vector representations of words. It generated considerable…

2016
Perhaps the most amazing property of these word embeddings is that somehow these vector encodings effectively capture the…

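The property this snippet alludes to is usually demonstrated with vector arithmetic: relations such as vec(king) - vec(man) + vec(woman) landing close to vec(queen). A hedged example of that query, assuming a trained gensim model like the one sketched earlier whose vocabulary contains these words:

    # Analogy query on a trained gensim word2vec model (illustrative; results depend on the corpus).
    # 'model' is assumed to be the Word2Vec instance trained in the earlier sketch.
    result = model.wv.most_similar(positive=["king", "woman"], negative=["man"], topn=3)
    print(result)  # on a large corpus this typically ranks "queen" near the top
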
Highly Cited
2015
We present two simple modifications to the models in the popular Word2Vec tool, in order to generate embeddings more suited to…

2015
Big data refers to very large data sets that are used in many fields. Processing such a huge data set is time-consuming work, not only due…

2015
In this paper we explore how word vectors built using word2vec can be used to improve the performance of a classifier during…

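One common way to feed word2vec vectors to a classifier, offered here only as a plausible baseline rather than the authors' actual method, is to average the vectors of a document's words and use the result as features. A sketch assuming numpy, scikit-learn, and the hypothetical gensim model from the first example:

    # Averaged word vectors as document features (a common baseline; not necessarily the paper's method).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def doc_vector(tokens, wv):
        # Average the vectors of in-vocabulary tokens; zero vector if none are known.
        vecs = [wv[t] for t in tokens if t in wv]
        return np.mean(vecs, axis=0) if vecs else np.zeros(wv.vector_size)

    docs = [["king", "rules", "kingdom"], ["woman", "walks", "city"]]  # toy documents
    labels = [0, 1]                                                    # toy labels
    X = np.stack([doc_vector(d, model.wv) for d in docs])
    clf = LogisticRegression().fit(X, labels)
    print(clf.predict(X))
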
Review
2014
The word2vec model and application by Mikolov et al. have attracted a great amount of attention in the past two years. The vector…

Highly Cited
2014
The word2vec software of Tomas Mikolov and colleagues has gained a lot of traction lately, and provides state-of-the-art word…

2014
We extend the word2vec framework to capture meaning across languages. The input consists of a source text and a word-aligned…

2014
The Global Vectors for word representation (GloVe), introduced by Jeffrey Pennington et al. [3], is reported to be an efficient…

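Whether the vectors come from word2vec or GloVe, they are usually compared with cosine similarity. A small self-contained illustration with numpy and made-up toy vectors (not real embeddings):

    # Cosine similarity between two embedding vectors (toy values, not real GloVe/word2vec output).
    import numpy as np

    def cosine(u, v):
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

    king = np.array([0.5, 0.8, -0.1])
    queen = np.array([0.45, 0.75, -0.05])
    print(cosine(king, queen))  # close to 1.0 for similar words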