Word embedding

Known as: Word vector space, Thought vectors, Word vectors 
Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words…
Wikipedia

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited
2018
We introduce a new type of deep contextualized word representation that models both (1) complex characteristics of word use (e.g. …
Highly Cited
2016
The authors analyze three critical components in training word embeddings: model, corpus, and training parameters. They…
Highly Cited
2015
Recent trends suggest that neural-network-inspired word embedding models outperform traditional count-based distributional models…
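The count-based side of this comparison can be made concrete with a toy sketch: build a plain co-occurrence matrix from a tiny corpus and use its rows as word vectors. The corpus, window size, and all names below are illustrative assumptions, not the evaluated setup.

```python
import numpy as np

# Toy three-sentence corpus; a real count-based model would use a large
# text collection. Everything here is an illustrative assumption.
corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "a cat and a dog played".split(),
]

vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Symmetric co-occurrence counts within a +/-2 word window.
window = 2
counts = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i, w in enumerate(sent):
        lo, hi = max(0, i - window), min(len(sent), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                counts[idx[w], idx[sent[j]]] += 1

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# Rows of the count matrix serve as (sparse) count-based word vectors;
# "cat" and "dog" share contexts, so they end up closer than "cat"/"mat".
cat, dog, mat = (counts[idx[w]] for w in ("cat", "dog", "mat"))
```

In the comparison literature, such count vectors are usually reweighted (e.g. with PMI) before being measured against neural embeddings; the raw counts above are only the starting point.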
Highly Cited
2015
Word embeddings have been found to be highly effective for translating words from one language to another via a simple linear transform…
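The linear-transform idea can be sketched as an ordinary least-squares problem: given source-language vectors X and target-language vectors Z for a seed dictionary, fit W minimising ||XW - Z||^2. The data below is synthetic (a random hidden map that the fit then recovers), not a real bilingual lexicon.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for a seed dictionary: 6 source-language vectors X
# and their target-language counterparts Z, related by a hidden linear map.
X = rng.normal(size=(6, 4))
W_true = rng.normal(size=(4, 4))
Z = X @ W_true

# Learn the translation matrix by least squares: min_W ||X W - Z||^2.
W, *_ = np.linalg.lstsq(X, Z, rcond=None)

# Translate the first source vector and look up its nearest target vector.
pred = X[0] @ W
nearest = int(np.argmin(np.linalg.norm(Z - pred, axis=1)))
```

With real embeddings the map is only approximate, so translation is scored by whether the correct target word appears among the nearest neighbours of the mapped vector.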
Highly Cited
2014
While continuous word embeddings are gaining popularity, current models are based solely on linear contexts. In this work, we…
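One way to move beyond linear contexts is to take each word's contexts from a dependency parse rather than a sliding window. The sketch below contrasts the two; the parse triples are hard-coded assumptions, where a real pipeline would obtain them from a parser.

```python
# Hypothetical dependency parse of "australian scientist discovers star",
# written as (head, relation, dependent) triples. These triples are
# hard-coded assumptions for illustration only.
parse = [
    ("scientist", "amod", "australian"),
    ("discovers", "nsubj", "scientist"),
    ("discovers", "dobj", "star"),
]

def dependency_contexts(triples):
    """Contexts are parse neighbours labelled with the relation
    (and its inverse), instead of linearly adjacent words."""
    pairs = []
    for head, rel, dep in triples:
        pairs.append((head, f"{dep}/{rel}"))
        pairs.append((dep, f"{head}/{rel}-1"))
    return pairs

def window_contexts(tokens, window=2):
    """Baseline: plain linear-window (word, context) pairs."""
    return [
        (tokens[i], tokens[j])
        for i in range(len(tokens))
        for j in range(max(0, i - window), min(len(tokens), i + window + 1))
        if j != i
    ]

dep_pairs = dependency_contexts(parse)
win_pairs = window_contexts("australian scientist discovers star".split())
```

The resulting (word, context) pairs can feed the same skip-gram trainer as window contexts; the design difference is that a verb keeps its object as a context even when many words separate them in the surface string.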
Highly Cited
2014
In this paper, we present a method that learns word embeddings for Twitter sentiment classification. Most existing algorithms for…
Highly Cited
2014
We analyze skip-gram with negative-sampling (SGNS), a word embedding method introduced by Mikolov et al., and show that it is…
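The paper's analysis relates SGNS to factorising a shifted PMI matrix of word-context counts. A toy sketch of that matrix, with made-up counts and the shift log(k) taken from the negative-sampling constant k:

```python
import numpy as np

# Made-up word-context co-occurrence counts (rows: words, cols: contexts).
counts = np.array([
    [4.0, 1.0, 0.0],
    [3.0, 2.0, 0.0],
    [0.0, 1.0, 5.0],
])
k = 2  # SGNS negative-sampling constant; the shift below is log(k)

total = counts.sum()
p_wc = counts / total
p_w = counts.sum(axis=1, keepdims=True) / total
p_c = counts.sum(axis=0, keepdims=True) / total

# Shifted positive PMI: max(PMI(w, c) - log k, 0), the matrix that SGNS
# implicitly factorises according to the paper's analysis.
with np.errstate(divide="ignore"):
    pmi = np.log(p_wc / (p_w * p_c))
sppmi = np.maximum(pmi - np.log(k), 0.0)

# Truncated SVD turns the sparse matrix into dense 2-d word vectors.
U, S, Vt = np.linalg.svd(sppmi)
word_vecs = U[:, :2] * np.sqrt(S[:2])
```

The split of the singular values (a square root to each side) is one common symmetric choice; other weightings are possible and affect downstream similarity scores.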
Highly Cited
2014
The word2vec software of Tomas Mikolov and colleagues has gained a lot of traction lately, and provides state-of-the-art word…
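The embeddings word2vec produces in its default mode come from the skip-gram negative-sampling objective. A sketch of that per-pair loss, with random vectors standing in for learned parameters (all sizes here are arbitrary assumptions):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
dim = 8
w = rng.normal(scale=0.1, size=dim)            # target word vector
c_pos = rng.normal(scale=0.1, size=dim)        # observed context vector
c_neg = rng.normal(scale=0.1, size=(5, dim))   # 5 sampled "negative" contexts

# Per-pair SGNS loss: -log sigma(w.c_pos) - sum_neg log sigma(-w.c_neg).
# Training lowers this by pulling w towards real contexts and pushing it
# away from sampled noise contexts.
loss = -np.log(sigmoid(w @ c_pos)) - np.sum(np.log(sigmoid(-c_neg @ w)))
```

In the actual software this loss is minimised with stochastic gradient descent over every (word, context) pair in the corpus; the snippet only evaluates it once for a single pair.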
Highly Cited
2014
The basis of applying deep learning to solve natural language processing tasks is to obtain high-quality distributed…
Highly Cited
2014
Recent methods for learning vector space representations of words have succeeded in capturing fine-grained semantic and syntactic…
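Such fine-grained regularities are commonly probed with vector arithmetic on analogies (a : b :: c : ?). A minimal sketch with a hypothetical five-word embedding table; real experiments would load pretrained vectors rather than these hand-picked values:

```python
import numpy as np

# Hypothetical toy embedding table; the vectors are hand-picked so the
# analogy below works, purely for illustration.
vecs = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.1, 0.8]),
    "man":   np.array([0.1, 0.9, 0.1]),
    "woman": np.array([0.1, 0.1, 0.9]),
    "apple": np.array([0.5, 0.5, 0.5]),
}

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

def analogy(a, b, c, vecs):
    """Solve a : b :: c : ? via vector arithmetic (b - a + c), returning
    the nearest remaining word by cosine similarity."""
    target = vecs[b] - vecs[a] + vecs[c]
    return max((w for w in vecs if w not in (a, b, c)),
               key=lambda w: cosine(vecs[w], target))
```

Excluding the three query words from the candidate set is standard practice in analogy evaluation, since the nearest neighbour of b - a + c is very often b or c itself.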