
Word embedding

Known as: Word vector space, Thought vectors, Word vectors 
Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words…
Wikipedia
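The core idea above — words mapped to vectors of real numbers — can be sketched with toy vectors. The values and the three-dimensional size below are hypothetical, purely for illustration; real embeddings are learned from large corpora and typically have hundreds of dimensions. Similarity between words is conventionally measured with cosine similarity:

```python
import math

# Toy, hand-picked 3-d vectors (hypothetical values for illustration only).
vectors = {
    "king":  [0.8, 0.6, 0.1],
    "queen": [0.7, 0.7, 0.1],
    "apple": [0.1, 0.2, 0.9],
}

def cosine(u, v):
    # Cosine similarity: dot product divided by the product of vector norms.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

print(cosine(vectors["king"], vectors["queen"]))  # high: related words
print(cosine(vectors["king"], vectors["apple"]))  # low: unrelated words
```

Semantically related words end up with nearby vectors, so their cosine similarity is close to 1, while unrelated words score much lower.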

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited
2016
The blind application of machine learning runs the risk of amplifying biases present in data. Such a danger is facing us with…
Highly Cited
2015
We present the Word Mover's Distance (WMD), a novel distance function between text documents. Our work is based on recent results…
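Exact WMD solves an optimal-transport problem between the two documents' word vectors, which needs a dedicated solver. A much simpler quantity from the same line of work — a relaxed lower bound in which every word ships all of its (uniform) mass to its single nearest word in the other document — is easy to sketch. The embeddings below are hypothetical toy values; this is one direction of the relaxation, not the full WMD:

```python
import math

# Toy embeddings (hypothetical values for illustration only).
emb = {
    "obama":     [0.9, 0.1, 0.0],
    "president": [0.8, 0.2, 0.1],
    "speaks":    [0.1, 0.9, 0.2],
    "greets":    [0.2, 0.8, 0.1],
    "banana":    [0.0, 0.1, 0.9],
}

def euclid(u, v):
    # Euclidean distance between two word vectors.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def relaxed_wmd(doc_a, doc_b):
    # Lower bound on WMD: each word in doc_a moves entirely to its
    # nearest word in doc_b; average over doc_a's uniform word weights.
    return sum(min(euclid(emb[a], emb[b]) for b in doc_b)
               for a in doc_a) / len(doc_a)

d_close = relaxed_wmd(["obama", "speaks"], ["president", "greets"])
d_far = relaxed_wmd(["obama", "speaks"], ["banana"])
print(d_close, d_far)
```

Documents with no words in common but similar meanings ("obama speaks" vs. "president greets") come out much closer than semantically unrelated ones, which is the point of measuring document distance in embedding space.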
Highly Cited
2015
Recent trends suggest that neural-network-inspired word embedding models outperform traditional count-based distributional models…
Highly Cited
2014
We analyze skip-gram with negative-sampling (SGNS), a word embedding method introduced by Mikolov et al., and show that it is…
Highly Cited
2014
In this paper, we present a method that learns word embeddings for Twitter sentiment classification. Most existing algorithms for…
Highly Cited
2014
The word2vec software of Tomas Mikolov and colleagues (this https URL) has gained a lot of traction lately, and provides state…
Highly Cited
2014
While continuous word embeddings are gaining popularity, current models are based solely on linear contexts. In this work, we…
Highly Cited
2014
Recent methods for learning vector space representations of words have succeeded in capturing fine-grained semantic and syntactic…
Highly Cited
2013
We introduce bilingual word embeddings: semantic embeddings associated across two languages in the context of neural language…
Highly Cited
2013
We propose two novel model architectures for computing continuous vector representations of words from very large data sets. The…
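The two architectures referenced above — skip-gram and continuous bag-of-words (CBOW) — differ in how they frame the prediction task. How each one turns a sentence into training examples can be sketched directly; the window size of 2 below is an arbitrary choice (it is a free hyperparameter), and this covers only example extraction, not the training itself:

```python
corpus = "we propose two novel model architectures".split()
window = 2  # symmetric context window; an illustrative setting

def skipgram_pairs(tokens, window):
    # Skip-gram: predict each surrounding context word from the centre word.
    for i, centre in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                yield centre, tokens[j]

def cbow_examples(tokens, window):
    # CBOW: predict the centre word from the bag of surrounding words.
    for i, centre in enumerate(tokens):
        context = [tokens[j]
                   for j in range(max(0, i - window),
                                  min(len(tokens), i + window + 1))
                   if j != i]
        yield context, centre

print(list(skipgram_pairs(corpus, window))[:4])
print(next(cbow_examples(corpus, window)))
```

Skip-gram yields one (centre, context) pair per context word, while CBOW yields one example per position with the whole context bag as input; the rest of training fits the embeddings so that these predictions succeed.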