
Word embedding

Known as: Word vector space, Thought vectors, Word vectors 
Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
Wikipedia
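As a concrete illustration of the definition above — words mapped to real-valued vectors whose geometry reflects meaning — here is a minimal sketch using invented 3-dimensional toy vectors (not taken from any trained model); real embeddings typically have hundreds of dimensions:

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine of the angle between two word vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy 3-d embeddings, invented purely for illustration.
embeddings = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.2, 0.8]),
    "apple": np.array([0.1, 0.1, 0.9]),
}

# Semantically related words should score higher than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # ~0.71
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # ~0.24
```

With trained vectors the same cosine measure is what underlies nearest-neighbor queries and analogy tests over the vocabulary.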

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Review
2018
We argue word embedding models are a useful tool for the study of culture using a historical analysis of shared understandings of…
Includes: tables 1–4; figure 1
Review
2017
Cross-lingual representations of words enable us to reason about word meaning in multilingual contexts and are a key facilitator…
Includes: figures 1–2; tables 1–3
Highly Cited
2016
The blind application of machine learning runs the risk of amplifying biases present in data. Such a danger is facing us with…
Includes: figures 1–5
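Direction-based bias analyses of the kind this line of work popularized can be sketched in a few lines: project word vectors onto an axis defined by a definitional pair such as "he" − "she". The vectors below are invented toy data, not measurements from a trained embedding:

```python
import numpy as np

def unit(v):
    return v / np.linalg.norm(v)

# Toy vectors, invented for illustration; real analyses use trained embeddings.
vec = {
    "he":       np.array([ 1.0, 0.0, 0.2]),
    "she":      np.array([-1.0, 0.0, 0.2]),
    "engineer": np.array([ 0.4, 0.9, 0.1]),
    "nurse":    np.array([-0.5, 0.8, 0.1]),
}

# A one-dimensional "gender direction" from a single definitional pair.
gender_direction = unit(vec["he"] - vec["she"])

def bias_score(word):
    """Projection of a (normalized) word vector onto the gender direction;
    the sign indicates which end of the he-she axis the word leans toward."""
    return float(np.dot(unit(vec[word]), gender_direction))

print(bias_score("engineer"))  # positive in this toy data: leans toward "he"
print(bias_score("nurse"))     # negative in this toy data: leans toward "she"
```

Debiasing methods then remove or equalize the component of selected words along such a direction.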
Highly Cited
2015
We present the Word Mover's Distance (WMD), a novel distance function between text documents. Our work is based on recent results…
Includes: figures 1–4; table 1
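WMD is an optimal-transport problem over word vectors; a cheap relaxation — letting each word in one document move all of its weight to its nearest word in the other — gives a lower bound that is easy to sketch. The vectors and documents below are toy data invented for illustration, with uniform bag-of-words weights assumed:

```python
import numpy as np

# Toy 2-d word vectors, invented for illustration.
vec = {
    "obama":     np.array([1.0, 0.0]),
    "president": np.array([0.9, 0.3]),
    "speaks":    np.array([0.0, 1.0]),
    "greets":    np.array([0.2, 0.9]),
    "band":      np.array([-1.0, 0.5]),
}

def relaxed_wmd(doc_a, doc_b):
    """Lower bound on the Word Mover's Distance: each word in doc_a
    sends its (uniform) weight to its nearest neighbor in doc_b."""
    total = 0.0
    for w in doc_a:
        nearest = min(np.linalg.norm(vec[w] - vec[u]) for u in doc_b)
        total += nearest / len(doc_a)
    return total

# Documents with similar meaning but no shared words end up closer
# than documents about different topics.
d_close = relaxed_wmd(["obama", "speaks"], ["president", "greets"])
d_far   = relaxed_wmd(["obama", "speaks"], ["band"])
print(d_close, d_far)
```

The exact WMD solves a full transportation problem between the two bags of words; this nearest-neighbor relaxation only bounds it from below.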
Highly Cited
2015
Recent trends suggest that neural-network-inspired word embedding models outperform traditional count-based distributional models…
Highly Cited
2014
We analyze skip-gram with negative-sampling (SGNS), a word embedding method introduced by Mikolov et al., and show that it is…
Includes: tables 1–2
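The analysis summarized above connects SGNS with k negative samples to factorizing a word-context matrix of pointwise mutual information (PMI) shifted by log k. A minimal sketch of that view, using invented toy co-occurrence counts and a plain SVD in place of SGNS training:

```python
import numpy as np

# Toy word-context co-occurrence counts (rows: words, columns: contexts),
# invented for illustration.
counts = np.array([
    [10.0, 2.0, 0.0],
    [ 3.0, 8.0, 1.0],
    [ 0.0, 1.0, 9.0],
])
k = 1  # negative samples; with k = 1 the shift log k vanishes (SPPMI = PPMI)

total = counts.sum()
p_w = counts.sum(axis=1, keepdims=True) / total   # word marginals
p_c = counts.sum(axis=0, keepdims=True) / total   # context marginals
p_wc = counts / total                             # joint probabilities

# PMI(w, c) = log p(w, c) / (p(w) p(c)); keeping only cells above the
# shift log k gives the shifted positive PMI (SPPMI) matrix.
with np.errstate(divide="ignore"):
    pmi = np.log(p_wc / (p_w * p_c))
sppmi = np.maximum(pmi - np.log(k), 0.0)

# Low-rank word vectors via truncated SVD of the SPPMI matrix.
u, s, vt = np.linalg.svd(sppmi)
dim = 2
word_vectors = u[:, :dim] * np.sqrt(s[:dim])
print(word_vectors.shape)  # (3, 2)
```

This spectral route produces embeddings without any gradient training, which is what makes the equivalence result useful in practice.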
Highly Cited
2014
In this paper, we present a method that learns word embeddings for Twitter sentiment classification. Most existing algorithms for…
Includes: figures 1 and 3; tables 1–3
Highly Cited
2014
While continuous word embeddings are gaining popularity, current models are based solely on linear contexts. In this work, we…
Includes: figures 1–2; tables 1–2
Highly Cited
2014
The word2vec software of Tomas Mikolov and colleagues (this https URL) has gained a lot of traction lately, and provides state…
Highly Cited
2013
We propose two novel model architectures for computing continuous vector representations of words from very large data sets. The…
Includes: figure 1; tables 1–4
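Both architectures proposed in this line of work (CBOW and skip-gram) train on local context windows over raw text. The data-preparation step common to them can be sketched as generating (center, context) pairs, shown here with a toy sentence and window size 1:

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) training pairs as in the skip-gram
    setup: each word is paired with every word within `window` positions."""
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

sentence = "the quick brown fox jumps".split()
pairs = skipgram_pairs(sentence, window=1)
print(pairs)
# [('the', 'quick'), ('quick', 'the'), ('quick', 'brown'), ...]
```

Skip-gram predicts each context word from the center word; CBOW inverts the direction, predicting the center word from the combined context — but both consume pairs built from windows like these.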