
Word embedding

Known as: Word vector space, Thought vectors, Word vectors 
Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
Source: Wikipedia
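To make the idea concrete, here is a minimal sketch (independent of any particular paper listed below) that trains skip-gram word embeddings with the gensim library on a toy corpus and queries the resulting vector space; the corpus and all hyperparameters are arbitrary choices for illustration only.

```python
# A minimal sketch: training word embeddings with gensim's Word2Vec
# (skip-gram) on a toy corpus. The corpus and hyperparameters are
# illustrative; real embeddings are trained on large corpora.
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of pre-tokenized words.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
]

model = Word2Vec(
    sentences=sentences,
    vector_size=50,   # dimensionality of the embedding space
    window=2,         # context window size
    min_count=1,      # keep every word (toy corpus)
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=200,       # many epochs because the corpus is tiny
    seed=42,
)

# Each word is now a dense real-valued vector.
vec = model.wv["king"]            # numpy array of shape (50,)
print(vec.shape)

# Lexical semantics appear as geometric proximity in the vector space.
print(model.wv.similarity("king", "queen"))
print(model.wv.most_similar("cat", topn=3))
```

On a corpus this small the neighbourhoods are noisy; the point is only to show the workflow of training and then querying an embedding space.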

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited · 2018
Cross-lingual transfer of word embeddings aims to establish the semantic mappings among words in different languages by learning… 
Highly Cited · 2018
While the celebrated Word2Vec technique yields semantically rich representations for individual words, there has been relatively… 
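A common simple baseline for moving from word vectors to representations of longer text, sketched below under the assumption of a trained gensim Word2Vec model like the one in the sketch above, is to average the vectors of a sentence's words; this is a generic illustration, not the method proposed in the paper.

```python
# A simple baseline (not the method of the paper above) for extending
# word embeddings to longer text: represent a sentence as the average
# of its word vectors. `model` is assumed to be a trained gensim
# Word2Vec model, e.g. the one from the earlier sketch.
import numpy as np

def sentence_vector(tokens, model):
    """Average the vectors of in-vocabulary tokens; zeros if none."""
    vecs = [model.wv[t] for t in tokens if t in model.wv]
    if not vecs:
        return np.zeros(model.wv.vector_size)
    return np.mean(vecs, axis=0)

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Usage (assumes `model` from the previous sketch):
# s1 = sentence_vector(["the", "king", "rules"], model)
# s2 = sentence_vector(["the", "queen", "rules"], model)
# print(cosine(s1, s2))
```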
Review · 2018
Word embeddings are real-valued word representations able to capture lexical semantics and trained on natural language corpora… 
Highly Cited · 2017
Word embeddings have been found to provide meaningful representations for words in an efficient way; therefore, they have become… 
Highly Cited · 2016
Mapping word embeddings of different languages into a single space has multiple applications. In order to map from a source space… 
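A generic way to realize such a mapping, shown below with random stand-in vectors rather than real pretrained embeddings, is to learn a linear transform W from a seed dictionary of translation pairs by least squares, minimizing ||XW - Y||; this sketch illustrates the general idea only and is not necessarily the method of the paper above.

```python
# A generic sketch of mapping one embedding space into another with a
# linear transform learned from a seed dictionary of translation pairs.
# Random vectors stand in for real pretrained embeddings.
import numpy as np

rng = np.random.default_rng(0)
d_src, d_tgt, n_pairs = 300, 300, 5000

# X[i] is the source-language vector and Y[i] the target-language
# vector of the i-th entry in the seed dictionary.
X = rng.normal(size=(n_pairs, d_src))
Y = rng.normal(size=(n_pairs, d_tgt))

# Solve min_W ||X W - Y||_F by ordinary least squares.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

def map_and_rank(src_vec, target_matrix, k=5):
    """Map a source vector into the target space and return the
    indices of its k nearest target vectors by cosine similarity."""
    mapped = src_vec @ W
    sims = (target_matrix @ mapped) / (
        np.linalg.norm(target_matrix, axis=1) * np.linalg.norm(mapped)
    )
    return np.argsort(-sims)[:k]

print(map_and_rank(X[0], Y))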
Review · 2016
The application of information retrieval techniques to search tasks in software engineering is made difficult by the lexical gap… 
Highly Cited · 2016
We present the Siamese Continuous Bag of Words (Siamese CBOW) model, a neural network for efficient estimation of high-quality… 
Highly Cited · 2015
The authors analyze three critical components in training word embeddings: model, corpus, and training parameters. They… 
Highly Cited · 2015
We propose a new unified framework for monolingual (MoIR) and cross-lingual information retrieval (CLIR) which relies on the… 
Review · 2013
Word embeddings resulting from neural language models have been shown to be a great asset for a large variety of NLP tasks…