Semantic Scholar
Word embedding
Known as: Word vector space, Thought vectors, Word vectors
Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words… (Wikipedia)
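The definition above can be made concrete with a minimal count-based sketch: build a word-context co-occurrence matrix from a toy corpus and factorize it to obtain dense word vectors. The corpus, the window size of 2, and the 4-dimensional output below are all illustrative choices, not a reference implementation of any particular method:

```python
import numpy as np

# Toy corpus; a real system would use a large natural language corpus.
corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "a cat and a dog played".split(),
]
vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Symmetric co-occurrence counts within a window of 2 words.
C = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - 2), min(len(sent), i + 3)):
            if i != j:
                C[idx[w], idx[sent[j]]] += 1

# Low-rank factorization of the log-smoothed counts: the left
# singular vectors, scaled by the singular values, serve as
# dense real-valued word vectors.
U, S, _ = np.linalg.svd(np.log1p(C), full_matrices=False)
dim = 4
embeddings = U[:, :dim] * S[:dim]

def most_similar(word):
    # Nearest neighbour by cosine similarity, excluding the word itself.
    v = embeddings[idx[word]]
    sims = embeddings @ v / (
        np.linalg.norm(embeddings, axis=1) * np.linalg.norm(v) + 1e-12
    )
    sims[idx[word]] = -np.inf
    return vocab[int(np.argmax(sims))]

print(most_similar("cat"))
```

Words that occur in similar contexts ("cat" and "dog" here) end up with nearby vectors, which is the property all the papers below build on.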
Related topics (18 relations): Bioinformatics, Brown clustering, Co-occurrence matrix, Deep learning, …
Papers overview
Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited, 2018
Unsupervised Cross-lingual Transfer of Word Embedding Spaces
Ruochen Xu, Yiming Yang, Naoki Otani, Yuexin Wu
Conference on Empirical Methods in Natural…, 2018. Corpus ID: 52186890
Cross-lingual transfer of word embeddings aims to establish the semantic mappings among words in different languages by learning…
Highly Cited, 2018
Word Mover’s Embedding: From Word2Vec to Document Embedding
Lingfei Wu, I. E. Yen, +5 authors, M. Witbrock
Conference on Empirical Methods in Natural…, 2018. Corpus ID: 53081529
While the celebrated Word2Vec technique yields semantically rich representations for individual words, there has been relatively…
Review, 2018
A Survey of Word Embeddings Evaluation Methods
Amir Bakarov
arXiv.org, 2018. Corpus ID: 9340872
Word embeddings are real-valued word representations able to capture lexical semantics and trained on natural language corpora…
Highly Cited, 2017
Portuguese Word Embeddings: Evaluating on Word Analogies and Natural Language Tasks
N. Hartmann, Erick Rocha Fonseca, C. Shulby, Marcos Vinícius Treviso, Jéssica S. Rodrigues, S. Aluísio
Brazilian Symposium in Information and Human…, 2017. Corpus ID: 1541076
Word embeddings have been found to provide meaningful representations for words in an efficient way; therefore, they have become…
Highly Cited, 2016
Learning principled bilingual mappings of word embeddings while preserving monolingual invariance
Mikel Artetxe, Gorka Labaka, Eneko Agirre
Conference on Empirical Methods in Natural…, 2016. Corpus ID: 1040556
Mapping word embeddings of different languages into a single space has multiple applications. In order to map from a source space…
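The idea of a bilingual mapping that preserves monolingual invariance can be sketched with the closely related orthogonal Procrustes solution — the standard closed-form orthogonal mapping between two embedding spaces, not necessarily the paper's exact formulation. The seed-dictionary vectors below are random placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical embeddings for a seed dictionary of 100 translation
# pairs: X holds source-language vectors, Z the corresponding
# target-language vectors (300-dimensional, a typical size).
X = rng.standard_normal((100, 300))
Z = rng.standard_normal((100, 300))

# Orthogonal Procrustes: the W minimizing ||XW - Z||_F subject to
# W being orthogonal is U V^T, where U S V^T is the SVD of X^T Z.
U, _, Vt = np.linalg.svd(X.T @ Z)
W = U @ Vt

# Orthogonality (W W^T = I) preserves dot products, and hence the
# monolingual geometry of the source space.
print(np.allclose(W @ W.T, np.eye(300)))
```

In use, `X @ W` places the source-language vectors in the target space, where nearest-neighbour search retrieves translation candidates.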
Review, 2016
From Word Embeddings to Document Similarities for Improved Information Retrieval in Software Engineering
Xin Ye, Hui Shen, Xiao Ma, Razvan C. Bunescu, Chang Liu
International Conference on Software Engineering, 2016. Corpus ID: 4280759
The application of information retrieval techniques to search tasks in software engineering is made difficult by the lexical gap…
Highly Cited, 2016
Siamese CBOW: Optimizing Word Embeddings for Sentence Representations
Tom Kenter, Alexey Borisov, M. de Rijke
Annual Meeting of the Association for…, 2016. Corpus ID: 12998432
We present the Siamese Continuous Bag of Words (Siamese CBOW) model, a neural network for efficient estimation of high-quality…
Highly Cited, 2015
How to Generate a Good Word Embedding
Siwei Lai, Kang Liu, Shizhu He, Jun Zhao
IEEE Intelligent Systems, 2015. Corpus ID: 10201625
The authors analyze three critical components in training word embeddings: model, corpus, and training parameters. They…
Highly Cited, 2015
Monolingual and Cross-Lingual Information Retrieval Models Based on (Bilingual) Word Embeddings
Ivan Vulic, Marie-Francine Moens
Annual International ACM SIGIR Conference on…, 2015. Corpus ID: 2583305
We propose a new unified framework for monolingual (MoIR) and cross-lingual information retrieval (CLIR) which relies on the…
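One simple way to realize embedding-based retrieval in the spirit of (though not necessarily identical to) the framework above is to represent queries and documents as averaged word vectors and rank by cosine similarity. The toy vectors below are invented for illustration; in the bilingual setting they would live in a shared cross-lingual space:

```python
import numpy as np

# Invented pre-trained word vectors.
emb = {
    "cheap":    np.array([0.9, 0.1, 0.0]),
    "flights":  np.array([0.1, 0.9, 0.2]),
    "low":      np.array([0.8, 0.2, 0.1]),
    "cost":     np.array([0.7, 0.3, 0.0]),
    "airfare":  np.array([0.2, 0.8, 0.3]),
    "weather":  np.array([0.0, 0.1, 0.9]),
    "forecast": np.array([0.1, 0.0, 0.8]),
}

def text_vector(text):
    # A query or document is the mean of its word vectors.
    return np.mean([emb[w] for w in text.split() if w in emb], axis=0)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

query = "cheap flights"
docs = ["low cost airfare", "weather forecast"]
ranked = sorted(docs, key=lambda d: cosine(text_vector(query), text_vector(d)),
                reverse=True)
print(ranked[0])  # the document sharing the query's meaning ranks first
```

Because ranking depends only on vector geometry, the same code works across languages whenever the embeddings are mapped into one shared space.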
Review, 2013
Word Embeddings through Hellinger PCA
R. Lebret, R. Collobert
Conference of the European Chapter of the…, 2013. Corpus ID: 1104123
Word embeddings resulting from neural language models have been shown to be a great asset for a large variety of NLP tasks…
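A minimal sketch of the Hellinger PCA idea, assuming it amounts to PCA over square-rooted word-context co-occurrence probabilities (Euclidean distance between square-rooted probability rows is, up to a constant, the Hellinger distance between the distributions). The counts and dimensionalities below are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented co-occurrence counts: 50 words x 20 context words.
counts = rng.poisson(3.0, size=(50, 20)).astype(float)

# Row-normalize to P(context | word), then take square roots so
# that Euclidean geometry matches Hellinger distance.
P = counts / counts.sum(axis=1, keepdims=True)
H = np.sqrt(P)

# PCA via SVD of the centered matrix; keep 10 components as the
# word embeddings.
Hc = H - H.mean(axis=0, keepdims=True)
U, S, _ = np.linalg.svd(Hc, full_matrices=False)
embeddings = U[:, :10] * S[:10]
print(embeddings.shape)
```

Unlike the neural language models mentioned in the abstract, this count-based route needs no iterative training, which is part of its appeal.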