
Word embedding

Known as: Word vector space, Thought vectors, Word vectors 
Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words… 
Wikipedia
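The definition above can be illustrated with a minimal sketch: each word maps to a dense vector, and geometric closeness between vectors approximates semantic similarity. The vectors below are toy values chosen by hand for illustration, not embeddings learned from data.

```python
import math

# Hand-crafted toy "embeddings" (not learned): royal words are placed
# near each other, the fruit word far away.
embeddings = {
    "king":   [0.90, 0.80, 0.10],
    "queen":  [0.85, 0.82, 0.15],
    "banana": [0.10, 0.05, 0.90],
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Semantically related words should score higher than unrelated ones.
sim_royal = cosine(embeddings["king"], embeddings["queen"])
sim_fruit = cosine(embeddings["king"], embeddings["banana"])
```

With learned embeddings (e.g. word2vec or GloVe vectors), the same cosine comparison is the standard way to query for similar words.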

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2018
Paper presented at BIRNDL 2018, the 3rd Joint Workshop on Bibliometric-enhanced Information Retrieval and Natural… 
2017
In this study, Turkish texts belonging to different categories were classified using word2vec word vectors. Firstly, vectors… 
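A generic sketch of the kind of pipeline this entry describes: embed a document by averaging its word vectors, then assign the label whose centroid is nearest. The vectors, vocabulary, and labels below are toy values, not word2vec output or the paper's Turkish data.

```python
# Toy 2-d "word vectors" standing in for word2vec output.
toy_vectors = {
    "goal": [1.0, 0.0], "match": [0.9, 0.1],   # "sports"-like words
    "vote": [0.0, 1.0], "party": [0.1, 0.9],   # "politics"-like words
}

# Hypothetical class centroids; in practice these would be learned
# (e.g. mean document vector per class, or a trained classifier).
centroids = {"sports": [1.0, 0.0], "politics": [0.0, 1.0]}

def doc_vector(tokens):
    """Embed a document as the average of its known word vectors."""
    known = [toy_vectors[t] for t in tokens if t in toy_vectors]
    n = len(known)
    return [sum(v[i] for v in known) / n for i in range(2)]

def classify(tokens):
    """Assign the label whose centroid is nearest (squared Euclidean)."""
    d = doc_vector(tokens)
    return min(
        centroids,
        key=lambda c: sum((d[i] - centroids[c][i]) ** 2 for i in range(2)),
    )
```

Averaging word vectors is a common, simple baseline for text classification; the paper's actual method may differ in its classifier and preprocessing.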
2017
In this paper, we present a novel unsupervised algorithm for word sense disambiguation (WSD) at the document level. Our algorithm… 
2017
This paper describes our methods and experiments applied for CLSciSumm-17. We try Convolutional Neural Network, word vectors and… 
2017
With the exponential growth of web meta-data, exploiting multimodal online sources via standard search engines has become a trend… 
Review
2016
This paper describes our participation in the SemEval-2016 task 5, Aspect Based Sentiment Analysis (ABSA). We participated in two… 
2015
This paper describes the IBM systems for Trilingual Entity Discovery and Linking (EDL) for the TAC 2016 Knowledge-Base… 
Highly Cited
2015
We introduce a neural machine translation model that views the input and output sentences as sequences of characters rather than… 
2014
Distributed representations have gained a lot of interest in the natural language processing community. In this paper, we propose a… 
2001
The representation of documents and queries as vectors in space is a well-known information retrieval paradigm (Salton and…
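The vector-space retrieval paradigm this last entry refers to can be sketched briefly: documents and the query become term-count vectors over a shared vocabulary, and documents are ranked by cosine similarity to the query. The corpus below is a toy example using raw term counts (no tf-idf weighting).

```python
import math
from collections import Counter

# Toy corpus: three short "documents".
docs = {
    "d1": "cats chase mice",
    "d2": "dogs chase cats",
    "d3": "stocks rise today",
}

def vectorize(text, vocab):
    """Raw term-count vector over the shared vocabulary."""
    counts = Counter(text.split())
    return [counts[t] for t in vocab]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u)) or 1.0
    nv = math.sqrt(sum(b * b for b in v)) or 1.0
    return dot / (nu * nv)

def rank(query):
    """Rank document ids by cosine similarity to the query."""
    vocab = sorted({w for t in docs.values() for w in t.split()}
                   | set(query.split()))
    qv = vectorize(query, vocab)
    return sorted(docs,
                  key=lambda d: cosine(vectorize(docs[d], vocab), qv),
                  reverse=True)
```

Word embeddings refine this classic paradigm: instead of sparse term-count dimensions, documents and queries are compared in a dense learned vector space.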