
Word embedding

Known as: Word vector space, Thought vectors, Word vectors 
Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words… 
Source: Wikipedia
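
To make the definition concrete: an embedding assigns each word a dense real-valued vector so that words used in similar contexts end up close together in the vector space. Below is a minimal sketch of training such vectors with the gensim library (gensim 4.x API; the toy corpus and hyperparameters are illustrative assumptions, not drawn from any of the papers listed on this page):

```python
# Minimal word2vec sketch with gensim (illustrative corpus and settings).
from gensim.models import Word2Vec

# Toy corpus: each document is a list of tokens.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "cat", "sits", "on", "the", "mat"],
    ["the", "dog", "sits", "on", "the", "rug"],
]

# Train 50-dimensional skip-gram embeddings (sg=1); min_count=1 keeps every token.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, seed=42)

vec = model.wv["king"]                # the 50-dimensional vector for "king"
print(vec.shape)                      # (50,)
print(model.wv.most_similar("king"))  # nearest neighbours in the vector space
```

On a corpus this small the nearest neighbours are essentially noise; real applications train on millions of sentences before the geometry of the space becomes meaningful.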

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2020
We propose FrameAxis, a method of characterizing the framing of a given text by identifying the most relevant semantic axes… 
2017
In this study, Turkish texts belonging to different categories were classified using word2vec word vectors. Firstly, vectors… 
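
One common recipe for the kind of word2vec-based text classification sketched in this snippet (a standard baseline, assumed here rather than taken from the paper itself) is to average a document's word vectors into a single feature vector and feed it to an off-the-shelf classifier:

```python
# Hedged sketch: represent each document as the mean of its word vectors,
# then classify with logistic regression. This is a common baseline, not
# necessarily the method used in the paper summarized above.
import numpy as np
from gensim.models import Word2Vec
from sklearn.linear_model import LogisticRegression

docs = [["good", "film"], ["great", "movie"], ["bad", "plot"], ["awful", "acting"]]
labels = [1, 1, 0, 0]  # toy sentiment labels

w2v = Word2Vec(docs, vector_size=25, window=2, min_count=1, seed=42)

def doc_vector(tokens, model):
    """Average the embeddings of the tokens the model knows about."""
    vecs = [model.wv[t] for t in tokens if t in model.wv]
    return np.mean(vecs, axis=0) if vecs else np.zeros(model.wv.vector_size)

X = np.stack([doc_vector(d, w2v) for d in docs])
clf = LogisticRegression().fit(X, labels)
print(clf.predict(X))
```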
2017
In this paper, we present a novel unsupervised algorithm for word sense disambiguation (WSD) at the document level. Our algorithm… 
2017
This paper describes our methods and experiments applied for CLSciSumm-17. We try convolutional neural networks, word vectors, and… 
2017
With the exponential growth of web meta-data, exploiting multimodal online sources via standard search engines has become a trend… 
Review
2016
This paper describes our participation in the SemEval-2016 task 5, Aspect Based Sentiment Analysis (ABSA). We participated in two… 
Highly Cited
2015
We introduce a neural machine translation model that views the input and output sentences as sequences of characters rather than… 
2014
Distributed representations have gained a lot of interest in the natural language processing community. In this paper, we propose a… 
2008
Text summarization is very effective in relevance assessment tasks. The Multiple Document Summarizer presents a novel approach to… 
2001
The representation of documents and queries as vectors in space is a well-known information retrieval paradigm (Salton and…
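
The vector space paradigm mentioned in this last snippet predates neural embeddings: documents and queries become sparse term-weight vectors, and cosine similarity ranks the documents against the query. A minimal sketch using scikit-learn's TF-IDF vectorizer (an illustration of the paradigm, not the paper's own formulation):

```python
# Hedged sketch of the vector space retrieval paradigm: documents and the
# query become TF-IDF vectors, and cosine similarity ranks the documents.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "word embeddings map words to dense vectors",
    "text summarization condenses multiple documents",
    "neural machine translation works on character sequences",
]
query = ["dense vectors of words"]

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(documents)  # one row per document
query_vec = vectorizer.transform(query)           # same vocabulary as docs

scores = cosine_similarity(query_vec, doc_matrix)[0]
for i in scores.argsort()[::-1]:                  # rank documents by score
    print(f"{scores[i]:.3f}  {documents[i]}")
```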