
Word embedding

Known as: Word vector space, Thought vectors, Word vectors 
Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words… 
Wikipedia
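A word embedding maps each word to a dense real-valued vector so that semantic relatedness can be measured geometrically, typically with cosine similarity. A minimal sketch with hand-picked toy vectors (the values are illustrative assumptions, not learned from any corpus):

```python
import numpy as np

# Toy 4-dimensional word vectors, hand-picked for illustration only
# (a real embedding would be learned from text, e.g. by word2vec).
embeddings = {
    "king":  np.array([0.9, 0.8, 0.1, 0.2]),
    "queen": np.array([0.9, 0.1, 0.8, 0.2]),
    "apple": np.array([0.0, 0.1, 0.1, 0.9]),
}

def cosine(u, v):
    # Cosine similarity: 1.0 for identical directions, ~0 for unrelated ones.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should score higher than unrelated ones.
print(cosine(embeddings["king"], embeddings["queen"]))  # higher
print(cosine(embeddings["king"], embeddings["apple"]))  # lower
```

With learned embeddings the same comparison works unchanged; only the vectors (and their dimensionality, often 100-300) differ.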

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2019
  • Enam Biswas, A. Das
  • Corpus ID: 202700175
Natural language processing (NLP) and automatic disease detection have become popular in recent years. Several research… 
2017
In this study, Turkish texts belonging to different categories were classified by using word2vec word vectors. Firstly, vectors… 
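The study above classifies texts using word2vec word vectors. One common recipe (an assumption here, not necessarily this paper's exact pipeline) is to represent each document as the average of its word vectors and assign it to the nearest category centroid; a toy sketch with hypothetical 2-dimensional vectors:

```python
import numpy as np

# Hypothetical word vectors standing in for word2vec output.
vectors = {
    "goal":  np.array([0.9, 0.1]), "match":  np.array([0.8, 0.2]),
    "team":  np.array([0.85, 0.15]),
    "bank":  np.array([0.1, 0.9]), "market": np.array([0.2, 0.8]),
    "stock": np.array([0.15, 0.85]),
}

def doc_vector(tokens):
    # Represent a document as the mean of its in-vocabulary word vectors.
    return np.mean([vectors[t] for t in tokens if t in vectors], axis=0)

# Category centroids built from small labeled example documents.
centroids = {
    "sports":  doc_vector(["goal", "match", "team"]),
    "finance": doc_vector(["bank", "market", "stock"]),
}

def classify(tokens):
    # Nearest-centroid classification in embedding space.
    d = doc_vector(tokens)
    return min(centroids, key=lambda c: np.linalg.norm(d - centroids[c]))

print(classify(["match", "goal"]))  # → sports
```

In practice the averaged vectors would feed a stronger classifier (e.g. an SVM or logistic regression), but the nearest-centroid version shows the core idea.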
2017
Recently, FPGA has been increasingly applied to problems such as speech recognition, machine learning, and cloud computation such… 
2017
This paper describes our methods and experiments applied for CLSciSumm-17. We try Convolutional Neural Network, word vectors and… 
Highly Cited
2015
We introduce a neural machine translation model that views the input and output sentences as sequences of characters rather than… 
2015
This paper describes the IBM systems for the Trilingual Entity Discovery and Linking (EDL) for the TAC 2016 Knowledge-Base… 
2014
Distributed representations have gained a lot of interest in the natural language processing community. In this paper, we propose a… 
2010
In this paper, we introduce an efficient method to substantially increase the recognition performance of object recognition by… 
2009
It is a challenging and important task to retrieve images from a large and highly varied image data set based on their visual… 
2001
The representation of documents and queries as vectors in space is a well-known information retrieval paradigm (Salton and…