
Word embedding

Known as: Word vector space, Thought vectors, Word vectors 
Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words… 
— Wikipedia
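To make the definition above concrete, the following sketch compares toy word vectors with cosine similarity. The three-dimensional vectors and the vocabulary are made up for illustration; real embedding models such as word2vec or GloVe learn vectors with hundreds of dimensions from large corpora.

```python
import math

# Hypothetical toy embeddings (illustrative values only; real models
# learn these vectors from co-occurrence statistics in text corpora).
embeddings = {
    "king":  [0.8, 0.6, 0.1],
    "queen": [0.7, 0.7, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; 1.0 means same direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Words used in similar contexts end up with similar vectors, so
# related words score higher than unrelated ones.
related = cosine_similarity(embeddings["king"], embeddings["queen"])
unrelated = cosine_similarity(embeddings["king"], embeddings["apple"])
```

In a trained embedding space, `related` would exceed `unrelated`, which is the property the retrieval and translation papers listed below exploit.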

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2018
This paper addresses the issue of multi-word expression (MWE) detection by employing a new decoding strategy inspired by graph… 
2017
Recently, FPGA has been increasingly applied to problems such as speech recognition, machine learning, and cloud computation such… 
2016
In this paper we take a state-of-the-art model for distributed word representation that explicitly factorizes the positive… 
Highly Cited
2015
We introduce a neural machine translation model that views the input and output sentences as sequences of characters rather than… 
2015
This paper describes the IBM systems for the Trilingual Entity Discovery and Linking (EDL) for the TAC 2016 Knowledge-Base… 
2014
Distributed representations have gained a lot of interest in the natural language processing community. In this paper, we propose a… 
2010
In this paper, we introduce an efficient method to substantially increase the recognition performance of object recognition by… 
2009
It is a challenging and important task to retrieve images from a large and highly varied image data set based on their visual… 
2006
This paper proposes a method for retrieving similar question articles from Web bulletin boards, which basically uses the cosine…
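The cosine-based retrieval mentioned above can be sketched as follows: represent each question as a bag-of-words term-frequency vector and return the archived question whose vector has the highest cosine similarity to the query. The tiny archive and tokenizer here are assumptions for illustration, not the paper's actual system.

```python
import math
from collections import Counter

# Hypothetical question archive; a real bulletin-board corpus is much larger.
archive = [
    "how do I reset my password",
    "what is the best pizza topping",
    "how can I change my account password",
]

def bow(text):
    """Bag-of-words term-frequency vector (whitespace tokenization)."""
    return Counter(text.split())

def cosine(u, v):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(count * v[term] for term, count in u.items())
    norm_u = math.sqrt(sum(c * c for c in u.values()))
    norm_v = math.sqrt(sum(c * c for c in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

def most_similar(query):
    """Return the archived question closest to the query by cosine."""
    query_vec = bow(query)
    return max(archive, key=lambda doc: cosine(query_vec, bow(doc)))

best = most_similar("forgot my password how to reset")
```

With the toy archive above, the query about a forgotten password retrieves the question sharing the most terms with it; embedding-based methods generalize this by matching on vector similarity rather than exact word overlap.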
2002
This paper presents a method that measures the similarity between compound nouns in different languages to locate translation…