- The blind application of machine learning runs the risk of amplifying biases present in data. Such a danger is facing us with…
- We present the Word Mover's Distance (WMD), a novel distance function between text documents. Our work is based on recent results…
- Recent trends suggest that neural-network-inspired word embedding models outperform traditional count-based distributional models…
- We analyze skip-gram with negative-sampling (SGNS), a word embedding method introduced by Mikolov et al., and show that it is…
- We present a method that learns word embeddings for Twitter sentiment classification in this paper. Most existing algorithms for…
- The word2vec software of Tomas Mikolov and colleagues (this https URL) has gained a lot of traction lately, and provides state…
- While continuous word embeddings are gaining popularity, current models are based solely on linear contexts. In this work, we…
- Recent methods for learning vector space representations of words have succeeded in capturing fine-grained semantic and syntactic…
- We introduce bilingual word embeddings: semantic embeddings associated across two languages in the context of neural language…
- We propose two novel model architectures for computing continuous vector representations of words from very large data sets. The…