Publications
GloVe: Global Vectors for Word Representation
Recent methods for learning vector space representations of words have succeeded in capturing fine-grained semantic and syntactic regularities using vector arithmetic, but the origin of these regularities has remained opaque.
  • 14,103 citations
  • 2,392 highly influential citations
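The "vector arithmetic" regularities this abstract refers to are analogy relations such as vec("king") − vec("man") + vec("woman") ≈ vec("queen"). Below is a minimal, illustrative sketch of that test over pretrained GloVe vectors; the file name glove.6B.300d.txt and its word-per-line text format refer to a locally downloaded release and are assumptions, not part of this page.

```python
# Minimal sketch of the vector-arithmetic regularities GloVe captures:
# vec("king") - vec("man") + vec("woman") should be closest to vec("queen").
# Assumes a plain-text GloVe file (word followed by floats on each line),
# e.g. a downloaded glove.6B.300d.txt; the path is a hypothetical example.
import numpy as np

def load_glove(path):
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            vectors[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
    return vectors

def nearest(vectors, query, exclude, k=3):
    # Rank vocabulary words by cosine similarity to the query vector.
    q = query / np.linalg.norm(query)
    scores = {
        w: float(v @ q / np.linalg.norm(v))
        for w, v in vectors.items()
        if w not in exclude
    }
    return sorted(scores, key=scores.get, reverse=True)[:k]

if __name__ == "__main__":
    vecs = load_glove("glove.6B.300d.txt")  # hypothetical local path
    analogy = vecs["king"] - vecs["man"] + vecs["woman"]
    print(nearest(vecs, analogy, exclude={"king", "man", "woman"}))
    # With the published vectors this is expected to rank "queen" first.
```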
ImageNet: A large-scale hierarchical image database
The explosion of image data on the Internet has the potential to foster more sophisticated and robust models and algorithms to index, retrieve, organize and interact with images and multimedia data.
  • 11,478 citations
  • 1,999 highly influential citations
Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank
Semantic word spaces have been very useful but cannot express the meaning of longer phrases in a principled way. Further progress towards understanding compositionality in tasks such as sentiment detection requires richer supervised training and evaluation resources and more powerful models of composition.
  • 3,719 citations
  • 580 highly influential citations
Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks
Because of their superior ability to preserve sequence information over time, Long Short-Term Memory (LSTM) networks, a type of recurrent neural network with a more complex computational unit, have obtained strong results on a variety of sequence modeling tasks.
  • 1,806 citations
  • 314 highly influential citations
Reasoning With Neural Tensor Networks for Knowledge Base Completion
Knowledge bases are an important resource for question answering and other tasks but often suffer from incompleteness and lack of ability to reason over their discrete entities and relationships. In this paper we introduce an expressive neural tensor network suitable for reasoning over relationships between two entities.
  • 1,149 citations
  • 223 highly influential citations
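The neural tensor network scores how plausible a relation is between two entity vectors with a bilinear tensor layer, g(e1, R, e2) = u_R^T tanh(e1^T W_R^[1:k] e2 + V_R [e1; e2] + b_R). The NumPy sketch below is a hedged illustration of that computation only: the parameters are randomly initialized rather than trained, and the dimensions and entity vectors are placeholders, not values from the paper.

```python
# Sketch of the neural tensor network scoring function:
# g(e1, R, e2) = u_R^T tanh(e1^T W_R[1:k] e2 + V_R [e1; e2] + b_R)
import numpy as np

rng = np.random.default_rng(0)
d, k = 100, 4  # embedding size and number of tensor slices (illustrative)

# Per-relation parameters; in the paper these are learned, here random.
W = rng.normal(scale=0.1, size=(k, d, d))   # bilinear tensor slices
V = rng.normal(scale=0.1, size=(k, 2 * d))  # standard-layer weights
b = np.zeros(k)
u = rng.normal(scale=0.1, size=k)

def ntn_score(e1, e2):
    bilinear = np.einsum("i,kij,j->k", e1, W, e2)   # e1^T W[1:k] e2
    standard = V @ np.concatenate([e1, e2]) + b     # V [e1; e2] + b
    return u @ np.tanh(bilinear + standard)

e1, e2 = rng.normal(size=d), rng.normal(size=d)  # placeholder entity vectors
print(ntn_score(e1, e2))  # higher score = relation judged more plausible
```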
Improving Word Representations via Global Context and Multiple Word Prototypes
Unsupervised word representations are very useful in NLP tasks both as inputs to learning algorithms and as extra word features in NLP systems. However, most of these models are built with only local context and one representation per word.
  • 1,038 citations
  • 139 highly influential citations
Regularizing and Optimizing LSTM Language Models
Recurrent neural networks (RNNs), such as long short-term memory networks (LSTMs), serve as a fundamental building block for many sequence learning tasks, including machine translation, language modeling, and question answering.
  • 555 citations
  • 137 highly influential citations
Semi-Supervised Recursive Autoencoders for Predicting Sentiment Distributions
We introduce a novel machine learning framework based on recursive autoencoders for sentence-level prediction of sentiment label distributions. Our method learns vector space representations for multi-word phrases.
  • 1,100 citations
  • 123 highly influential citations
A Deep Reinforced Model for Abstractive Summarization
Attentional, RNN-based encoder-decoder models for abstractive summarization have achieved good performance on short input and output sequences. For longer documents and summaries, however, these models often include repetitive and incoherent phrases.
  • 636 citations
  • 123 highly influential citations