Publications (sorted by influence)
Composition in Distributional Models of Semantics
TLDR: Vector-based models of word meaning have become increasingly popular in cognitive science.
  • 866 citations · 93 highly influential · PDF available
Modeling Local Coherence: An Entity-Based Approach
TLDR: This article proposes a novel framework for representing and measuring local coherence.
  • 625 citations · 90 highly influential · PDF available
Vector-based Models of Semantic Composition
TLDR: This paper proposes a framework for representing the meaning of phrases and sentences in vector space.
  • 689 citations · 78 highly influential · PDF available
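This composition framework is standardly illustrated with simple additive and multiplicative operations over word vectors. Below is a minimal Python sketch of those two composition functions; the vectors and their values are hypothetical toy data, not taken from the paper.

```python
import numpy as np

def compose_additive(u, v):
    """Additive composition: the phrase vector is the
    dimension-wise sum of the two word vectors."""
    return u + v

def compose_multiplicative(u, v):
    """Multiplicative composition: the element-wise product,
    which keeps only dimensions active in both words."""
    return u * v

# Toy 4-dimensional co-occurrence vectors (hypothetical values).
practical = np.array([0.2, 1.3, 0.0, 0.7])
difficulty = np.array([1.1, 0.4, 0.9, 0.0])

print(compose_additive(practical, difficulty))        # [1.3 1.7 0.9 0.7]
print(compose_multiplicative(practical, difficulty))  # [0.22 0.52 0.   0.  ]
```

Note how the multiplicative model zeroes out any dimension inactive in either word, while the additive model simply pools evidence from both.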
Neural Summarization by Extracting Sentences and Words
TLDR: We develop a general framework for single-document summarization composed of a hierarchical document encoder and an attention-based extractor.
  • 486 citations · 63 highly influential · PDF available
Language to Logical Form with Neural Attention
TLDR: We present a general attention-enhanced encoder-decoder method for semantic parsing, built on recurrent neural networks with long short-term memory units.
  • 451 citations · 63 highly influential · PDF available
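As a rough illustration of the attention step inside such an encoder-decoder, here is a minimal sketch using dot-product scoring; the paper's exact scoring function and dimensions may differ, and all arrays here are hypothetical.

```python
import numpy as np

def attention_step(decoder_state, encoder_states):
    """One attention step: score each encoder state against the
    current decoder state, softmax-normalize the scores, and
    return the weighted context vector plus the weights."""
    scores = encoder_states @ decoder_state       # dot-product scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                      # softmax over input tokens
    context = weights @ encoder_states            # attention-weighted sum
    return context, weights

# Hypothetical encoder output for a 3-token input, hidden size 4.
rng = np.random.default_rng(0)
encoder_states = rng.normal(size=(3, 4))
decoder_state = rng.normal(size=4)

context, weights = attention_step(decoder_state, encoder_states)
print(weights.round(3))  # attention distribution over the 3 input tokens
```

The context vector is then fed to the decoder when predicting the next logical-form token, letting the model focus on the relevant parts of the input utterance.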
Text Summarization with Pretrained Encoders
TLDR: Bidirectional Encoder Representations from Transformers (BERT) represents the latest incarnation of pretrained language models which have recently advanced a wide range of natural language processing tasks.
  • 294 citations · 59 highly influential · PDF available
Don’t Give Me the Details, Just the Summary! Topic-Aware Convolutional Neural Networks for Extreme Summarization
TLDR: We introduce “extreme summarization”, a new single-document summarization task which does not favor extractive strategies and calls for an abstractive modeling approach.
  • 210 citations · 56 highly influential · PDF available
Long Short-Term Memory-Networks for Machine Reading
TLDR: We propose a machine reading simulator which processes text incrementally from left to right and performs shallow reasoning with memory and attention.
  • 610 citations · 55 highly influential · PDF available
Dependency-Based Construction of Semantic Space Models
TLDR: We introduce a formalization for dependency-based semantic space models which allows linguistic knowledge to guide the construction process.
  • 673 citations · 47 highly influential · PDF available
Ranking Sentences for Extractive Summarization with Reinforcement Learning
TLDR: We use our algorithm to train a neural summarization model on the CNN and DailyMail datasets and demonstrate experimentally that it outperforms state-of-the-art extractive and abstractive systems when evaluated automatically and by humans.
  • 258 citations · 45 highly influential · PDF available
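For a sense of how reinforcement learning enters such a ranker: the usual recipe is a REINFORCE-style gradient that weights the policy's log-likelihood gradient by a reward such as ROUGE. The sketch below assumes a simplified per-sentence Bernoulli extraction policy; it is an illustration of that recipe, not the paper's exact objective.

```python
import numpy as np

def reinforce_gradient(sentence_scores, sampled_labels, reward):
    """REINFORCE gradient for a per-sentence Bernoulli extraction
    policy: reward-weighted (sampled label - extraction probability).
    `reward` would come from, e.g., ROUGE against the reference summary."""
    probs = 1.0 / (1.0 + np.exp(-sentence_scores))  # sigmoid -> P(extract)
    # d log p(labels | scores) / d scores for a Bernoulli policy
    return reward * (sampled_labels - probs)

# Hypothetical example: 5 sentence scores, one sampled extract, its reward.
scores = np.array([1.2, -0.3, 0.8, -1.5, 0.1])
labels = (np.random.rand(5) < 1.0 / (1.0 + np.exp(-scores))).astype(float)
grad = reinforce_gradient(scores, labels, reward=0.42)
print(grad.round(3))
```

Because the reward applies to the whole sampled extract, sentences that co-occur in high-ROUGE extracts have their scores pushed up together.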