Publications
The Summary Loop: Learning to Write Abstractive Summaries Without Examples
TLDR
This work introduces a novel method that encourages the inclusion of key terms from the original document in the summary, attains higher levels of abstraction (copied passages are roughly half the length of prior work), and learns to compress and merge sentences without supervision.
What’s The Latest? A Question-driven News Chatbot
TLDR
The algorithmic framework for an automatic news chatbot is described, and the results of a usability study are presented, showing that news readers using the system successfully engage in multi-turn conversations about specific news stories.
newsLens: building and visualizing long-ranging news stories
We propose a method to aggregate and organize a large, multi-source dataset of news articles into a collection of major stories, and to automatically name and visualize these stories in a working …
News Headline Grouping as a Challenging NLU Task
TLDR
A novel unsupervised Headline Generator Swap model is proposed for the task of Headline Grouping, achieving within 3 F-1 points of the best supervised model; high-performing models are further analyzed with consistency tests, which find that the models are not consistent in their predictions, revealing modeling limits of current architectures.
Can Transformer Models Measure Coherence In Text? Re-Thinking the Shuffle Test
TLDR
It is suggested that the Shuffle Test should be approached in a Zero-Shot setting: models should be evaluated without being trained on the task itself, and larger architectures achieve high performance out-of-the-box.
A framework for a text-centric user interface for navigating complex news stories
Many news articles are part of larger news stories that unfold over a period of time. Detecting these news stories and presenting them to news readers is appealing, as it allows the reader to access …
Abstractive News Summarization via Copying and Transforming
Philippe Laban, John Canny, and Marti Hearst (UC Berkeley)
From Brooklyn Barbers To Movie Stars: Using Introductions To Construct Embeddings Of People
People are central to narrative texts such as news; a given news story can contain mentions of hundreds of distinct individuals, some famous and some not. We introduce a novel method for representing …
Keep it Simple: Unsupervised Simplification of Multi-Paragraph Text
TLDR
A new approach to unsupervised text simplification learns to balance a reward across three properties: fluency, salience, and simplicity; the resulting simplifications help people complete a comprehension task an average of 18% faster while retaining accuracy.