BottleSum: Unsupervised and Self-supervised Sentence Summarization using the Information Bottleneck Principle

@article{West2019BottleSumUA,
  title={BottleSum: Unsupervised and Self-supervised Sentence Summarization using the Information Bottleneck Principle},
  author={Peter West and Ari Holtzman and Jan Buys and Yejin Choi},
  journal={ArXiv},
  year={2019},
  volume={abs/1909.07405}
}
  • Peter West, Ari Holtzman, Jan Buys, Yejin Choi
  • Published 2019
  • Computer Science
  • ArXiv
  • Abstract: The principle of the Information Bottleneck (Tishby et al. 1999) is to produce a summary of information X optimized to predict some other relevant information Y. In this paper, we propose a novel approach to unsupervised sentence summarization by mapping the Information Bottleneck principle to a conditional language modelling objective: given a sentence, our approach seeks a compressed sentence that can best predict the next sentence. [...] Our iterative algorithm under the Information Bottleneck…
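For context on the abstract above, the Information Bottleneck objective it refers to can be stated as follows. This is the standard Tishby et al. (1999) formulation, not an equation quoted from the paper itself:

```latex
% Information Bottleneck: choose a stochastic compression p(\tilde{x} \mid x)
% of the source X that discards as much of X as possible (first term)
% while staying predictive of the relevant variable Y (second term).
% \beta > 0 trades off compression against relevance.
\min_{p(\tilde{x} \mid x)} \; I(\tilde{X}; X) \;-\; \beta \, I(\tilde{X}; Y)
```

In the mapping the abstract describes, X is the source sentence, X̃ a candidate compressed sentence, and Y the next sentence, with the relevance term estimated by a conditional language model.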
    10 Citations

    • Discrete Optimization for Unsupervised Sentence Summarization with Word-Level Extraction
    • Unsupervised Opinion Summarization as Copycat-Review Generation
    • The Summary Loop: Learning to Write Abstractive Summaries Without Examples
    • Self-Supervised and Controlled Multi-Document Opinion Summarization
    • An Information Bottleneck Approach for Controlling Conciseness in Rationale Extraction
    • Unsupervised Abstractive Dialogue Summarization for Tete-a-Tetes
    • Make Lead Bias in Your Favor: A Simple and Effective Method for News Summarization
    • How Domain Terminology Affects Meeting Summarization Performance
