Extractive Opinion Summarization in Quantized Transformer Spaces
@article{Angelidis2020ExtractiveOS,
  title   = {Extractive Opinion Summarization in Quantized Transformer Spaces},
  author  = {Stefanos Angelidis and Reinald Kim Amplayo and Yoshihiko Suhara and Xiaolan Wang and Mirella Lapata},
  journal = {ArXiv},
  year    = {2020},
  volume  = {abs/2012.04443}
}
We present the Quantized Transformer (QT), an unsupervised system for extractive opinion summarization. QT is inspired by Vector-Quantized Variational Autoencoders, which we repurpose for popularity-driven summarization. It uses a clustering interpretation of the quantized space and a novel extraction algorithm to discover popular opinions among hundreds of reviews, a significant step towards opinion summarization of practical scope. In addition, QT enables controllable summarization without…
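The abstract's "clustering interpretation of the quantized space" rests on the core vector-quantization step from VQ-VAE: each sentence embedding is snapped to its nearest entry in a learned codebook, so sentences sharing a codebook entry form a cluster, and large clusters signal popular opinions. The sketch below illustrates only that hard-assignment step with a toy NumPy codebook; the function name `quantize` and the toy data are illustrative assumptions, not the authors' QT implementation.

```python
import numpy as np

def quantize(embeddings, codebook):
    """Map each embedding to its nearest codebook vector (hard VQ step).

    Illustrative sketch of the vector-quantization idea behind VQ-VAE,
    not the QT system itself: embeddings assigned to the same codebook
    entry form a cluster, and cluster size can serve as a popularity signal.
    """
    # Squared Euclidean distance from every embedding to every code vector.
    dists = ((embeddings[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    codes = dists.argmin(axis=1)  # index of the nearest code per embedding
    return codes, codebook[codes]

# Toy example: five 2-D "sentence embeddings", two codebook entries.
codebook = np.array([[0.0, 0.0], [10.0, 10.0]])
emb = np.array([[0.1, -0.2], [9.8, 10.1], [0.3, 0.1], [10.2, 9.7], [0.0, 0.4]])
codes, quantized = quantize(emb, codebook)

# Cluster sizes as a crude popularity signal: entry 0 attracts 3 sentences.
counts = np.bincount(codes, minlength=len(codebook))
```

Under this reading, an extractive summarizer could rank codebook clusters by `counts` and pick representative sentences from the largest ones; the paper's actual extraction algorithm is more involved.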
References
Showing 1-10 of 46 references.
- Unsupervised Aspect-Based Multi-Document Abstractive Summarization. EMNLP, 2019.
- MeanSum: A Neural Model for Unsupervised Multi-Document Abstractive Summarization. ICML, 2019.
- Opinions Summarization: Aspect Similarity Recognition Relaxes The Constraint of Predefined Aspects. RANLP, 2019.
- Unsupervised Opinion Summarization as Copycat-Review Generation. ACL, 2020.
- Ranking Sentences for Extractive Summarization with Reinforcement Learning. NAACL-HLT, 2018.
- Aspect and Opinion Aware Abstractive Review Summarization with Reinforced Hard Typed Decoder. CIKM, 2019.