Corpus and Evaluation Measures for Multiple Document Summarization with Multiple Sources

@inproceedings{Hirao2004CorpusAE,
  title={Corpus and Evaluation Measures for Multiple Document Summarization with Multiple Sources},
  author={Tsutomu Hirao and Takahiro Fukusima and Manabu Okumura and Chikashi Nobata and Hidetsugu Nanba},
  booktitle={COLING},
  year={2004}
}
In this paper, we introduce a large-scale test collection for multiple document summarization, the Text Summarization Challenge 3 (TSC3) corpus. We detail the corpus construction and evaluation measures. The significant feature of the corpus is that it annotates not only the important sentences in a document set, but also those among them that have the same content. Moreover, we define new evaluation metrics taking redundancy into account and discuss the effectiveness of redundancy minimization.
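The abstract does not spell out the metrics, but the core idea, that an extracted sentence repeating the content of an already-selected sentence should earn no additional credit, can be sketched as below. Everything in this sketch (the function name, the data layout, the precision-style score) is an illustrative assumption, not the TSC3 definitions.

    # Hypothetical illustration only (not the TSC3 metrics): a precision-style
    # score in which important sentences whose content duplicates an already
    # credited sentence contribute nothing extra.

    def redundancy_aware_precision(extracted, important, content_group):
        """
        extracted     : list of sentence ids produced by a summarizer
        important     : set of sentence ids annotated as important
        content_group : dict mapping a sentence id to its content-equivalence
                        group id (sentences with the same content share a group)
        """
        credited_groups = set()
        credit = 0
        for sid in extracted:
            if sid not in important:
                continue                      # not an important sentence: no credit
            group = content_group.get(sid, sid)
            if group in credited_groups:
                continue                      # same content already covered: redundant
            credited_groups.add(group)
            credit += 1
        return credit / len(extracted) if extracted else 0.0

    if __name__ == "__main__":
        # Toy example: s1 and s3 carry the same content (group g1); s4 is unimportant.
        important = {"s1", "s2", "s3"}
        groups = {"s1": "g1", "s3": "g1", "s2": "g2"}
        print(redundancy_aware_precision(["s1", "s3", "s2", "s4"], important, groups))
        # -> 0.5 : only s1 and s2 count, so 2 of 4 extracted sentences earn credit.

The design point the sketch tries to capture is the one stated in the abstract: because the corpus groups important sentences that share content, an evaluation measure can discount redundant extractions rather than rewarding them.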


