Entropy-based Pruning of Backoff Language Models
Andreas Stolcke
A criterion for pruning parameters from N-gram backoff language models is developed, based on the relative entropy between the original and the pruned model. It is shown that the relative entropy resulting from pruning a single N-gram can be computed exactly and efficiently for backoff models. The relative entropy measure can be expressed as a relative change in training set perplexity. This leads to a simple pruning criterion whereby all N-grams that change perplexity by less than a…
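The abstract's pruning criterion can be illustrated with a minimal sketch. The toy bigram backoff model, its probabilities, the history marginal `p_h`, and the threshold `theta` below are all hypothetical; for clarity the relative entropy of pruning one bigram is computed here by brute-force summation over the vocabulary, whereas the paper derives an exact closed-form expression that avoids this sum.

```python
import math

# Toy vocabulary and unigram (backoff) distribution -- illustrative numbers only.
vocab = ["a", "b", "c", "d"]
unigram = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}

# Explicit bigram probabilities for history "a" (hypothetical estimates).
bigrams = {("a", "b"): 0.5, ("a", "c"): 0.3}

def backoff_weight(h, bigrams):
    """alpha(h) chosen so explicit probs plus alpha * leftover unigram mass sum to 1."""
    explicit = [w for (hh, w) in bigrams if hh == h]
    num = 1.0 - sum(bigrams[(h, w)] for w in explicit)
    den = 1.0 - sum(unigram[w] for w in explicit)
    return num / den

def cond_dist(h, bigrams):
    """Full conditional distribution p(.|h) of the backoff model."""
    alpha = backoff_weight(h, bigrams)
    return {w: bigrams.get((h, w), alpha * unigram[w]) for w in vocab}

def pruning_cost(h, w, p_h, bigrams):
    """Relative entropy p(h) * KL(p || p') incurred by pruning the single bigram (h, w)."""
    p = cond_dist(h, bigrams)
    pruned = {k: v for k, v in bigrams.items() if k != (h, w)}
    q = cond_dist(h, pruned)  # backoff weight is recomputed for the pruned model
    kl = sum(p[v] * math.log(p[v] / q[v]) for v in vocab)
    return p_h * kl

# Relative change in training-set perplexity is exp(D) - 1;
# prune every N-gram whose change falls below a threshold theta.
p_h = 0.4      # marginal probability of history "a" (assumed)
theta = 1e-2   # pruning threshold (assumed)
for (h, w) in list(bigrams):
    D = pruning_cost(h, w, p_h, bigrams)
    rel_ppl = math.exp(D) - 1.0
    verdict = "prune" if rel_ppl < theta else "keep"
    print(f"({h},{w}): D={D:.5f}, ppl change={rel_ppl:.5f} -> {verdict}")
```

The key point the paper makes is that D can be obtained without re-normalizing or re-evaluating the whole model per candidate: only the terms involving the pruned N-gram's history change, since pruning alters just that entry and the history's backoff weight.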

Semantic Scholar estimates that this publication has 379 citations based on the available data.
