Language model size reduction by pruning and clustering

@inproceedings{Goodman2000LanguageMS,
  title={Language model size reduction by pruning and clustering},
  author={Joshua Goodman and Jianfeng Gao},
  booktitle={INTERSPEECH},
  year={2000}
}
Several techniques are known for reducing the size of language models, including count cutoffs [1], Weighted Difference pruning [2], Stolcke pruning [3], and clustering [4]. We compare all of these techniques and show some surprising results. For instance, at low pruning thresholds, Weighted Difference and Stolcke pruning underperform count cutoffs. We then introduce novel clustering techniques that can be combined with Stolcke pruning to produce the smallest models at a given perplexity. The…
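As an illustrative sketch only (not code from the paper), the simplest of the techniques compared above, a count cutoff, drops every n-gram whose observed count falls at or below a threshold before probabilities are estimated. The corpus counts below are invented for the example.

```python
from collections import Counter

def count_cutoff_prune(ngram_counts, cutoff):
    """Drop every n-gram whose count is at or below the cutoff.

    ngram_counts: mapping from n-gram tuple to observed count.
    Returns a new Counter containing only the surviving n-grams.
    """
    return Counter({ng: c for ng, c in ngram_counts.items() if c > cutoff})

# Tiny made-up bigram table for illustration.
counts = Counter({
    ("the", "cat"): 5,
    ("cat", "sat"): 3,
    ("sat", "on"): 1,   # removed at cutoff 1
    ("on", "the"): 1,   # removed at cutoff 1
})
pruned = count_cutoff_prune(counts, cutoff=1)
```

Weighted Difference and Stolcke pruning instead score each n-gram by (roughly) how much the model's predictions change when it is removed and backed off to a lower-order estimate, which is why their behavior at low thresholds relative to cutoffs is a meaningful comparison.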

Citations per Year (chart): Semantic Scholar estimates that this publication has 58 citations based on the available data.
