Efficient estimation of maximum entropy language models with n-gram features: an SRILM extension

@inproceedings{Alume2010EfficientEO,
  title={Efficient estimation of maximum entropy language models with n-gram features: an SRILM extension},
  author={Tanel Alum{\"a}e and Mikko Kurimo},
  booktitle={INTERSPEECH},
  year={2010}
}
We present an extension to the SRILM toolkit for training maximum entropy language models with N-gram features. The extension uses a hierarchical parameter estimation procedure [1] to keep training time and memory consumption feasible for moderately large training data (hundreds of millions of words). Experiments on two speech recognition tasks indicate that the models trained with our implementation perform as well as or better than N-gram models built with interpolated Kneser-Ney…
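For orientation, the model class the abstract refers to is a conditional maximum entropy (log-linear) model over words given histories, P(w|h) = exp(sum_i lambda_i f_i(h, w)) / Z(h), with binary N-gram indicator features. The following is a minimal, illustrative Python sketch of that model form only; it is not the SRILM extension, the class and feature names are hypothetical, and the hierarchical estimation procedure [1] that makes training tractable is deliberately omitted.

    # Toy sketch of a maxent LM with binary n-gram indicator features.
    # Not the SRILM implementation; illustrates only the model form
    #   P(w | h) = exp(sum_i lambda_i * f_i(h, w)) / Z(h).
    import math
    from collections import defaultdict

    class ToyMaxEntLM:                       # hypothetical class name
        def __init__(self, vocab):
            self.vocab = list(vocab)
            self.weights = defaultdict(float)  # feature -> lambda_i

        def features(self, history, word):
            # Binary indicator features: unigram and bigram identity.
            feats = [("1g", word)]
            if history:
                feats.append(("2g", history[-1], word))
            return feats

        def score(self, history, word):
            # Linear score: sum of weights of active features.
            return sum(self.weights[f] for f in self.features(history, word))

        def prob(self, history, word):
            # Normalize over the full vocabulary: Z(h) = sum_w' exp(score(h, w')).
            z = sum(math.exp(self.score(history, w)) for w in self.vocab)
            return math.exp(self.score(history, word)) / z

    lm = ToyMaxEntLM(vocab=["the", "cat", "sat"])
    lm.weights[("2g", "the", "cat")] = 1.5   # hypothetical trained weight
    print(lm.prob(["the"], "cat"))           # ~0.69

The per-history normalization Z(h) is what becomes expensive at scale; as the abstract notes, the hierarchical estimation procedure [1] is what keeps training on hundreds of millions of words feasible in time and memory.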