Scaling recurrent neural network language models

@article{Williams2015ScalingRN,
  title={Scaling recurrent neural network language models},
  author={Will Williams and N. Prasad and D. Mrva and T. Ash and T. Robinson},
  journal={2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)},
  year={2015},
  pages={5391-5395}
}
This paper investigates the scaling properties of recurrent neural network language models (RNNLMs). [...] We also present the new lowest perplexities on the recently released billion-word language modelling benchmark, a 1 BLEU point gain on machine translation, and a 17% relative hit-rate gain in word prediction.
