Scaling recurrent neural network language models

@article{Williams2015ScalingRN,
  title={Scaling recurrent neural network language models},
  author={Will Williams and N. Prasad and D. Mrva and T. Ash and T. Robinson},
  journal={2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)},
  year={2015},
  pages={5391-5395}
}
This paper investigates the scaling properties of Recurrent Neural Network Language Models (RNNLMs). [...] We also present the new lowest perplexities on the recently released billion word language modelling benchmark, a 1 BLEU point gain on machine translation, and a 17% relative hit rate gain in word prediction.
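
The key result above is reported in perplexity. As a rough illustration of what is being measured, the sketch below shows one step of an Elman-style RNN language model and the perplexity of a word sequence under it. This is not the paper's implementation; the layer names (W, U, V), sizes, and random weights are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (illustrative only): one step of an Elman-style RNN
# language model and perplexity over a word-ID sequence.
rng = np.random.default_rng(0)
vocab_size, hidden_size = 1000, 128
W = rng.normal(scale=0.1, size=(hidden_size, vocab_size))   # input -> hidden
U = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden
V = rng.normal(scale=0.1, size=(vocab_size, hidden_size))   # hidden -> output

def step(word_id, h_prev):
    """Advance the RNN one word; return (next-word distribution, new hidden state)."""
    x = np.zeros(vocab_size)
    x[word_id] = 1.0                         # one-hot encoding of the current word
    h = np.tanh(W @ x + U @ h_prev)          # recurrent hidden state
    logits = V @ h
    probs = np.exp(logits - logits.max())    # numerically stable softmax
    return probs / probs.sum(), h

def perplexity(word_ids):
    """Perplexity = exp of the average negative log-probability per predicted word."""
    h = np.zeros(hidden_size)
    nll = 0.0
    for prev, nxt in zip(word_ids[:-1], word_ids[1:]):
        probs, h = step(prev, h)
        nll -= np.log(probs[nxt])
    return np.exp(nll / (len(word_ids) - 1))

# With untrained random weights, perplexity is close to the vocabulary size.
print(perplexity([3, 17, 42, 7, 99]))
```

Lower perplexity means the model assigns higher probability to the held-out text, which is the sense in which the paper reports "lowest perplexities" on the billion word benchmark.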