Context dependent recurrent neural network language model

@inproceedings{Mikolov2012ContextDR,
  title={Context dependent recurrent neural network language model},
  author={Tomas Mikolov and Geoffrey Zweig},
  booktitle={2012 IEEE Spoken Language Technology Workshop (SLT)},
  year={2012},
  pages={234--239}
}
Recurrent neural network language models (RNNLMs) have recently demonstrated state-of-the-art performance across a variety of tasks. In this paper, we improve their performance by providing a contextual real-valued input vector in association with each word. This vector is used to convey contextual information about the sentence being modeled. By performing Latent Dirichlet Allocation using a block of preceding text, we achieve a topic-conditioned RNNLM. This approach has the key advantage of…
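The topic-conditioning idea in the abstract can be sketched roughly as follows. This is an illustrative reconstruction using scikit-learn, not the authors' implementation: the toy corpus, the number of topics, and the way the topic vector is concatenated with the word input are all assumptions here.

```python
# Hypothetical sketch of the paper's context mechanism: an LDA topic
# distribution computed over a block of preceding text serves as an extra
# real-valued input to the language model at each word.
# All names and parameters are illustrative, not from the paper's code.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Tiny stand-in corpus used only to fit the topic model.
corpus = [
    "the cat sat on the mat",
    "stocks fell as markets reacted to the news",
    "the dog chased the cat around the yard",
    "investors sold shares after the earnings report",
]

vectorizer = CountVectorizer()
counts = vectorizer.fit_transform(corpus)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(counts)

def context_vector(preceding_text: str) -> np.ndarray:
    """Topic proportions of the preceding block of text."""
    return lda.transform(vectorizer.transform([preceding_text]))[0]

ctx = context_vector("the cat and the dog played")
# ctx is a length-2 probability vector; in the paper's model a vector like
# this would be fed in alongside the word input at every time step.
word_embedding = np.zeros(10)          # placeholder word input
rnn_input = np.concatenate([word_embedding, ctx])
print(rnn_input.shape)  # (12,)
```

The point of the sketch is only the data flow: the topic distribution is recomputed from the preceding block and appended to each word's input, which is what makes the RNNLM topic-conditioned.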

Citations

Publications citing this paper.
Showing 1–10 of 276 citations (estimated 39% coverage).

Transformer-XL: Language Modeling

  • 2018
  • Cites methods, background & results; highly influenced

A Generalized Language Model in Tensor Space

  • AAAI 2019
  • 2019
  • Cites methods; highly influenced

Better Long-Range Dependency By Bootstrapping A Mutual Information Regularizer

  • Cites methods & background; highly influenced

Low-Rank RNN Adaptation for Context-Aware Language Modeling

  • Transactions of the Association for Computational Linguistics
  • 2018
  • Cites methods & background; highly influenced

Markov Recurrent Neural Networks

  • 2018 IEEE 28th International Workshop on Machine Learning for Signal Processing (MLSP)
  • 2018
  • Cites methods & results; highly influenced

Modeling Non-Linguistic Contextual Signals in LSTM Language Models Via Domain Adaptation

  • 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2018
  • Cites methods & background; highly influenced

Personalized Language Model for Query Auto-Completion

  • Cites methods; highly influenced

Dialog context language modeling with recurrent neural networks

  • 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2017
  • Cites background & methods; highly influenced

Effective keyword search for low-resourced conversational speech

  • 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2017
  • Cites methods & background; highly influenced

Citation Statistics

  • 35 highly influenced citations
  • Averaged 60 citations per year over the last 3 years

References

Publications referenced by this paper.
Showing 1–10 of 38 references.

Improving Arabic broadcast transcription using automatic topic clustering

  • 2012 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2012

Measuring the influence of long range dependencies with neural network language models

Le Hai Son, Alexandre Allauzen, Francois Yvon
  • Proceedings of the Workshop on the Future of Language Modeling for HLT (NAACL/HLT)
  • 2012

Extensions of recurrent neural network language model

  • 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2011
