Context dependent recurrent neural network language model

@article{Mikolov2012ContextDR,
  title={Context dependent recurrent neural network language model},
  author={Tomas Mikolov and Geoffrey Zweig},
  journal={2012 IEEE Spoken Language Technology Workshop (SLT)},
  year={2012},
  pages={234--239}
}
Recurrent neural network language models (RNNLMs) have recently demonstrated state-of-the-art performance across a variety of tasks. In this paper, we improve their performance by providing a contextual real-valued input vector in association with each word. This vector is used to convey contextual information about the sentence being modeled. By performing Latent Dirichlet Allocation using a block of preceding text, we achieve a topic-conditioned RNNLM. This approach has the key advantage of…
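The mechanism the abstract describes, appending a per-sentence real-valued context vector (here, an LDA topic posterior) to each word's input before the recurrent update, can be sketched as follows. This is a minimal illustrative Elman-style RNN in NumPy, not the paper's implementation; all dimensions and parameter names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not taken from the paper)
vocab_size, embed_dim, topic_dim, hidden_dim = 50, 16, 8, 32

# Parameters of a minimal Elman RNN whose input at each step is the
# word embedding concatenated with a fixed per-sentence topic vector.
E = rng.normal(0, 0.1, (vocab_size, embed_dim))              # word embeddings
W_in = rng.normal(0, 0.1, (embed_dim + topic_dim, hidden_dim))
W_h = rng.normal(0, 0.1, (hidden_dim, hidden_dim))
W_out = rng.normal(0, 0.1, (hidden_dim, vocab_size))

def softmax(x):
    z = np.exp(x - x.max())
    return z / z.sum()

def rnnlm_step(word_id, topic_vec, h):
    """One step of a topic-conditioned RNNLM: the context vector is
    appended to the word embedding before the recurrent update."""
    x = np.concatenate([E[word_id], topic_vec])
    h = np.tanh(x @ W_in + h @ W_h)
    p = softmax(h @ W_out)          # distribution over the next word
    return p, h

# Usage: run a short word-id sequence under one fixed topic vector.
# A Dirichlet sample stands in for the LDA posterior over topics.
topic = rng.dirichlet(np.ones(topic_dim))
h = np.zeros(hidden_dim)
for w in [3, 17, 42]:
    p, h = rnnlm_step(w, topic, h)
```

In training, the topic vector would be recomputed from a block of preceding text and held fixed while the RNN parameters are updated by backpropagation through time; the sketch above only shows the forward conditioning.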
Highly Influential: this paper has highly influenced 31 other papers.
Highly Cited: this paper has 351 citations.

Citations

Publications citing this paper.
Showing a selection of 247 extracted citations

Transformer-XL: Language Modeling

2018

Markov Recurrent Neural Networks

2018 IEEE 28th International Workshop on Machine Learning for Signal Processing (MLSP) • 2018

Modeling Non-Linguistic Contextual Signals in LSTM Language Models Via Domain Adaptation

2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) • 2018

Dialog context language modeling with recurrent neural networks

2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) • 2017

Effective keyword search for low-resourced conversational speech

2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) • 2017

Improving Context Aware Language Models


352 Citations

[Citations-per-year chart, 2014-2018.] Semantic Scholar estimates that this publication has 352 citations based on the available data.

References

Publications referenced by this paper.
Showing 1-10 of 44 references

Latent Dirichlet Allocation


Statistical Language Models Based on Neural Networks

Brno University of Technology, Department of Computer Graphics and Multimedia, doctoral dissertation
2012

Discriminative Language Modeling With Linguistic and Statistically Derived Features

IEEE Transactions on Audio, Speech, and Language Processing • 2012

Improving arabic broadcast transcription using automatic topic clustering

2012 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) • 2012

Measuring the influence of long range dependencies with neural network language models

Le Hai Son, Alexandre Allauzen, Francois Yvon
Proceedings of the Workshop on the Future of Language Modeling for HLT (NAACL/HLT) • 2012

Extensions of recurrent neural network language model

2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) • 2011
