Statistical Language Models Based on Neural Networks

@phdthesis{Uen2012StatisticalLM,
  title={Statistical Language Models Based on Neural Networks},
  author={Tom{\'a}{\v s} Mikolov},
  school={Brno University of Technology},
  year={2012}
}
Statistical language models are a crucial part of many successful applications, such as automatic speech recognition and statistical machine translation (for example, the well-known Google Translate). Traditional techniques for estimating these models are based on n-gram counts. Despite the known weaknesses of n-grams and the huge efforts of research communities across many fields (speech recognition, machine translation, neuroscience, artificial intelligence, natural language processing, data compression…
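The count-based estimation the abstract contrasts with neural approaches can be illustrated with a minimal sketch. This is not code from the thesis; it is a toy bigram (n = 2) model with add-alpha smoothing on an invented corpus, showing both the counting step and perplexity, the standard intrinsic evaluation metric for language models.

```python
from collections import Counter
import math

# Toy corpus; real language models are estimated from millions of tokens.
corpus = "the cat sat on the mat the dog sat on the rug".split()

# Count unigrams and bigrams (the "n-gram counts" of traditional techniques).
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))

def bigram_prob(w_prev, w, alpha=1.0):
    """Add-alpha (Laplace) smoothed estimate of P(w | w_prev).

    Smoothing avoids assigning zero probability to unseen bigrams --
    data sparsity is one of the known weaknesses of n-gram models.
    """
    vocab_size = len(unigrams)
    return (bigrams[(w_prev, w)] + alpha) / (unigrams[w_prev] + alpha * vocab_size)

def perplexity(tokens):
    """exp of the average negative log-likelihood over the token sequence."""
    logp = sum(math.log(bigram_prob(p, w)) for p, w in zip(tokens, tokens[1:]))
    return math.exp(-logp / (len(tokens) - 1))

# "sat on" occurs twice, "sat" occurs twice, vocabulary size is 7:
# (2 + 1) / (2 + 7) = 1/3.
print(bigram_prob("sat", "on"))
print(perplexity("the cat sat on the mat".split()))
```

The thesis's contribution is to replace these fixed-length count tables with a recurrent neural network that conditions on an unbounded history through its hidden state, which is precisely what smoothed n-gram counts cannot do.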

