Empirical study of neural network language models for Arabic speech recognition

@inproceedings{Emami2007EmpiricalSO,
  title={Empirical study of neural network language models for Arabic speech recognition},
  author={Ahmad Emami and Lidia Mangu},
  booktitle={2007 IEEE Workshop on Automatic Speech Recognition \& Understanding (ASRU)},
  year={2007},
  pages={147--152}
}
In this paper we investigate the use of neural network language models for Arabic speech recognition. By using a distributed representation of words, the neural network model allows for more robust generalization and is better able to fight the data sparseness problem. We investigate different configurations of the neural probabilistic model, experimenting with such parameters as N-gram order, output vocabulary, normalization method, and model size and parameters. Experiments were carried out…
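
For orientation, the sketch below illustrates the kind of model the abstract describes: a feed-forward N-gram neural network language model (Bengio-style) in which each word is mapped to a dense, distributed representation and the next word is predicted by a softmax over the output vocabulary. This is not the authors' implementation; the hyperparameters, class names, and toy data are illustrative assumptions only.

# Minimal sketch of a feed-forward N-gram neural probabilistic language model.
# Assumed/illustrative: all hyperparameters, the class name, and the toy inputs.
import torch
import torch.nn as nn

class FeedForwardNNLM(nn.Module):
    def __init__(self, vocab_size, context_size=3, embed_dim=64, hidden_dim=128):
        super().__init__()
        # Distributed representation: each word index maps to a dense vector.
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.hidden = nn.Linear(context_size * embed_dim, hidden_dim)
        # Output layer scores every word in the output vocabulary.
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, context_ids):
        # context_ids: (batch, context_size) indices of the N-1 history words.
        e = self.embed(context_ids).flatten(start_dim=1)
        h = torch.tanh(self.hidden(e))
        return self.out(h)  # unnormalized scores; softmax normalizes them

# Toy usage: predict the next word from a 3-word history.
vocab_size = 1000
model = FeedForwardNNLM(vocab_size)
context = torch.randint(0, vocab_size, (8, 3))        # batch of 8 histories
target = torch.randint(0, vocab_size, (8,))           # next-word targets
loss = nn.CrossEntropyLoss()(model(context), target)  # softmax + NLL
loss.backward()

The context size corresponds to the N-gram order minus one, and the softmax over the output vocabulary is one of the normalization choices the paper's experiments vary.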

Citations

Publications citing this paper (5 of 36 citations shown).

Syntactic features for Arabic speech recognition

  • 2009 IEEE Workshop on Automatic Speech Recognition & Understanding
  • 2009

Rich morphology based n-gram language models for Arabic


Gaussian Process LSTM Recurrent Neural Network Language Models for Speech Recognition

  • ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2019

Limited-Memory BFGS Optimization of Recurrent Neural Network Language Models for Speech Recognition

  • 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2018

Recurrent neural network language models for keyword search

  • 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2017