Martin Matysiak

In this paper we investigate different n-gram language models that are defined over an open lexicon. We introduce a character-level language model and combine it with a standard word-level language model in a backoff fashion. The character-level language model is redefined and renormalized to assign zero probability to words from a fixed vocabulary.
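As a rough sketch of the backoff idea described above (not the paper's implementation; the vocabularies, probability values, and toy character model below are illustrative assumptions), an in-vocabulary word is scored by the word-level model, while an out-of-vocabulary word receives the reserved backoff mass times its character-level probability:

```python
# Toy backoff combination of a word-level and a character-level LM.
# All probabilities here are made-up unigram values for illustration.

# Fixed word-level vocabulary; P_UNK is the mass reserved for
# out-of-vocabulary (OOV) words, routed to the character-level model.
WORD_LM = {"the": 0.5, "cat": 0.3}
P_UNK = 0.2

# Toy character-level unigram model over a tiny alphabet, with an
# explicit end-of-word symbol so word probabilities sum properly.
CHAR_LM = {"a": 0.2, "c": 0.2, "t": 0.2, "d": 0.2, "</w>": 0.2}

def p_char_word(word):
    """Character-level probability of a word: product over its
    characters, terminated by the end-of-word symbol."""
    p = 1.0
    for ch in word:
        p *= CHAR_LM.get(ch, 0.0)
    return p * CHAR_LM["</w>"]

def p_backoff(word):
    """Score in-vocabulary words with the word-level model; back off
    to P_UNK times the character-level probability otherwise."""
    if word in WORD_LM:
        return WORD_LM[word]
    return P_UNK * p_char_word(word)
```

In this sketch the character model still assigns mass to in-vocabulary words; the renormalization step the abstract mentions, which zeroes out in-vocabulary words in the character model, is what keeps the combined distribution summing to one.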