Gibbs sampling for parsimonious Markov models with latent variables

Abstract

Parsimonious Markov models have recently been developed as a generalization of variable order Markov models. Many practical applications involve a setting with latent variables, a common example being mixture models. Here, we propose a Bayesian model averaging approach, based on Gibbs sampling, for learning mixtures of parsimonious Markov models. The main challenge is sampling one model structure out of a large number of candidates, which we solve with an efficient dynamic programming algorithm. We apply the resulting Gibbs sampling algorithm to splice site classification, an important problem from computational biology, and find the Bayesian approach to be superior to non-Bayesian classification.
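To illustrate the general idea of Gibbs sampling with latent component assignments, the sketch below alternates between sampling mixture parameters and sampling the latent assignment of each sequence for a mixture of fixed-order Markov chains. This is only a minimal illustration under simplifying assumptions (first-order chains, symmetric Dirichlet priors, equal-length sequences); it is not the authors' implementation, and in particular it omits the structure-sampling step for parsimonious context trees that the paper addresses with dynamic programming. All function and variable names here are hypothetical.

```python
# Minimal sketch (not the paper's implementation): Gibbs sampling for a
# mixture of first-order Markov chains over a finite alphabet, alternating
# between sampling parameters and sampling latent component assignments.
import numpy as np

def gibbs_mixture_markov(seqs, n_components, alphabet_size,
                         n_iter=200, alpha=1.0, seed=None):
    """seqs: integer array of shape (n_sequences, length), symbols in 0..alphabet_size-1."""
    rng = np.random.default_rng(seed)
    seqs = np.asarray(seqs)
    n = len(seqs)
    z = rng.integers(n_components, size=n)  # latent component assignments
    for _ in range(n_iter):
        # --- sample mixture weights and per-component Markov parameters
        #     from their Dirichlet posteriors given the current assignments
        weights = rng.dirichlet(alpha + np.bincount(z, minlength=n_components))
        start = np.empty((n_components, alphabet_size))
        trans = np.empty((n_components, alphabet_size, alphabet_size))
        for c in range(n_components):
            start_counts = np.full(alphabet_size, alpha)
            trans_counts = np.full((alphabet_size, alphabet_size), alpha)
            for s in seqs[z == c]:
                start_counts[s[0]] += 1
                for a, b in zip(s[:-1], s[1:]):
                    trans_counts[a, b] += 1
            start[c] = rng.dirichlet(start_counts)
            for a in range(alphabet_size):
                trans[c, a] = rng.dirichlet(trans_counts[a])
        # --- sample each latent assignment given the current parameters
        for i, s in enumerate(seqs):
            logp = np.log(weights) + np.log(start[:, s[0]])
            for a, b in zip(s[:-1], s[1:]):
                logp += np.log(trans[:, a, b])
            p = np.exp(logp - logp.max())
            z[i] = rng.choice(n_components, p=p / p.sum())
    return z, weights, start, trans

# Usage on toy data (hypothetical): 50 sequences of length 20 over a 4-letter alphabet
if __name__ == "__main__":
    toy = np.random.default_rng(0).integers(4, size=(50, 20))
    z, weights, start, trans = gibbs_mixture_markov(toy, n_components=2, alphabet_size=4)
    print(weights)
```

In the full model from the paper, the per-component Markov parameters would additionally be governed by a sampled parsimonious context-tree structure rather than a fixed order.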


Cite this paper

@inproceedings{Eggeling2012GibbsSF,
  title  = {Gibbs sampling for parsimonious Markov models with latent variables},
  author = {Ralf Eggeling and Pierre-Yves Bourguignon and Andr{\'e} Gohr and Ivo Grosse},
  year   = {2012}
}