Corpus ID: 5820361

An Infinite Hidden Markov Model With Similarity-Biased Transitions

  • C. Dawson, Chaofan Huang, C. Morrison
We describe a generalization of the Hierarchical Dirichlet Process Hidden Markov Model (HDP-HMM) which is able to encode prior information that state transitions are more likely between "nearby" states. This is accomplished by defining a similarity function on the state space and scaling transition probabilities by pair-wise similarities, thereby inducing correlations among the transition distributions. We present an augmented data representation of the model as a Markov Jump Process in which…
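The core mechanism described in the abstract, scaling each base transition probability by a pairwise state similarity and renormalizing, can be illustrated with a minimal sketch. The function name, the exponential-distance similarity, and the uniform base transitions below are all illustrative assumptions, not the paper's actual construction (which works with the infinite HDP-HMM):

```python
import numpy as np

def similarity_biased_transitions(base_probs, similarity):
    """Scale each row of a base transition matrix by pairwise state
    similarities s(i, j), then renormalize rows (hypothetical sketch)."""
    scaled = base_probs * similarity  # elementwise: pi_ij * s(i, j)
    return scaled / scaled.sum(axis=1, keepdims=True)

# Example: 3 states embedded on a line; similarity decays with distance,
# so transitions to "nearby" states get boosted relative to distant ones.
K = 3
locations = np.arange(K)
sim = np.exp(-np.abs(locations[:, None] - locations[None, :]))
base = np.full((K, K), 1.0 / K)  # uniform base transitions
P = similarity_biased_transitions(base, sim)
```

After biasing, each row of `P` still sums to one, but self- and near-neighbor transitions carry more mass than distant ones, which is the correlation structure the prior is meant to induce.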


An HDP-HMM for systems with state persistence
A sampling algorithm is developed that employs a truncated approximation of the DP to jointly resample the full state sequence, greatly improving mixing rates and demonstrating both the advantages of the sticky extension and the utility of the HDP-HMM in real-world applications.
The Infinite Hidden Markov Model
We show that it is possible to extend hidden Markov models to have a countably infinite number of hidden states. By using the theory of Dirichlet processes we can implicitly integrate out the…
Factorial Hidden Markov Models
A generalization of HMMs in which the state is factored into multiple state variables and is therefore represented in a distributed manner, and a structured approximation in which the state variables are decoupled, yielding a tractable algorithm for learning the parameters of the model.
Hidden Markov models with discrete infinite logistic normal distribution priors
  • Hao Zhu, Jinsong Hu, H. Leung
  • Computer Science, Mathematics
  • 2016 19th International Conference on Information Fusion (FUSION)
  • 2016
A discrete infinite logistic normal distribution (DILN) to estimate the number of states in a hidden Markov model (HMM), and a variational Bayesian framework is proposed to infer the posterior distribution of the parameters of DILN-HMM.
The Discrete Infinite Logistic Normal Distribution
We present the discrete infinite logistic normal distribution (DILN), a Bayesian nonparametric prior for mixed membership models. DILN generalizes the hierarchical Dirichlet process (HDP) to model…
Infinite latent feature models and the Indian buffet process
We define a probability distribution over equivalence classes of binary matrices with a finite number of rows and an unbounded number of columns. This distribution is suitable for use as a prior in…
Infinite Factorial Dynamical Model
The experimental results show that the iFDM approach to source separation not only outperforms previous approaches, but can also handle problems that were computationally intractable for existing methods.
Beam sampling for the infinite hidden Markov model
This paper introduces a new inference algorithm for the infinite hidden Markov model, called beam sampling, which typically outperforms the Gibbs sampler and is more robust.
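The key idea behind beam sampling is a slice trick: auxiliary variables u_t truncate the (potentially infinite) set of transitions to those with probability above u_t, so the state sequence can be jointly resampled by forward filtering and backward sampling over a finite set. The sketch below shows this trick on a *finite* transition matrix for illustration; the function name, the uniform initial distribution, and the finite-K setting are assumptions, not the paper's full algorithm:

```python
import numpy as np

def beam_resample(pi, obs_loglik, z, rng):
    """One beam-sampling sweep over a state sequence (hypothetical sketch).

    pi: (K, K) row-stochastic transition matrix
    obs_loglik: (T, K) per-step observation log-likelihoods
    z: current length-T state sequence (ints in [0, K))
    """
    T, K = obs_loglik.shape
    # 1. Slice variables under the current trajectory: u_t ~ U(0, pi[z_{t-1}, z_t]).
    u = np.array([rng.uniform(0, pi[z[t - 1], z[t]]) for t in range(1, T)])
    # 2. Forward filtering, restricted to transitions with pi > u_t.
    alpha = np.exp(obs_loglik[0]) / K  # assumed uniform initial distribution
    alphas = [alpha / alpha.sum()]
    for t in range(1, T):
        allowed = pi > u[t - 1]  # (K, K) boolean truncation mask
        alpha = (alphas[-1] @ (pi * allowed)) * np.exp(obs_loglik[t])
        alphas.append(alpha / alpha.sum())
    # 3. Backward sampling of a new joint state sequence.
    z_new = np.empty(T, dtype=int)
    z_new[-1] = rng.choice(K, p=alphas[-1])
    for t in range(T - 2, -1, -1):
        w = alphas[t] * pi[:, z_new[t + 1]] * (pi[:, z_new[t + 1]] > u[t])
        z_new[t] = rng.choice(K, p=w / w.sum())
    return z_new
```

In the infinite model the same mask makes the dynamic program tractable: only finitely many transitions exceed each slice threshold, so no fixed truncation level has to be chosen in advance.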
Hierarchical Dirichlet Processes
We consider problems involving groups of data where each observation within a group is a draw from a mixture model and where it is desirable to share mixture components between groups. We assume that…
The Infinite Factorial Hidden Markov Model
After constructing an inference scheme which combines slice sampling and dynamic programming, it is demonstrated how the infinite factorial hidden Markov model can be used for blind source separation.