Corpus ID: 236428919

Inference of collective Gaussian hidden Markov models

Rahul Singh, Yongxin Chen
We consider inference problems for a class of continuous-state collective hidden Markov models, where the data are recorded in aggregate (collective) form, generated by a large population of individuals following the same dynamics. We propose an aggregate inference algorithm, the collective Gaussian forward-backward algorithm, extending the recently proposed Sinkhorn belief propagation algorithm to models characterized by Gaussian densities. Our algorithm enjoys a convergence guarantee. In addition…
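For context, the classical single-chain Gaussian forward-backward recursions that the collective algorithm generalizes are the Kalman filter (forward pass) and RTS smoother (backward pass). A minimal sketch for one linear-Gaussian chain follows; it is illustrative only, not the paper's collective/aggregate variant, and all function and variable names are ours:

```python
import numpy as np

def forward_backward_gaussian(A, C, Q, R, mu0, P0, ys):
    """Forward (Kalman filter) and backward (RTS smoother) passes for
        x_t = A x_{t-1} + w_t,  w_t ~ N(0, Q)
        y_t = C x_t + v_t,      v_t ~ N(0, R)
    Returns smoothed means and covariances for all time steps."""
    n, d = len(ys), mu0.shape[0]
    mus_f = np.zeros((n, d))
    Ps_f = np.zeros((n, d, d))
    mu_pred, P_pred = mu0, P0
    for t, y in enumerate(ys):
        # measurement update: fold observation y_t into the prediction
        S = C @ P_pred @ C.T + R
        K = P_pred @ C.T @ np.linalg.inv(S)
        mu = mu_pred + K @ (y - C @ mu_pred)
        P = (np.eye(d) - K @ C) @ P_pred
        mus_f[t], Ps_f[t] = mu, P
        # time update: predict one step ahead
        mu_pred = A @ mu
        P_pred = A @ P @ A.T + Q
    # backward (RTS) pass: refine filtered estimates with future data
    mus_s, Ps_s = mus_f.copy(), Ps_f.copy()
    for t in range(n - 2, -1, -1):
        P_pred = A @ Ps_f[t] @ A.T + Q
        G = Ps_f[t] @ A.T @ np.linalg.inv(P_pred)
        mus_s[t] = mus_f[t] + G @ (mus_s[t + 1] - A @ mus_f[t])
        Ps_s[t] = Ps_f[t] + G @ (Ps_s[t + 1] - P_pred) @ G.T
    return mus_s, Ps_s
```

The collective setting differs in that only aggregate counts over the population are observed, but the Gaussian message-passing structure above is the single-individual special case.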


Incremental Inference of Collective Graphical Models
A sliding-window Sinkhorn belief propagation (SW-SBP) algorithm is proposed that utilizes a sliding-window filter of the most recent noisy aggregate observations, along with encoded information from discarded observations, to solve inference problems from aggregate data.
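At the core of the Sinkhorn-based methods above is the classical Sinkhorn iteration: alternately rescaling the rows and columns of a positive kernel matrix until it matches prescribed marginals. A generic sketch of that primitive (not the SBP or SW-SBP algorithms themselves; names are illustrative):

```python
import numpy as np

def sinkhorn(K, r, c, iters=200):
    """Scale a positive kernel matrix K so that the coupling
    P = diag(u) @ K @ diag(v) has row sums r and column sums c."""
    u = np.ones_like(r)
    v = np.ones_like(c)
    for _ in range(iters):
        u = r / (K @ v)       # fix row marginals
        v = c / (K.T @ u)     # fix column marginals
    return u[:, None] * K * v[None, :]
```

In SBP-style algorithms, iterations of this form are interleaved with belief-propagation message updates so that inferred marginals agree with the observed aggregates.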
Estimating ensemble flows on a hidden Markov chain
A new framework is proposed to estimate the evolution of an ensemble of indistinguishable agents on a hidden Markov chain using only aggregate output data; the formulation is convex in the infinite-particle limit.
Message Passing for Collective Graphical Models
A novel belief-propagation-style algorithm is derived for collective graphical models; it can be viewed as minimizing the Bethe free energy plus additional energy terms that are non-linear functions of the marginals.
Collective Graphical Models
A highly efficient Gibbs sampling algorithm is derived for sampling from the posterior distribution of the sufficient statistics conditioned on noisy aggregate observations; its correctness is proved and its effectiveness is demonstrated experimentally.
Approximate Inference in Collective Graphical Models
A tractable convex approximation to the NP-hard MAP inference problem in CGMs is developed, and it is demonstrated empirically that these approximation techniques can reduce the computational cost of inference and of learning by at least an order of magnitude while providing solutions of equal or better quality.
An Iterative Ensemble Kalman Filter
An essential stopping criterion is introduced for the proposed iterative extension of the ensemble Kalman filter, improving the estimates in cases where the relationship between the model and the observations is nonlinear.
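The basic (non-iterative) analysis step that such ensemble Kalman filter variants build on can be sketched as follows: each ensemble member is nudged toward a perturbed observation using a Kalman gain computed from the ensemble's sample covariance. This is a generic stochastic-EnKF sketch, not the paper's iterative scheme, and all names are illustrative:

```python
import numpy as np

def enkf_update(ensemble, y, H, R, rng):
    """One stochastic EnKF analysis step.
    ensemble: (N, d) array of state samples; y: (m,) observation;
    H: (m, d) observation operator; R: (m, m) observation noise cov."""
    N, d = ensemble.shape
    X = ensemble - ensemble.mean(axis=0)
    P = X.T @ X / (N - 1)                     # sample state covariance
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # sample Kalman gain
    # perturb the observation independently for each member
    perturbed = y + rng.multivariate_normal(np.zeros(len(y)), R, size=N)
    return ensemble + (perturbed - ensemble @ H.T) @ K.T
```

Iterative variants repeat an update of this flavor on the same observation window, which is why a principled stopping criterion matters when the model-observation relationship is nonlinear.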
Graphical Models, Exponential Families, and Variational Inference
The variational approach provides a complementary alternative to Markov chain Monte Carlo as a general source of approximation methods for inference in large-scale statistical models.
On optimal ℓ∞ to ℓ∞ filtering
Taking a model-matching approach, suboptimal solutions are presented that stem from the resulting ℓ∞-induced norm-minimization problem.
Probabilistic reasoning in intelligent systems - networks of plausible inference
  • J. Pearl
  • Computer Science
  • Morgan Kaufmann series in representation and reasoning
  • 1989
The author provides a coherent explication of probability as a language for reasoning with partial belief and offers a unifying perspective on other AI approaches to uncertainty, such as the Dempster-Shafer formalism, truth maintenance systems, and nonmonotonic logic.
A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition
This tutorial reviews the theory of hidden Markov models, including the forward-backward, Viterbi, and Baum-Welch algorithms, and describes their application to selected problems in speech recognition.