Corpus ID: 236428919

Inference of collective Gaussian hidden Markov models

@article{Singh2021InferenceOC,
  title={Inference of collective Gaussian hidden Markov models},
  author={Rahul Singh and Yongxin Chen},
  journal={ArXiv},
  year={2021},
  volume={abs/2107.11662}
}
We consider inference problems for a class of continuous state collective hidden Markov models, where the data is recorded in aggregate (collective) form generated by a large population of individuals following the same dynamics. We propose an aggregate inference algorithm called the collective Gaussian forward-backward algorithm, extending the recently proposed Sinkhorn belief propagation algorithm to models characterized by Gaussian densities. Our algorithm enjoys a convergence guarantee. In addition…
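For background, the classical single-agent counterpart of the Gaussian forward-backward recursion is the Kalman filter (forward pass) combined with the Rauch-Tung-Striebel smoother (backward pass). The sketch below illustrates only that building block; it does not reproduce the collective algorithm or its Sinkhorn iterations, and the model matrices and toy data are illustrative assumptions, not taken from the paper.

```python
# Minimal single-agent Gaussian forward-backward (Kalman filter + RTS smoother).
# Illustrative linear-Gaussian model (not from the paper):
#   x_{t+1} = A x_t + w_t,  w_t ~ N(0, Q)
#   y_t     = C x_t + v_t,  v_t ~ N(0, R)
import numpy as np

def kalman_filter(y, A, C, Q, R, m0, P0):
    """Forward pass: filtered means and covariances for each time step."""
    T, d = len(y), A.shape[0]
    means, covs = np.zeros((T, d)), np.zeros((T, d, d))
    m, P = m0, P0
    for t in range(T):
        if t > 0:                              # predict
            m = A @ m
            P = A @ P @ A.T + Q
        S = C @ P @ C.T + R                    # innovation covariance
        K = P @ C.T @ np.linalg.inv(S)         # Kalman gain
        m = m + K @ (y[t] - C @ m)             # measurement update
        P = P - K @ C @ P
        means[t], covs[t] = m, P
    return means, covs

def rts_smoother(means, covs, A, Q):
    """Backward pass: Rauch-Tung-Striebel smoothed estimates."""
    T = len(means)
    sm_means, sm_covs = means.copy(), covs.copy()
    for t in range(T - 2, -1, -1):
        P_pred = A @ covs[t] @ A.T + Q
        G = covs[t] @ A.T @ np.linalg.inv(P_pred)   # smoother gain
        sm_means[t] = means[t] + G @ (sm_means[t + 1] - A @ means[t])
        sm_covs[t] = covs[t] + G @ (sm_covs[t + 1] - P_pred) @ G.T
    return sm_means, sm_covs

# Toy usage: a 1-D random walk observed with noise.
rng = np.random.default_rng(0)
A = np.array([[1.0]]); C = np.array([[1.0]])
Q = np.array([[0.1]]); R = np.array([[0.5]])
x = np.cumsum(rng.normal(0, np.sqrt(Q[0, 0]), 50))
y = (x + rng.normal(0, np.sqrt(R[0, 0]), 50)).reshape(-1, 1)
fm, fP = kalman_filter(y, A, C, Q, R, np.zeros(1), np.eye(1))
sm, sP = rts_smoother(fm, fP, A, Q)
```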

References

Showing 1-10 of 12 references
Incremental Inference of Collective Graphical Models
TLDR
A sliding window Sinkhorn belief propagation (SW-SBP) algorithm is proposed that utilizes a sliding window filter of the most recent noisy aggregate observations, along with encoded information from discarded observations, to solve inference problems from aggregate data.
Estimating ensemble flows on a hidden Markov chain
TLDR
A new framework to estimate the evolution of an ensemble of indistinguishable agents on a hidden Markov chain using only aggregate output data is proposed, which has a convex formulation at the infinite-particle limit.
Message Passing for Collective Graphical Models
TLDR
A novel Belief Propagation style algorithm is derived for collective graphical models; it can be viewed as minimizing the Bethe free energy plus additional energy terms that are non-linear functions of the marginals.
Collective Graphical Models
TLDR
A highly efficient Gibbs sampling algorithm for sampling from the posterior distribution of the sufficient statistics conditioned on noisy aggregate observations is derived, and its correctness and effectiveness are demonstrated experimentally.
Approximate Inference in Collective Graphical Models
TLDR
A tractable convex approximation to the NP-hard MAP inference problem in CGMs is developed, and it is demonstrated empirically that these approximation techniques can reduce the computational cost of inference and the cost of learning by at least an order of magnitude while providing solutions of equal or better quality.
An Iterative Ensemble Kalman Filter
TLDR
An essential stopping criterion is introduced for the proposed iterative extension of the ensemble Kalman filter to improve the estimates in cases where the relationship between the model and the observations is not linear.
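For context, the cited iterative scheme builds on the standard (non-iterative) ensemble Kalman filter analysis step; a minimal sketch of that base step with perturbed observations is given below. The observation operator H, the noise covariance R, and the toy ensemble are illustrative assumptions, not taken from the cited paper.

```python
# Minimal (non-iterative) ensemble Kalman filter analysis step with
# perturbed observations; H, R, and the toy ensemble are illustrative.
import numpy as np

def enkf_analysis(ensemble, y, H, R, rng):
    """Update an (N, d) forecast ensemble with an observation y of shape (m,)."""
    N = ensemble.shape[0]
    X = ensemble - ensemble.mean(axis=0)            # anomalies
    P = X.T @ X / (N - 1)                           # sample covariance
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                  # Kalman gain
    # Perturb the observation for each member to keep the correct posterior spread.
    Y = y + rng.multivariate_normal(np.zeros(len(y)), R, size=N)
    return ensemble + (Y - ensemble @ H.T) @ K.T

rng = np.random.default_rng(1)
ens = rng.normal(0.0, 1.0, size=(100, 2))           # forecast ensemble
H = np.array([[1.0, 0.0]])                           # observe first component only
R = np.array([[0.25]])
updated = enkf_analysis(ens, np.array([0.8]), H, R, rng)
```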
Graphical Models, Exponential Families, and Variational Inference
TLDR
The variational approach provides a complementary alternative to Markov chain Monte Carlo as a general source of approximation methods for inference in large-scale statistical models.
On optimal ℓ∞ to ℓ∞ filtering
TLDR
Taking a model matching approach, suboptimal solutions are presented that stem from the resulting ℓ∞-induced norm-minimization problem.
Probabilistic reasoning in intelligent systems - networks of plausible inference
  • J. Pearl
  • Computer Science
  • Morgan Kaufmann series in representation and reasoning
  • 1989
TLDR
The author provides a coherent explication of probability as a language for reasoning with partial belief and offers a unifying perspective on other AI approaches to uncertainty, such as the Dempster-Shafer formalism, truth maintenance systems, and nonmonotonic logic.
A Tutorial on Hidden Markov Models and Selected Applications
A tutorial review of hidden Markov models covering the basic problems of likelihood evaluation, state decoding, and parameter estimation, together with selected applications to speech recognition.
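As a pointer to what the tutorial covers, a minimal forward pass for a discrete hidden Markov model (likelihood evaluation, the first of the basic problems) might look as follows; the transition and emission matrices below are illustrative placeholders.

```python
# Minimal forward pass for a discrete HMM: likelihood of an observation
# sequence. The model parameters below are illustrative placeholders.
import numpy as np

def hmm_forward(obs, pi, A, B):
    """Return p(obs) given initial distribution pi, transition A, emission B."""
    alpha = pi * B[:, obs[0]]                 # alpha_1(i) = pi_i * b_i(y_1)
    for y in obs[1:]:
        alpha = (alpha @ A) * B[:, y]         # alpha_t = (alpha_{t-1} A) .* b(y_t)
    return alpha.sum()

pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])                    # transition probabilities
B = np.array([[0.9, 0.1],
              [0.2, 0.8]])                    # emission probabilities
print(hmm_forward([0, 1, 0], pi, A, B))       # likelihood of the sequence 0, 1, 0
```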