# Inferring a Gaussian distribution

```bibtex
@inproceedings{Minka2001InferringAG,
  title  = {Inferring a Gaussian distribution},
  author = {Thomas P. Minka},
  year   = {2001}
}
```

A common question in statistical modeling is “which out of a continuum of models are likely to have generated this data?” For the Gaussian class of models, this question can be answered completely and exactly. This paper derives the exact posterior distribution over the mean and variance of the generating distribution, i.e. p(m, V|X), as well as the marginals p(m|X) and p(V|X). It also derives p(X|Gaussian), the probability that the data came from any Gaussian whatsoever. From this we can get…
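The posterior the paper derives is the standard conjugate Normal-Inverse-Gamma update. A minimal sketch of that computation (the function name and default hyperparameters `mu0`, `kappa0`, `alpha0`, `beta0` are illustrative, not from the paper):

```python
import math

def gaussian_posterior(x, mu0=0.0, kappa0=1.0, alpha0=1.0, beta0=1.0):
    """Conjugate update for a Gaussian with unknown mean m and variance V.

    Prior: m | V ~ N(mu0, V/kappa0), V ~ Inverse-Gamma(alpha0, beta0).
    Returns the posterior hyperparameters of p(m, V | X) and the log
    evidence log p(X), i.e. the probability that the data came from
    some Gaussian under this prior.
    """
    n = len(x)
    xbar = sum(x) / n
    ss = sum((xi - xbar) ** 2 for xi in x)  # sum of squared deviations

    kappa_n = kappa0 + n
    mu_n = (kappa0 * mu0 + n * xbar) / kappa_n
    alpha_n = alpha0 + n / 2.0
    beta_n = (beta0 + 0.5 * ss
              + kappa0 * n * (xbar - mu0) ** 2 / (2.0 * kappa_n))

    # log p(X): ratio of posterior to prior normalizers, times (2*pi)^(-n/2)
    log_evidence = (0.5 * math.log(kappa0 / kappa_n)
                    + math.lgamma(alpha_n) - math.lgamma(alpha0)
                    + alpha0 * math.log(beta0) - alpha_n * math.log(beta_n)
                    - 0.5 * n * math.log(2.0 * math.pi))
    return mu_n, kappa_n, alpha_n, beta_n, log_evidence
```

Under this parameterization the marginal p(V|X) is Inverse-Gamma(alpha_n, beta_n) and p(m|X) is a Student-t centered at mu_n, matching the exact marginals the paper refers to.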

## 40 Citations

How to use KL-divergence to construct conjugate priors, with well-defined non-informative limits, for the multivariate Gaussian

- Mathematics, Computer Science
- 2021

It is shown how to use the scaled KL-divergence between multivariate Gaussians as an energy function to construct Wishart and normal-Wishart conjugate priors, and the scale factor can be taken down to the limit at zero, to form noninformative priors that do not violate the restrictions on the Wishart shape parameter.

Approximate Variational Inference For Mixture Models

- 2021

Learning the truths behind real, relevant data is fraught with uncertainty. A probabilistic view on unsupervised learning accounts for this uncertainty in its learning objectives through probability…

Dynamic Bayesian networks: representation, inference and learning

- Computer Science
- 2002

This thesis will discuss how to represent many different kinds of models as DBNs, how to perform exact and approximate inference in DBNs, and how to learn DBN models from sequential data.

Discriminative, generative and imitative learning

- Computer Science
- 2002

It is demonstrated that imitative learning can be adequately addressed as a discriminative prediction task which outperforms the usual generative approach and is applied with a generative perceptual system to synthesize a real-time agent that learns to engage in social interactive behavior.

Machine learning - a probabilistic perspective

- Computer Science, Adaptive Computation and Machine Learning series
- 2012

This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach, and is suitable for upper-level undergraduates with an introductory-level college math background and beginning graduate students.

The Rational Basis of Representativeness

- 2001

Representativeness is a central explanatory construct in cognitive science but suffers from the lack of a principled theoretical account. Here we present a formal definition of one sense of…

Maximum Entropy Discrimination

- Computer Science, Mathematics, NIPS
- 1999

A general framework for discriminative estimation based on the maximum entropy principle and its extensions is presented and preliminary experimental results are indicative of the potential in these techniques.

From mere coincidences to meaningful discoveries

- 2006

People’s reactions to coincidences are often cited as an illustration of the irrationality of human reasoning about chance. We argue that coincidences may be better understood in terms of rational…

Classification via Minimum Incremental Coding Length

- Mathematics, Computer Science, SIAM J. Imaging Sci.
- 2009

A simple new criterion for classification, based on principles from lossy data compression, and its kernel and local versions perform competitively on synthetic examples, as well as on real imagery data such as handwritten digits and face images.

## References


Bayesian inference in statistical analysis

- Computer Science, Mathematics
- 1973

This chapter discusses Bayesian Assessment of Assumptions, which investigates the effect of non-Normality on Inferences about a Population Mean with Generalizations in the context of a Bayesian inference model.

Developments in Probabilistic Modelling with Neural Networks - Ensemble Learning

- Computer Science, SNN Symposium on Neural Networks
- 1995

This paper presents a framework for statistical inference in which an ensemble of parameter vectors is optimized rather than a single parameter vector and approximates the posterior probability distribution of the parameters.

Box and George C. Tiao. Bayesian Inference in Statistical Analysis

- 1973