A Unifying Review of Linear Gaussian Models

@article{Roweis1999AUR,
  title={A Unifying Review of Linear Gaussian Models},
  author={Sam T. Roweis and Zoubin Ghahramani},
  journal={Neural Computation},
  year={1999},
  volume={11},
  pages={305-345}
}
Factor analysis, principal component analysis, mixtures of Gaussian clusters, vector quantization, Kalman filter models, and hidden Markov models can all be unified as variations of unsupervised learning under a single basic generative model. This is achieved by collecting together disparate observations and derivations made by many previous authors and introducing a new way of linking discrete and continuous state models using a simple nonlinearity. Through the use of other nonlinearities, we…
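For orientation, the single basic generative model in question is the discrete-time linear Gaussian state-space model; a minimal sketch in conventional notation (the symbols A, C, Q, R are the standard names, assumed rather than quoted from this page):

x_{t+1} = A x_t + w_t,   w_t ~ N(0, Q)    (state dynamics)
y_t     = C x_t + v_t,   v_t ~ N(0, R)    (observation)

Factor analysis and PCA arise as static special cases with the dynamics removed, Kalman filter models keep the continuous state and its dynamics, and mixtures of Gaussians, vector quantization, and hidden Markov models follow when the state is made discrete via a winner-take-all nonlinearity.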
Tied and Regularized Conditional Gaussian Graphical Models for Acoustic Modeling in ASR
TLDR
This chapter explores how graphical models can help describe a variety of tied and regularized Gaussian mixture systems and finds that for certain combinations of regularization and/or tying, it is no longer the case that one may achieve a closed-form analytic solution to the EM update equations.
Independent Factor Analysis
  • H. Attias
  • Neural Computation, 1999
TLDR
An expectation-maximization (EM) algorithm is presented, which performs unsupervised learning of an associated probabilistic model of the mixing situation and is shown to be superior to ICA since it can learn arbitrary source densities from the data.
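As a sketch of the model class (standard IFA notation, assumed rather than quoted): the observations are y = H x + u, with mixing matrix H, Gaussian sensor noise u ~ N(0, Lambda), and each hidden source x_i carrying its own mixture-of-Gaussians density; EM learns H, Lambda, and the source mixture parameters jointly, which is what lets IFA fit arbitrary source densities where classic ICA fixes them in advance.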
Gaussian sampling by local perturbations
TLDR
A technique for exact simulation of Gaussian Markov random fields (GMRFs) is presented; it can be interpreted as locally injecting noise into each Gaussian factor independently and then computing the mean/mode of the perturbed GMRF, which leads to an efficient unbiased estimator of marginal variances.
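To make the perturb-then-optimize idea concrete, here is a minimal numpy sketch under assumed notation (the toy factor matrix G, the per-factor variances, and the function name are illustrative, not from the paper): each Gaussian factor's mean is jittered independently, and the mode of the perturbed model is an exact draw from the GMRF.

import numpy as np

rng = np.random.default_rng(0)

# Toy GMRF with density proportional to prod_i N(g_i . x ; mu_i, sigma2_i):
# three unary factors plus two pairwise difference factors (illustrative).
n = 3
G = np.vstack([np.eye(n), np.diff(np.eye(n), axis=0)])
mu = np.zeros(G.shape[0])
sigma2 = np.concatenate([np.full(n, 1.0), np.full(n - 1, 0.5)])
W = np.diag(1.0 / sigma2)          # factor precisions
J = G.T @ W @ G                    # implied GMRF precision matrix

def sample_by_local_perturbation():
    # 1. Inject noise locally: jitter each factor mean independently.
    mu_tilde = mu + rng.normal(0.0, np.sqrt(sigma2))
    # 2. Compute the mode of the perturbed GMRF by solving the
    #    normal equations (G^T W G) x = G^T W mu_tilde.
    return np.linalg.solve(J, G.T @ W @ mu_tilde)

samples = np.array([sample_by_local_perturbation() for _ in range(20000)])
print(np.cov(samples.T))           # empirically close to inv(J)

The draw is exact because the solve returns x = J^{-1} G^T W mu_tilde; since Cov(mu_tilde) = diag(sigma2) = W^{-1}, the covariance of x is J^{-1} (G^T W G) J^{-1} = J^{-1}, the GMRF covariance. Averaging over such independent samples then gives the unbiased marginal-variance estimator the summary mentions.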
Unsupervised Learning
TLDR
The aim of this chapter is to provide a high-level view of the field of unsupervised learning from the perspective of statistical modeling, to derive the EM algorithm, and to give an overview of fundamental concepts in graphical models and of inference algorithms on graphs.
Independent Factor Analysis: Statistical Modeling and Blind Source Separation
TLDR
This work introduces the independent factor analysis (IFA) method for recovering independent hidden sources from their observed mixtures, and presents an expectation-maximization (EM) algorithm, which performs unsupervised learning of an associated probabilistic model of the mixing situation.
Resolution-Based Complexity Control for Gaussian Mixture Models
TLDR
This work presents a complexity control scheme, set within a common deterministic annealing framework, that provides an effective means of avoiding the overfitting usually encountered with unconstrained (mixtures of) Gaussians in high dimensions.
Factoring Gaussian precision matrices for linear dynamic models
Variational Bayes Latent Variable Models And Mixture Extensions
TLDR
One of the aims of this thesis is to investigate how, given a model, the Bayesian framework infers the desired parameters, such as the number of components in a mixture model, and how it succeeds in addressing the problems related to overfitting.
Gaussian Models in Automatic Speech Recognition
TLDR
This chapter explores how graphical models can help describe a variety of tied and regularized Gaussian mixture systems, and finds that for certain combinations of regularization and/or tying, it is no longer the case that a closed-form analytic solution to the EM update equations is achieved.
Hidden Markov Independent Component Analysis
We propose a generative model for the analysis of non-stationary multivariate time series. The model uses a hidden Markov process to switch between independent component models, where the components…

References

Showing 1-10 of 119 references
Resolution-Based Complexity Control for Gaussian Mixture Models
TLDR
This work presents a complexity control scheme, set within a common deterministic annealing framework, that provides an effective means of avoiding the overfitting usually encountered with unconstrained (mixtures of) Gaussians in high dimensions.
Variational Learning for Switching State-Space Models
TLDR
This work introduces a new statistical model for time series that iteratively segments data into regimes with approximately linear dynamics and learns the parameters of each of these linear regimes; the results suggest that variational approximations are a viable method for inference and learning in switching state-space models.
Mixtures of Probabilistic Principal Component Analyzers
TLDR
PCA is formulated within a maximum likelihood framework, based on a specific form of Gaussian latent variable model, which leads to a well-defined mixture model for probabilistic principal component analyzers whose parameters can be determined using an expectation-maximization algorithm.
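As a sketch of the latent variable model in question (standard PPCA notation, assumed): x = W z + mu + eps with z ~ N(0, I) and eps ~ N(0, sigma^2 I), so marginally x ~ N(mu, W W^T + sigma^2 I). Conventional PCA is recovered in the limit sigma^2 -> 0, and giving each mixture component its own (W_k, mu_k, sigma_k^2) yields the mixture of probabilistic principal component analyzers fit by EM.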
Switching State-Space Models
TLDR
This work presents a statistical model for time series data with nonlinear dynamics that iteratively segments the data into regimes with approximately linear dynamics and learns the parameters of each of those regimes, along with a variational approximation that maximizes a lower bound on the log likelihood.
Probabilistic Independence Networks for Hidden Markov Probability Models
TLDR
It is shown that the well-known forward-backward and Viterbi algorithms for HMMs are special cases of more general inference algorithms for arbitrary PINs and the existence of inference and estimation algorithms for more general graphical models provides a set of analysis tools for HMM practitioners who wish to explore a richer class of HMM structures.
Blind source separation and deconvolution: the dynamic component analysis algorithm
We derive a novel family of unsupervised learning algorithms for blind separation of mixed and convolved sources. Our approach is based on formulating the separation problem as a learning task of a…
Modeling Acoustic Correlations by Factor Analysis
TLDR
This work evaluates the combined use of mixture densities and factor analysis in HMMs that recognize alphanumeric strings and finds that these methods, properly combined, yield better models than either method on its own.
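For reference, the parameterization at work (standard factor analysis notation, assumed): each Gaussian component models its covariance as Sigma = Lambda Lambda^T + Psi, with a d x k loading matrix Lambda (k << d) and a diagonal Psi, so correlations across feature dimensions are captured with far fewer parameters than a full d x d covariance matrix requires.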
Parameter estimation for linear dynamical systems
TLDR
The expectation-maximization (EM) algorithm for estimating the parameters of linear dynamical systems (LDS) is introduced, and its relation to factor analysis and other data modeling techniques is pointed out.
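As a sketch of the flavor of these updates (standard notation for the model x_{t+1} = A x_t + w_t, y_t = C x_t + v_t; assumed, not quoted from the paper): the E-step runs a Kalman smoother to obtain posterior moments such as E[x_t] and E[x_t x_{t-1}^T], and the M-step solves least-squares problems in closed form, e.g.

C_new = ( sum_t y_t E[x_t]^T ) ( sum_t E[x_t x_t^T] )^{-1}
A_new = ( sum_t E[x_t x_{t-1}^T] ) ( sum_t E[x_{t-1} x_{t-1}^T] )^{-1}

Dropping the dynamics reduces these to the factor analysis updates, which is the relation to factor analysis the summary points out.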
Regression with Input-dependent Noise: A Gaussian Process Treatment
TLDR
This paper shows that prior uncertainty about the parameters controlling both processes can be handled, that the posterior distribution of the noise rate can be sampled using Markov chain Monte Carlo methods, and that this yields a posterior noise variance that approximates the true variance well.
Maximum Likelihood and Covariant Algorithms for Independent Component Analysis
TLDR
It is shown that Bell and Sejnowski’s (1995) algorithm can be viewed as a maximum likelihood algorithm for the optimization of a linear generative model, and a covariant version of the algorithm is derived.
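As a sketch of what the covariant form buys (standard ICA notation, assumed): with unmixing matrix W, activations a = W x, and score function phi(a) = -d log p(a)/da, the covariant (natural-gradient) maximum likelihood update is

Delta W  ∝  ( I - phi(a) a^T ) W

which absorbs the W^{-T} term of the plain gradient, so each step needs no matrix inversion and is invariant to linear reparameterizations of W.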