
We consider problems involving groups of data, where each observation within a group is a draw from a mixture model, and where it is desirable to share mixture components… (More)
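The Dirichlet process underlying this kind of hierarchical mixture model is often explained via its stick-breaking representation, in which an infinite set of mixture weights is generated by repeatedly breaking off Beta-distributed fractions of a unit-length stick. A minimal truncated sketch in NumPy (function name and parameter values are illustrative, not from the paper):

```python
import numpy as np

def stick_breaking(alpha, num_sticks, rng):
    """Truncated stick-breaking construction of Dirichlet process weights.

    Each weight is pi_k = v_k * prod_{j<k} (1 - v_j), with v_k ~ Beta(1, alpha),
    so the weights are nonnegative and sum to (almost) one.
    """
    v = rng.beta(1.0, alpha, size=num_sticks)
    # Length of stick remaining before each break: 1, (1-v_1), (1-v_1)(1-v_2), ...
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - v[:-1])])
    return v * remaining

rng = np.random.default_rng(0)
weights = stick_breaking(alpha=1.0, num_sticks=100, rng=rng)
```

With 100 sticks the truncation error (the unbroken remainder of the stick) is negligible, so the weights sum to one up to floating-point noise.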

- Matthew J. Beal
- 2003

The Bayesian framework for machine learning allows for the incorporation of prior knowledge in a coherent way, avoids overfitting problems, and provides a principled basis for selecting between alternative models. Unfortunately the computations required are usually intractable. This thesis presents a unified variational Bayesian (VB) framework which… (More)

- Zoubin Ghahramani, Matthew J. Beal
- NIPS
- 1999

We present an algorithm that infers the model structure of a mixture of factor analysers using an efficient and deterministic variational approximation to full Bayesian integration over model parameters. This procedure can automatically determine the optimal number of components and the local dimensionality of each component (i.e. the number of factors in… (More)

We show that it is possible to extend hidden Markov models to have a countably infinite number of hidden states. By using the theory of Dirichlet processes we can implicitly integrate out the infinitely many transition parameters, leaving only three hyperparameters which can be learned from data. These three hyperparameters define a hierarchical Dirichlet… (More)
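The effect of implicitly integrating out infinitely many parameters under a Dirichlet process prior is often conveyed through the Chinese restaurant process, in which the number of occupied states grows slowly with the amount of data rather than being fixed in advance. A sketch of that sampling scheme (names and settings are illustrative, not the paper's algorithm):

```python
import numpy as np

def chinese_restaurant_process(n, alpha, rng):
    """Assign n customers to tables: an existing table is chosen with
    probability proportional to its current occupancy, and a new table
    with probability proportional to the concentration alpha."""
    counts = [1]          # first customer opens table 0
    assignments = [0]
    for _ in range(1, n):
        probs = np.array(counts + [alpha], dtype=float)
        probs /= probs.sum()
        table = rng.choice(len(probs), p=probs)
        if table == len(counts):   # open a new table
            counts.append(1)
        else:
            counts[table] += 1
        assignments.append(table)
    return assignments, counts

rng = np.random.default_rng(0)
assignments, counts = chinese_restaurant_process(n=200, alpha=2.0, rng=rng)
```

The expected number of occupied tables grows only logarithmically in n, which is what lets an "infinite" model remain tractable on finite data.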

- Matthew J. Beal
- 2002

We present an efficient procedure for estimating the marginal likelihood of probabilistic models with latent variables or incomplete data. This method constructs and optimises a lower bound on the marginal likelihood using variational calculus, resulting in an iterative algorithm which generalises the EM algorithm by maintaining posterior… (More)
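The lower-bound property at the heart of this procedure can be checked numerically on a toy two-state latent-variable model (all numbers below are made up for illustration): for any variational distribution q(z), Jensen's inequality gives E_q[log p(x, z)] + H[q] ≤ log p(x), with equality exactly at the true posterior.

```python
import numpy as np

# Toy model: one latent z in {0, 1}, one observed x (illustrative numbers).
prior = np.array([0.6, 0.4])        # p(z)
lik = np.array([0.2, 0.7])          # p(x | z) for the observed x
log_evidence = np.log(prior @ lik)  # exact log marginal likelihood log p(x)

def elbo(q):
    """Variational lower bound E_q[log p(x, z)] + H[q] for a discrete q(z)."""
    joint = prior * lik             # p(x, z)
    return float(q @ np.log(joint) - q @ np.log(q))

q_uniform = np.array([0.5, 0.5])            # an arbitrary variational posterior
posterior = prior * lik / (prior @ lik)     # the exact posterior p(z | x)
```

Here `elbo(q_uniform)` falls strictly below `log_evidence`, while `elbo(posterior)` attains it; the iterative algorithm described in the abstract ascends this bound.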

- Zoubin Ghahramani, Matthew J. Beal
- NIPS
- 2000

Variational approximations are becoming a widespread tool for Bayesian learning of graphical models. We provide some theoretical results for the variational updates in a very general family of conjugate-exponential graphical models. We show how the belief propagation and the junction tree algorithms can be used in the inference step of variational Bayesian… (More)

- Matthew J. Beal, Nebojsa Jojic, Hagai Attias
- IEEE Trans. Pattern Anal. Mach. Intell.
- 2003

We present a new approach to modeling and processing multimedia data. This approach is based on graphical models that combine audio and video variables. We demonstrate it by developing a new algorithm for tracking a moving object in a cluttered, noisy scene using two microphones and a camera. Our model uses unobserved variables to describe the data in… (More)

- Matthew J. Beal, Francesco Falciani, Zoubin Ghahramani, Claudia Rangel, David L. Wild
- Bioinformatics
- 2005

We have used state-space models (SSMs) to reverse engineer transcriptional networks from highly replicated gene expression profiling time series data obtained from a well-established model of T cell activation. SSMs are a class of dynamic Bayesian networks in which the observed measurements depend on some hidden state variables that evolve… (More)
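A state-space model of this kind has linear-Gaussian dynamics: hidden states evolve by x_t = A x_{t-1} + w_t and observations arise as y_t = C x_t + v_t, with Gaussian noise on both. A minimal generative sketch (the dimensions and parameter values below are illustrative, not taken from the paper):

```python
import numpy as np

def simulate_ssm(A, C, Q, R, T, rng):
    """Simulate x_t = A x_{t-1} + w_t, y_t = C x_t + v_t with
    state noise w_t ~ N(0, Q) and observation noise v_t ~ N(0, R)."""
    n, p = A.shape[0], C.shape[0]
    x = np.zeros(n)
    states, observations = [], []
    for _ in range(T):
        x = A @ x + rng.multivariate_normal(np.zeros(n), Q)
        y = C @ x + rng.multivariate_normal(np.zeros(p), R)
        states.append(x)
        observations.append(y)
    return np.array(states), np.array(observations)

rng = np.random.default_rng(0)
A = np.array([[0.9, 0.1], [0.0, 0.9]])   # hidden-state dynamics
C = np.array([[1.0, 0.0]])               # observe only the first state
Q = 0.1 * np.eye(2)
R = 0.5 * np.eye(1)
states, observations = simulate_ssm(A, C, Q, R, T=50, rng=rng)
```

Reverse engineering a network then amounts to the inverse problem: inferring A (and the hidden trajectory) from the observed time series.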


A key problem in statistics and machine learning is inferring suitable structure of a model given some observed data. A Bayesian approach to model comparison makes use of the marginal likelihood of each candidate model to form a posterior distribution over models; unfortunately for most models of interest, notably those containing hidden or latent… (More)
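Given (approximate) marginal likelihoods for a set of candidate models, forming the posterior over models is a normalised product with the model prior, p(m | x) ∝ p(x | m) p(m). A sketch with made-up numbers, normalising in log space for numerical stability:

```python
import numpy as np

# Hypothetical log marginal likelihoods log p(x | m) for three candidate models.
log_marginal = np.array([-105.2, -103.7, -109.0])
log_prior = np.log(np.full(3, 1.0 / 3.0))   # uniform prior over models

# Posterior p(m | x): add log prior, then subtract the log normaliser.
log_post = log_marginal + log_prior
log_post -= np.logaddexp.reduce(log_post)
model_posterior = np.exp(log_post)
```

The model with the largest marginal likelihood (the second one here) dominates the posterior, which is the comparison the abstract describes; the difficulty, for models with hidden variables, is computing `log_marginal` in the first place.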