
- David A. Knowles, Zoubin Ghahramani
- ICA
- 2007

A nonparametric Bayesian extension of Independent Components Analysis (ICA) is proposed where observed data Y is modelled as a linear superposition, G, of a potentially infinite number of hidden sources, X. Whether a given source is active for a specific data point is specified by an infinite binary matrix, Z. The resulting sparse representation allows…
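A minimal sketch of the generative process this abstract describes: Z is drawn from the Indian Buffet Process (as in the companion infinite factor analysis model), and the observations are a noisy mixture of the active sources. Dimensions, hyperparameters, and the noise level here are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_ibp(N, alpha, rng):
    """Draw an N x K binary matrix from the Indian Buffet Process prior."""
    Z = np.zeros((0, 0), dtype=int)
    for n in range(N):
        counts = Z.sum(axis=0)                                   # dish popularities
        old = (rng.random(Z.shape[1]) < counts / (n + 1)).astype(int)
        k_new = rng.poisson(alpha / (n + 1))                     # brand-new dishes
        Z = np.pad(Z, ((0, 0), (0, k_new)))                      # zero-fill earlier rows
        Z = np.vstack([Z, np.concatenate([old, np.ones(k_new, dtype=int)])])
    return Z

# Z[k, n] = 1 when hidden source k is active for data point n;
# the data are then Y = G (Z * X) + noise.
N, D = 10, 5
Z = sample_ibp(N, alpha=2.0, rng=rng).T          # K x N activity mask
K = Z.shape[0]                                   # number of instantiated sources
X = rng.standard_normal((K, N))                  # hidden sources
G = rng.standard_normal((D, K))                  # mixing matrix
Y = G @ (Z * X) + 0.1 * rng.standard_normal((D, N))
```

Because the IBP lets K grow with the data, the number of sources need not be fixed in advance.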

A nonparametric Bayesian extension of Factor Analysis (FA) is proposed where observed data Y is modeled as a linear superposition, G, of a potentially infinite number of hidden factors, X. The Indian Buffet Process (IBP) is used as a prior on G to incorporate sparsity and to allow the number of latent features to be inferred. The model's utility for…

- David A. Knowles, Tom Minka
- NIPS
- 2011

Variational Message Passing (VMP) is an algorithmic implementation of the Variational Bayes (VB) method which applies only in the special case of conjugate exponential family models. We propose an extension to VMP, which we refer to as Non-conjugate Variational Message Passing (NCVMP), which aims to alleviate this restriction while maintaining modularity,…

- Tim Salimans, David A. Knowles
- ArXiv
- 2012

We propose a general algorithm for approximating nonstandard Bayesian posterior distributions. The algorithm minimizes the Kullback-Leibler divergence of an approximating distribution to the intractable posterior distribution. Our method can be used to approximate any posterior distribution, provided that it is given in closed form up to the proportionality…
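The idea of fitting a fixed-form approximation by minimizing KL divergence to an unnormalized posterior can be sketched with a generic stochastic-gradient estimator (the paper's own estimator is different; this is only the shared objective). The target here is a toy Gaussian with mean 2 and standard deviation 0.5, chosen so the exact answer is known.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target, known only up to a constant: log p(t) = -0.5 (t - 2)^2 / 0.25 + const.
def grad_log_p(t):
    return -(t - 2.0) / 0.25

# Fit q = N(mu, sigma^2) by stochastic gradient descent on KL(q || p),
# using the reparameterisation t = mu + sigma * eps, eps ~ N(0, 1).
mu, log_s = 0.0, 0.0
lr, S = 0.05, 32
for _ in range(2000):
    s = np.exp(log_s)
    eps = rng.standard_normal(S)
    g = grad_log_p(mu + s * eps)                 # target score at the samples
    mu -= lr * (-g.mean())                       # d KL / d mu    = -E[grad log p]
    log_s -= lr * (-1.0 - (g * eps).mean() * s)  # d KL / d log s = -1 - s E[grad log p * eps]
sigma = np.exp(log_s)
```

With the Gaussian target the iterates settle near mu = 2, sigma = 0.5, the exact posterior.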

We introduce a new regression framework, Gaussian process regression networks (GPRN), which combines the structural properties of Bayesian neural networks with the nonparametric flexibility of Gaussian processes. This model accommodates input dependent signal and noise correlations between multiple response variables, input dependent length-scales and…
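A sketch of a draw from a GPRN-style prior: latent GP functions are mixed by weights that are themselves GP functions of the input, so the effective mixing varies with x. All sizes, kernels, and length-scales below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)

def gp_draw(x, ell):
    """One draw from a zero-mean GP with a squared-exponential kernel."""
    K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell ** 2)
    L = np.linalg.cholesky(K + 1e-6 * np.eye(len(x)))  # jitter for stability
    return L @ rng.standard_normal(len(x))

Q, P = 2, 3                                            # latent functions, outputs
F = np.stack([gp_draw(x, 0.2) for _ in range(Q)])      # Q latent GPs
W = np.stack([[gp_draw(x, 0.5) for _ in range(Q)]      # P x Q weight GPs
              for _ in range(P)])
# y_p(x) = sum_q W_pq(x) f_q(x) + noise: input-dependent mixing of latent GPs.
Y = np.einsum('pqn,qn->pn', W, F) + 0.05 * rng.standard_normal((P, len(x)))
```

Because W varies with x, the correlations between the P outputs change across the input space, which is the property the abstract highlights.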

Latent variable models for network data extract a summary of the relational structure underlying an observed network. The simplest possible models subdivide nodes of the network into clusters; the probability of a link between any two nodes then depends only on their cluster assignment. Currently available models can be classified by whether clusters are…
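The simplest model the abstract mentions, where link probability depends only on cluster assignments, is the stochastic block model. A minimal forward sample, with illustrative sizes:

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 30, 3
z = rng.integers(K, size=N)                  # cluster assignment per node
B = rng.beta(1.0, 1.0, size=(K, K))          # between-cluster link probabilities
B = (B + B.T) / 2                            # symmetric for an undirected graph
P = B[z][:, z]                               # P[i, j] depends only on (z[i], z[j])
A = np.triu((rng.random((N, N)) < P).astype(int), 1)
A = A + A.T                                  # symmetric adjacency, no self-loops
```

Inference then reverses this process, recovering z (and B) from an observed adjacency matrix A.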

Nonparametric Bayesian models provide a framework for flexible probabilistic modelling of complex datasets. Unfortunately, the high-dimensional averages required for Bayesian methods can be slow, especially with the unbounded representations used by nonparametric models. We address the challenge of scaling Bayesian inference to the increasingly large…

Factor analysis models effectively summarise the covariance structure of high dimensional data, but the solutions are typically hard to interpret. This motivates attempting to find a disjoint partition, i.e. a simple clustering, of observed variables into highly correlated subsets. We introduce a Bayesian non-parametric approach to this problem, and…
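The task itself, partitioning variables into highly correlated disjoint subsets, can be illustrated with a plain greedy threshold partition; this is emphatically not the paper's Bayesian non-parametric method, just a picture of the problem on toy data with an assumed two-block structure.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data: six variables in two highly correlated blocks.
n = 500
a, b = rng.standard_normal((2, n))
X = np.stack([a, a, a, b, b, b]) + 0.3 * rng.standard_normal((6, n))

C = np.abs(np.corrcoef(X))                   # |correlation| between variables
labels = -np.ones(6, dtype=int)              # -1 marks an unassigned variable
k = 0
for i in range(6):
    if labels[i] == -1:                      # start a new block at variable i
        labels[(labels == -1) & (C[i] > 0.5)] = k
        k += 1
```

The Bayesian treatment replaces the arbitrary 0.5 threshold with a posterior over partitions, including uncertainty in the number of blocks.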

- WHAT: a probabilistic model to infer binary latent variables that preserve the neighbourhood structure of the data
- WHY: to perform nearest neighbour search for the purpose of retrieval
- WHEN: for the dynamic, streaming nature of Internet data
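The flavour of binary codes that preserve neighbourhood structure can be shown with generic random-hyperplane hashing; this is a standard trick in the same spirit, not the probabilistic model the paper proposes.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 16))           # data points
R = rng.standard_normal((16, 32))            # random hyperplanes
Z = (X @ R > 0).astype(int)                  # 32-bit binary code per point

# Hamming distance between codes approximates angular distance between points,
# so near neighbours in input space tend to share most bits.
q = X[0] + 0.05 * rng.standard_normal(16)    # a slightly perturbed query
zq = (q @ R > 0).astype(int)
ham = (Z ^ zq).sum(axis=1)                   # Hamming distance to every code
nearest = int(np.argmin(ham))                # retrieves the original point
```

Binary codes make retrieval cheap (bitwise XOR and popcount), which is what makes such representations attractive for streaming-scale search.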

- David A. Knowles, Zoubin Ghahramani
- IEEE Transactions on Pattern Analysis and Machine…
- 2015

In this paper we introduce the Pitman Yor Diffusion Tree (PYDT), a Bayesian non-parametric prior over tree structures which generalises the Dirichlet Diffusion Tree [30] and removes the restriction to binary branching structure. The generative process is described and shown to result in an exchangeable distribution…