Canonical Correlation Forests
TLDR
We introduce canonical correlation forests (CCFs), a new tree ensemble method for classification in which the individual canonical correlation trees (CCTs) use hyperplane splits based on the feature projections from a canonical correlation analysis.
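The core idea of a CCT split can be sketched in numpy alone: for a one-dimensional class indicator, the first canonical direction of a CCA between features and labels has the closed form $S_{xx}^{-1} s_{xy}$ (up to scale), and the node then thresholds the resulting projection. This is an illustrative toy sketch, not the authors' implementation; the data, the Gini threshold search, and all names are ours.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data whose decision boundary is oblique to the axes,
# exactly the case where axis-aligned splits struggle and hyperplane
# splits from CCA projections help.
n = 200
X = rng.normal(size=(n, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# With a one-dimensional class indicator, the first canonical direction
# of a CCA between features and labels is Sxx^{-1} sxy (up to scale).
Xc = X - X.mean(axis=0)
yc = y - y.mean()
Sxx = Xc.T @ Xc / n
sxy = Xc.T @ yc / n
direction = np.linalg.solve(Sxx, sxy)
proj = Xc @ direction                     # 1-D feature projection

def gini(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p**2)

# A CCT-style node then searches for the best threshold on the projection.
order = np.argsort(proj)
best_impurity, best_threshold = np.inf, 0.0
for i in range(1, n):
    left, right = y[order[:i]], y[order[i:]]
    impurity = (i * gini(left) + (n - i) * gini(right)) / n
    if impurity < best_impurity:
        best_impurity = impurity
        best_threshold = 0.5 * (proj[order[i - 1]] + proj[order[i]])

pred = (proj > best_threshold).astype(int)
accuracy = max((pred == y).mean(), (pred != y).mean())
```

On this toy problem the oblique split recovers the true boundary almost exactly, which an axis-aligned split on either raw feature cannot do.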
Auto-Encoding Sequential Monte Carlo
TLDR
We build on auto-encoding sequential Monte Carlo (AESMC): a method for model and proposal learning based on maximizing a lower bound on the log marginal likelihood in a broad family of structured probabilistic models.
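The quantity AESMC maximizes, the log of the SMC marginal-likelihood estimate, can be illustrated with a bootstrap particle filter on a toy linear-Gaussian state-space model. All parameters below are made up for the sketch, and the proposal is kept fixed at the prior, whereas AESMC learns both model and proposal.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear-Gaussian state-space model:
#   x_t = 0.9 * x_{t-1} + N(0, 1),   y_t = x_t + N(0, 0.5^2)
T, K = 10, 500                      # time steps, particles
xs = np.zeros(T)
ys = np.zeros(T)
for t in range(T):
    xs[t] = 0.9 * (xs[t - 1] if t > 0 else 0.0) + rng.normal()
    ys[t] = xs[t] + 0.5 * rng.normal()

def log_gauss(v, mean, std):
    return -0.5 * np.log(2 * np.pi * std**2) - 0.5 * ((v - mean) / std) ** 2

# Bootstrap SMC: the running product of average weights is an unbiased
# estimate of p(y_{1:T}); its log is the lower-bound objective AESMC
# optimizes (here with the prior as proposal).
particles = rng.normal(size=K)      # x_1 ~ N(0, 1)
log_Z = 0.0
for t in range(T):
    logw = log_gauss(ys[t], particles, 0.5)
    log_Z += np.log(np.mean(np.exp(logw - logw.max()))) + logw.max()
    w = np.exp(logw - logw.max())
    idx = rng.choice(K, size=K, p=w / w.sum())          # multinomial resampling
    particles = 0.9 * particles[idx] + rng.normal(size=K)  # propagate
```

By Jensen's inequality the expectation of `log_Z` lower-bounds the true log marginal likelihood, which is what makes it usable as a training objective.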
Tighter Variational Bounds are Not Necessarily Better
TLDR
We provide theoretical and empirical evidence that using tighter evidence lower bounds (ELBOs) can be detrimental to the process of learning an inference network, by reducing the signal-to-noise ratio of the gradient estimator.
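The family of bounds in question can be made concrete with a one-dimensional toy model where log p(x) is known in closed form; this is our own illustration, not an experiment from the paper. The K-sample importance-weighted bound tightens as K grows, yet the paper's point is that the inference-network gradient's signal-to-noise ratio can simultaneously shrink.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy model: z ~ N(0,1), x | z ~ N(z,1), proposal q(z) = N(0,1) (the prior,
# deliberately mismatched with the posterior so the bounds differ visibly).
x = 1.0
log_px = -0.5 * np.log(2 * np.pi * 2.0) - x**2 / (2 * 2.0)   # exact: x ~ N(0, 2)

def iwae_bound(K, n_outer=5000):
    """Monte Carlo estimate of the K-sample importance-weighted ELBO."""
    z = rng.normal(size=(n_outer, K))                        # z ~ q = prior
    logw = -0.5 * np.log(2 * np.pi) - 0.5 * (x - z) ** 2     # log p(x|z)
    return np.mean(np.log(np.mean(np.exp(logw), axis=1)))

L1, L64 = iwae_bound(1), iwae_bound(64)
# L1 <= L64 <= log p(x): the K = 64 bound sits much closer to log p(x).
```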
Disentangling Disentanglement in Variational Autoencoders
TLDR
We develop a generalisation of disentanglement in VAEs (decomposition of the latent representation), characterising it as the fulfilment of two factors: (a) the latent encodings of the data having an appropriate level of overlap, and (b) the aggregate encoding of the data conforming to a desired structure, represented through the prior.
Interacting Particle Markov Chain Monte Carlo
TLDR
We introduce interacting particle Markov chain Monte Carlo (iPMCMC), a PMCMC method based on an interacting pool of standard and conditional sequential Monte Carlo samplers.
On the Fairness of Disentangled Representations
TLDR
In this paper, we investigate the usefulness of different notions of disentanglement for improving the fairness of downstream prediction tasks based on representations.
On Nesting Monte Carlo Estimators
TLDR
We investigate the statistical implications of nesting Monte Carlo (MC) estimators, including cases with multiple levels of nesting, and establish the conditions under which they converge.
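The kind of estimator studied can be seen in a minimal numpy sketch (our own toy example, not one from the paper): a plug-in nested MC estimator of a nonlinear functional of an inner expectation carries a bias of order 1/N_inner, so the inner sample size must grow for the estimator to converge.

```python
import numpy as np

rng = np.random.default_rng(3)

# Nested expectation I = E_x[ f( E_{y|x}[y] ) ] with f(u) = u^2,
# x ~ N(0,1), y | x ~ N(x,1).  The true value is E[x^2] = 1, but the
# plug-in nested MC estimator is biased: its expectation is 1 + 1/N_inner,
# because f is nonlinear in the noisy inner estimate.
def nested_mc(n_outer, n_inner):
    x = rng.normal(size=n_outer)
    y = x[:, None] + rng.normal(size=(n_outer, n_inner))
    inner = y.mean(axis=1)            # inner MC estimate of E[y|x]
    return np.mean(inner ** 2)        # outer MC estimate of E[f(.)]

coarse = nested_mc(200_000, 1)        # biased by ~1/1:   concentrates near 2.0
fine = nested_mc(200_000, 100)        # biased by ~1/100: concentrates near 1.01
```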
A Statistical Approach to Assessing Neural Network Robustness
TLDR
We present a new approach to assessing the robustness of neural networks based on estimating the proportion of inputs for which a property is violated under an input model.
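The target quantity is a violation probability under an input distribution. The sketch below estimates it with naive Monte Carlo on a stand-in linear "network" and a made-up property (both hypothetical, chosen only to make the sketch self-contained); the paper develops estimators that remain accurate when this proportion is far too small for naive MC to resolve.

```python
import numpy as np

rng = np.random.default_rng(4)

# Stand-in "network": a fixed linear scorer (hypothetical).  Property:
# the score stays positive for inputs drawn from a Gaussian input model
# centred on a nominal point x0.
w = np.array([1.0, -0.5])
def network(x):
    return x @ w

x0 = np.array([1.0, 0.5])
sigma = 0.3                      # input model: N(x0, sigma^2 I)

# Naive Monte Carlo estimate of the violation probability
# P( network(x) <= 0 ) under the input model.
n = 100_000
samples = x0 + sigma * rng.normal(size=(n, 2))
violation_rate = np.mean(network(samples) <= 0)
```

Here the score is itself Gaussian, so the true violation probability is a Gaussian tail mass (around 1%), which the MC estimate recovers.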
Nesting Probabilistic Programs
TLDR
We formalize the notion of nesting probabilistic programming queries and investigate the resulting statistical implications.
Variational Bayesian Optimal Experimental Design
TLDR
We propose a variational Bayesian optimal experimental design (BOED) approach that sidesteps the double intractability of the expected information gain (EIG) in a principled manner, and yields estimators with convergence rates in line with those for conventional estimation problems.
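The double intractability can be seen in the naive nested MC estimator of the EIG, shown below for a toy linear-Gaussian design problem with a closed-form answer (our own example, not the paper's): the marginal likelihood inside the logarithm is itself an integral, so every outer sample needs its own inner estimate.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy design problem: theta ~ N(0,1), y | theta, d ~ N(d*theta, 1).
# The expected information gain has the closed form 0.5*log(1 + d^2),
# which lets us check the naive nested estimator.
def log_lik(y, theta, d):
    return -0.5 * np.log(2 * np.pi) - 0.5 * (y - d * theta) ** 2

def eig_nmc(d, n_outer=2000, n_inner=2000):
    theta = rng.normal(size=n_outer)
    y = d * theta + rng.normal(size=n_outer)
    # Inner expectation: log p(y|d) = log E_theta'[ p(y|theta', d) ],
    # estimated with fresh prior samples.  This inner integral is what
    # makes the EIG "doubly intractable".
    theta_in = rng.normal(size=(n_outer, n_inner))
    log_marg = np.log(np.mean(np.exp(log_lik(y[:, None], theta_in, d)), axis=1))
    return np.mean(log_lik(y, theta, d) - log_marg)

d = 1.0
eig_true = 0.5 * np.log(1 + d**2)
eig_est = eig_nmc(d)
```

The nested estimator needs both sample sizes to grow and converges slowly; the paper's variational estimators replace the inner integral to recover conventional rates.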