Variational Continual Learning
TLDR
Variational continual learning (VCL) is developed: a simple but general framework for continual learning that fuses online variational inference with recent advances in Monte Carlo VI for neural networks, and that outperforms state-of-the-art continual learning methods.
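For readers unfamiliar with the recursion this TLDR alludes to, below is a minimal statement of the online variational inference update that VCL builds on (notation mine; the paper's own presentation may differ in details):

```latex
% Online VI recursion: after observing task t's data D_t, the new posterior
% approximation is the KL-closest member of the family Q to the (normalised)
% product of the previous approximation and the new likelihood.
q_t(\theta) = \operatorname*{arg\,min}_{q \in \mathcal{Q}}
  \operatorname{KL}\!\left( q(\theta) \,\middle\|\, \frac{1}{Z_t}\, q_{t-1}(\theta)\, p(\mathcal{D}_t \mid \theta) \right),
\qquad q_0(\theta) = p(\theta).
```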
Deep Gaussian Processes for Regression using Approximate Expectation Propagation
TLDR
A new approximate Bayesian learning scheme is developed that, for the first time, enables DGPs to be applied to a range of medium- to large-scale regression problems, and that is almost always better than state-of-the-art deterministic and sampling-based approximate inference methods for Bayesian neural networks.
Black-box α-divergence minimization
Black-box alpha (BB-α) is a new approximate inference method based on the minimization of α-divergences. BB-α scales to large datasets because it can be implemented using stochastic gradient descent.
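As context for the α-divergence family this entry and the second BB-α entry below refer to, one standard parameterisation (Amari's, as used in the message-passing literature) is sketched here; its limiting cases explain the VB and EP comparisons made in that entry's TLDR:

```latex
% Alpha-divergence between distributions p and q; alpha interpolates between
% the two KL divergences that variational Bayes and EP respectively minimise.
D_\alpha(p \,\|\, q) = \frac{1}{\alpha(1-\alpha)}
  \left( 1 - \int p(\theta)^{\alpha}\, q(\theta)^{1-\alpha}\, \mathrm{d}\theta \right),
\qquad
\lim_{\alpha \to 0} D_\alpha(p \,\|\, q) = \operatorname{KL}(q \,\|\, p),
\qquad
\lim_{\alpha \to 1} D_\alpha(p \,\|\, q) = \operatorname{KL}(p \,\|\, q).
```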
Streaming Sparse Gaussian Process Approximations
TLDR
A new principled framework for deploying Gaussian process probabilistic models in the streaming setting is developed, providing methods for learning hyperparameters and optimising pseudo-input locations.
A Unifying Framework for Gaussian Process Pseudo-Point Approximations using Power Expectation Propagation
TLDR
This paper develops a new pseudo-point approximation framework using Power Expectation Propagation (Power EP) that unifies a large number of existing pseudo-point approximations, and demonstrates that the new framework includes new pseudo-point approximation methods that outperform current approaches on regression and classification tasks.
Black-Box Alpha Divergence Minimization
TLDR
Experiments on probit regression and neural network regression and classification problems show that BB-α with non-standard settings of α usually produces better predictions than with α → 0 (VB) or α = 1 (EP).
Neural Graph Learning: Training Neural Networks Using Graphs
TLDR
The proposed joint training approach convincingly outperforms many existing methods on a wide range of tasks (multi-label classification on social graphs, news categorization, document classification and semantic intent classification), with multiple forms of graph inputs and using different types of neural networks.
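A hedged sketch of the kind of joint objective this TLDR describes: a standard supervised loss plus a graph term that penalises distance between the hidden representations of neighbouring nodes. All names and the (logits, embeddings) model interface are my own illustration, not the paper's code:

```python
# Illustrative joint loss for graph-regularised neural network training.
# Assumes a model that returns (logits, hidden embeddings) for node features.
import torch.nn.functional as F

def neural_graph_loss(model, x, y, edges, edge_weights, alpha=0.1):
    """x: [N, D] node features; y: [N] labels; edges: [E, 2] endpoint indices;
    edge_weights: [E] non-negative similarities; alpha: graph-term weight."""
    logits, hidden = model(x)
    supervised = F.cross_entropy(logits, y)  # usual label loss
    src, dst = edges[:, 0], edges[:, 1]
    # Weighted squared distance between each edge's endpoint embeddings.
    graph = (edge_weights * (hidden[src] - hidden[dst]).pow(2).sum(dim=1)).mean()
    return supervised + alpha * graph
```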
Tree-structured Gaussian Process Approximations
TLDR
This paper devises an approximation whose complexity grows linearly with the number of pseudo-datapoints, calibrates the approximation using a Kullback-Leibler (KL) minimization, and demonstrates the validity of the approach on a set of challenging regression tasks, including missing-data imputation for audio and spatial datasets.
Partitioned Variational Inference: A unified framework encompassing federated and continual learning
TLDR
A new framework is presented that explicitly acknowledges the algorithmic dimensions along which VI methods differ, unifies disparate literature, provides guidance on usage, and allows new ways of performing VI that are ideally suited to challenging learning scenarios, including federated learning and continual learning.
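A hedged sketch of the per-partition update such a unified framework revolves around (notation mine, following the general structure of approximate-factor methods): the posterior is represented as the prior times one learned factor per data partition, and refining a partition means swapping its factor for the true local likelihood and projecting back to the approximating family:

```latex
% Approximating distribution: prior times one factor per partition.
q(\theta) \propto p(\theta) \prod_{m=1}^{M} t_m(\theta)
% Local update for partition m, followed by the factor update.
q^{\text{new}}(\theta) = \operatorname*{arg\,min}_{q \in \mathcal{Q}}
  \operatorname{KL}\!\left( q(\theta) \,\middle\|\, \frac{1}{Z}\,
  \frac{q^{\text{old}}(\theta)\, p(\mathcal{D}_m \mid \theta)}{t_m^{\text{old}}(\theta)} \right),
\qquad
t_m^{\text{new}}(\theta) \propto t_m^{\text{old}}(\theta)\,
  \frac{q^{\text{new}}(\theta)}{q^{\text{old}}(\theta)}.
```

With partitions visited once in arrival order, this recursion reduces to the online update used in variational continual learning above, which is one way the framework unifies the two settings.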
Learning Stationary Time Series using Gaussian Processes with Nonparametric Kernels
TLDR
A novel variational free-energy approach based on inter-domain inducing variables is developed that efficiently learns the continuous-time linear filter and infers the driving white-noise process, leading to new Bayesian nonparametric approaches to spectrum estimation.