We marry ideas from deep neural networks and approximate Bayesian inference to derive a generalised class of deep, directed generative models, endowed with a new algorithm for scalable inference and learning. Our algorithm introduces a recognition model to represent an approximate posterior distribution and uses this for optimisation of a variational lower bound.
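As a concrete illustration of the recognition-model idea, here is a minimal sketch of a one-sample Monte Carlo estimate of the variational lower bound, assuming a diagonal-Gaussian approximate posterior and using the reparameterisation z = mu + sigma * eps that makes gradient-based (stochastic backpropagation) training possible. The `encode` and `decode` callables are hypothetical stand-ins, not functions from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def elbo_estimate(x, encode, decode, n_samples=1):
    # encode: x -> (mu, log_var) of the Gaussian recognition model q(z | x)
    # decode: (x, z) -> log p(x | z) under the generative model
    mu, log_var = encode(x)
    std = np.exp(0.5 * log_var)
    # Reparameterise: z = mu + std * eps with eps ~ N(0, I), so gradients
    # can flow through mu and log_var rather than through the sampling step.
    eps = rng.standard_normal((n_samples, mu.shape[0]))
    z = mu + std * eps
    log_lik = np.mean([decode(x, z_i) for z_i in z])
    # KL(q(z | x) || N(0, I)) has a closed form for diagonal Gaussians.
    kl = 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)
    return log_lik - kl

# Toy usage with linear-Gaussian stand-ins for the two models.
encode = lambda x: (0.5 * x, np.zeros_like(x))
decode = lambda x, z: -0.5 * np.sum((x - z) ** 2)
print(elbo_estimate(np.ones(4), encode, decode, n_samples=10))
```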
The ever-increasing size of modern data sets, combined with the difficulty of obtaining label information, has made semi-supervised learning a problem of significant practical importance in modern data analysis. We revisit the approach to semi-supervised learning with generative models and develop new models that allow for effective generalisation from small labelled data sets to large unlabelled ones.
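To make the shape of such an objective concrete, one common formulation from the semi-supervised deep generative model literature treats the label as observed when available and as a latent variable otherwise; the notation below is generic and is our assumption rather than something stated in the excerpt.

```latex
% Variational bound for a labelled pair (x, y):
-\mathcal{L}(x, y) =
  \mathbb{E}_{q_\phi(z \mid x, y)}\!\left[
    \log p_\theta(x \mid y, z) + \log p_\theta(y) + \log p(z)
    - \log q_\phi(z \mid x, y) \right]
  \le \log p_\theta(x, y)

% For an unlabelled x, the unknown label is marginalised out:
-\mathcal{U}(x) =
  \sum_{y} q_\phi(y \mid x)\,\bigl(-\mathcal{L}(x, y)\bigr)
  + \mathcal{H}\!\left[ q_\phi(y \mid x) \right]
  \le \log p_\theta(x)
```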
The classification of protein sequences into families is an important tool in the annotation of structural and functional properties to newly discovered proteins. We present a classification system that uses pattern recognition techniques to create a numerical vector representation of a protein sequence and then classifies the sequence into one of a number of given families.
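The excerpt does not specify which vector representation or classifier the system uses, so the following is only a generic sketch of the pipeline it describes: amino-acid composition features followed by a nearest-centroid family assignment. All names here are illustrative.

```python
from collections import Counter

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def composition_vector(seq):
    """20-d vector of amino-acid frequencies: one simple numerical
    representation of a protein sequence."""
    counts = Counter(seq.upper())
    return [counts[aa] / max(len(seq), 1) for aa in AMINO_ACIDS]

def nearest_family(vec, family_centroids):
    """Assign a sequence to the family whose centroid vector is
    closest in squared Euclidean distance."""
    def dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(vec, centroid))
    return min(family_centroids, key=lambda fam: dist(family_centroids[fam]))
```

In this sketch, `family_centroids` would be built by averaging `composition_vector` over the labelled sequences of each known family.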
Nonparametric Bayesian models provide a framework for flexible probabilistic modelling of complex datasets. Unfortunately, the high-dimensional averages required for Bayesian methods can be slow, especially with the unbounded representations used by nonparametric models. We address the challenge of scaling Bayesian inference to the increasingly large datasets encountered in practice.
The use of L1 regularisation for sparse learning has generated immense research interest, with many successful applications in diverse areas such as signal acquisition, image coding, genomics and collaborative filtering. While existing work highlights the many advantages of L1 methods, in this paper we find that L1 regularisation often dramatically underperforms in terms of predictive performance when compared with other methods for inferring sparsity.
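To fix notation for the comparison this abstract alludes to: L1-regularised estimation solves a penalised optimisation problem, whereas the Bayesian route to sparsity typically replaces the penalty with a sparsity-inducing prior and averages over the posterior. Both formulas below are standard textbook forms, not expressions taken from the paper.

```latex
% L1 (lasso-style) point estimate:
\hat{w} = \arg\min_{w} \; -\log p(\mathcal{D} \mid w) + \lambda \lVert w \rVert_1

% One Bayesian alternative: a spike-and-slab prior on each weight w_d,
% with posterior averaging in place of optimisation:
p(w_d) = (1 - \pi)\,\delta_0(w_d) + \pi\,\mathcal{N}(w_d \mid 0, \sigma^2)
```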
Motivation: Sparsity has become an important area of research in machine learning and statistics for a number of reasons: the statistics of real data are often sparse; sparsity provides good regularisation, thus avoiding overfitting; sparsity can be exploited for faster computation; and sparsity is central to problems in compressed sensing.