Publications
Pattern Recognition and Machine Learning
This textbook offers a comprehensive introduction to pattern recognition and machine learning from a Bayesian viewpoint, covering topics such as graphical models, kernel methods, sampling, and approximate inference, and is aimed at advanced undergraduates, graduate students, and researchers.
MCMC Using Hamiltonian Dynamics
Hamiltonian dynamics can be used to produce distant proposals for the Metropolis algorithm, thereby avoiding the slow exploration of the state space that results from the diffusive behaviour of simple random-walk proposals.
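As a concrete illustration of the idea, here is a minimal sketch of one Hamiltonian Monte Carlo update using the leapfrog integrator with an identity mass matrix; the step size, trajectory length, and 2-D Gaussian target are arbitrary choices for the example, not values from the paper.

```python
import numpy as np

def hmc_step(x, log_prob, grad_log_prob, step_size=0.1, n_leapfrog=20, rng=np.random):
    """One HMC update: resample momentum, simulate Hamiltonian dynamics
    with the leapfrog integrator, then accept or reject to correct for
    discretization error."""
    p = rng.standard_normal(x.shape)                    # fresh momentum
    x_new, p_new = x.copy(), p.copy()
    p_new += 0.5 * step_size * grad_log_prob(x_new)     # half step for momentum
    for _ in range(n_leapfrog - 1):
        x_new += step_size * p_new                      # full step for position
        p_new += step_size * grad_log_prob(x_new)       # full step for momentum
    x_new += step_size * p_new
    p_new += 0.5 * step_size * grad_log_prob(x_new)     # final half step
    # Metropolis test on the change in total energy H = -log_prob + |p|^2 / 2
    h_old = -log_prob(x) + 0.5 * p @ p
    h_new = -log_prob(x_new) + 0.5 * p_new @ p_new
    return x_new if np.log(rng.uniform()) < h_old - h_new else x

# Example target: a standard 2-D Gaussian.
log_prob = lambda x: -0.5 * x @ x
grad_log_prob = lambda x: -x
x, samples = np.zeros(2), []
for _ in range(1000):
    x = hmc_step(x, log_prob, grad_log_prob)
    samples.append(x)
```

Because the leapfrog trajectory can travel far from the current point while approximately conserving energy, acceptance rates stay high even for distant proposals, which is the behaviour the paper contrasts with random-walk diffusion.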
Bayesian Learning for Neural Networks
Bayesian Learning for Neural Networks shows that Bayesian methods allow complex neural network models to be used without fear of the "overfitting" that can occur with traditional neural network learning methods.
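A minimal sketch of what Bayesian prediction means in this setting, assuming weight samples from the posterior are already available (e.g. from an MCMC run); the tiny one-hidden-layer network and the names mlp_predict and predictive_mean are illustrative, not from the book.

```python
import numpy as np

def mlp_predict(x, w):
    """One-hidden-layer network for 1-D input and output; w packs
    (W1, b1, w2, b2) with H hidden tanh units."""
    W1, b1, w2, b2 = w
    h = np.tanh(np.outer(x, W1) + b1)   # (n, H) hidden activations
    return h @ w2 + b2                  # (n,) outputs

def predictive_mean(x, posterior_samples):
    """Bayesian prediction: average the network output over posterior
    weight samples instead of committing to one fitted weight vector."""
    return np.mean([mlp_predict(x, w) for w in posterior_samples], axis=0)

# Hypothetical usage, given `posterior_samples` collected by a sampler:
#   y_hat = predictive_mean(x_test, posterior_samples)
```

Averaging over the posterior rather than maximizing it is what tempers overfitting: no single weight vector is trusted completely, so fits that the data do not strongly support contribute little to the prediction.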
Markov Chain Sampling Methods for Dirichlet Process Mixture Models
This article reviews Markov chain methods for sampling from the posterior distribution of a Dirichlet process mixture model and presents two new classes of methods. One new approach is to make Metropolis-Hastings updates of the indicators specifying which mixture component is associated with each observation, perhaps supplemented with a partial form of Gibbs sampling.
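A rough sketch of one collapsed Gibbs sweep over the cluster indicators for a 1-D Gaussian DP mixture with known variance, in the spirit of the conjugate-case samplers the article reviews (its Metropolis-Hastings proposals target the non-conjugate case); the prior and parameter values here are assumptions for the example.

```python
import numpy as np

def dpmm_gibbs_sweep(x, z, alpha=1.0, sigma2=1.0, mu0=0.0, tau2=10.0, rng=np.random):
    """One Gibbs sweep over indicators z (an int array) for a 1-D Gaussian
    DP mixture with known variance sigma2, a N(mu0, tau2) prior on cluster
    means, and the means integrated out."""
    for i in range(len(x)):
        z[i] = -1                                    # remove point i from its cluster
        labels = sorted(k for k in set(z.tolist()) if k >= 0)
        probs, choices = [], []
        for k in labels:
            members = x[z == k]
            m = len(members)
            # posterior predictive density of x[i] given cluster k's members
            post_var = 1.0 / (1.0 / tau2 + m / sigma2)
            post_mean = post_var * (mu0 / tau2 + members.sum() / sigma2)
            var = post_var + sigma2
            probs.append(m * np.exp(-0.5 * (x[i] - post_mean) ** 2 / var) / np.sqrt(var))
            choices.append(k)
        var0 = tau2 + sigma2                         # prior predictive for a new cluster
        probs.append(alpha * np.exp(-0.5 * (x[i] - mu0) ** 2 / var0) / np.sqrt(var0))
        choices.append(max(labels, default=-1) + 1)
        probs = np.asarray(probs)
        z[i] = rng.choice(choices, p=probs / probs.sum())
    return z

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 50), rng.normal(4, 1, 50)])
z = np.zeros(len(x), dtype=int)
for _ in range(100):
    z = dpmm_gibbs_sweep(x, z, rng=rng)
```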
A View of the Em Algorithm that Justifies Incremental, Sparse, and other Variants
An incremental variant of the EM algorithm in which the distribution for only one of the unobserved variables is recalculated in each E step is shown empirically to give faster convergence in a mixture estimation problem.
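A minimal sketch of the incremental variant for a 1-D mixture of Gaussians with unit component variances: the E step recalculates the responsibilities of a single data point and folds them into running sufficient statistics, so the M-step quantities stay current without a full pass over the data. The model and initialization choices are assumptions for the example.

```python
import numpy as np

def incremental_em(x, K=2, n_sweeps=10, rng=np.random):
    """Incremental EM for a 1-D Gaussian mixture with unit variances."""
    n = len(x)
    R = rng.dirichlet(np.ones(K), size=n)    # responsibilities, one row per point
    S0 = R.sum(axis=0)                       # running sums of responsibilities
    S1 = R.T @ x                             # running sums of responsibility * x
    for _ in range(n_sweeps):
        for i in range(n):
            pi, mu = S0 / S0.sum(), S1 / S0               # current M-step estimates
            S0 -= R[i]; S1 -= R[i] * x[i]                 # retract point i's old stats
            r = pi * np.exp(-0.5 * (x[i] - mu) ** 2)      # E step for point i only
            R[i] = r / r.sum()
            S0 += R[i]; S1 += R[i] * x[i]                 # fold in the new stats
    return S0 / S0.sum(), S1 / S0                         # mixing weights, means

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-3, 1, 100), rng.normal(3, 1, 100)])
weights, means = incremental_em(x, rng=rng)
```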
Annealed importance sampling
It is shown how one can use the Markov chain transitions for such an annealing sequence to define an importance sampler, which can be seen as a generalization of a recently proposed variant of sequential importance sampling.
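A small sketch of the construction, assuming a geometric path from a N(0, 1) base density f0 to an unnormalized N(3, 0.5^2) target f1, with random-walk Metropolis as the transition leaving each intermediate density invariant; the schedule and targets are toy choices, not from the paper.

```python
import numpy as np

def ais_log_weight(n_temps=50, n_mh=5, rng=np.random):
    """One AIS run: move a point through the annealing sequence, accumulating
    the log importance weight that corrects for each temperature change."""
    log_f0 = lambda y: -0.5 * y ** 2                    # unnormalized N(0, 1)
    log_f1 = lambda y: -0.5 * ((y - 3.0) / 0.5) ** 2    # unnormalized N(3, 0.25)
    betas = np.linspace(0.0, 1.0, n_temps)
    x = rng.standard_normal()                           # exact draw from f0
    log_w = 0.0
    for b_prev, b in zip(betas[:-1], betas[1:]):
        log_fb = lambda y, b=b: (1 - b) * log_f0(y) + b * log_f1(y)
        log_w += log_fb(x) - ((1 - b_prev) * log_f0(x) + b_prev * log_f1(x))
        for _ in range(n_mh):                           # transitions invariant for f_b
            prop = x + 0.5 * rng.standard_normal()
            if np.log(rng.uniform()) < log_fb(prop) - log_fb(x):
                x = prop
    return log_w

# The mean of exp(log_w) over runs estimates Z1 / Z0; here Z0 = sqrt(2*pi)
# and Z1 = 0.5 * sqrt(2*pi), so the true ratio is 0.5.
log_ws = [ais_log_weight() for _ in range(500)]
print(np.mean(np.exp(log_ws)))
```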
Probabilistic Inference Using Markov Chain Monte Carlo Methods
The role of probabilistic inference in artificial intelligence is outlined, the theory of Markov chains is presented, and various Markov chain Monte Carlo algorithms are described, along with a number of supporting techniques.
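Among the algorithms such a review covers, random-walk Metropolis is the simplest; here is a minimal sketch, with a correlated 2-D Gaussian as an illustrative target.

```python
import numpy as np

def metropolis(log_prob, x0, n_samples=5000, step=1.0, rng=np.random):
    """Random-walk Metropolis: perturb the state with Gaussian noise and
    accept with probability min(1, p(x') / p(x)); the resulting Markov
    chain has exp(log_prob) as its stationary distribution."""
    x, out = np.asarray(x0, dtype=float), []
    for _ in range(n_samples):
        prop = x + step * rng.standard_normal(x.shape)
        if np.log(rng.uniform()) < log_prob(prop) - log_prob(x):
            x = prop
        out.append(x.copy())
    return np.array(out)

# Example: a bivariate Gaussian with correlation 0.8.
prec = np.linalg.inv(np.array([[1.0, 0.8], [0.8, 1.0]]))
samples = metropolis(lambda x: -0.5 * x @ prec @ x, x0=[0.0, 0.0])
```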
Arithmetic coding for data compression
The state of the art in data compression is arithmetic coding, not the better-known Huffman method. Arithmetic coding gives greater compression, is faster for adaptive models, and clearly separates the model from the channel encoding.
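A toy sketch of the interval-narrowing principle with a fixed (non-adaptive) three-symbol model; practical coders use integer arithmetic with bit-level renormalization rather than floats, so this is illustrative only.

```python
def cum_ranges(probs):
    """Map each symbol to its subinterval of [0, 1)."""
    lo, ranges = 0.0, {}
    for sym, p in probs.items():
        ranges[sym] = (lo, lo + p)
        lo += p
    return ranges

def encode(message, probs):
    """Narrow [0, 1) once per symbol; any number in the final interval
    identifies the whole message."""
    low, high, ranges = 0.0, 1.0, cum_ranges(probs)
    for sym in message:
        span = high - low
        s_lo, s_hi = ranges[sym]
        low, high = low + span * s_lo, low + span * s_hi
    return (low + high) / 2

def decode(code, n, probs):
    """Invert the narrowing: find the symbol whose subinterval contains
    the code, then rescale and repeat."""
    ranges, out = cum_ranges(probs), []
    for _ in range(n):
        for sym, (s_lo, s_hi) in ranges.items():
            if s_lo <= code < s_hi:
                out.append(sym)
                code = (code - s_lo) / (s_hi - s_lo)
                break
    return "".join(out)

probs = {"a": 0.5, "b": 0.3, "c": 0.2}
assert decode(encode("abac", probs), 4, probs) == "abac"
```

The separation the abstract mentions is visible here: the probability model enters only through cum_ranges, so an adaptive model could be swapped in without touching the interval arithmetic.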
Near Shannon limit performance of low density parity check codes
The authors report the empirical performance of Gallager's low density parity check codes on Gaussian channels. They show that performance substantially better than that of standard convolutional and concatenated codes can be achieved; indeed, the performance is almost as close to the Shannon limit as that of turbo codes.
Slice Sampling
Markov chain sampling methods that adapt to characteristics of the distribution being sampled can be constructed using the principle that one can sample from a distribution by sampling uniformly from the region under the plot of its density function.
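A minimal univariate sketch using the stepping-out and shrinkage procedures the paper describes; the initial width w and the Gaussian example target are arbitrary choices.

```python
import numpy as np

def slice_sample(log_prob, x0, n_samples=1000, w=1.0, rng=np.random):
    """Univariate slice sampling: draw a height uniformly under the density
    at the current point, step out to bracket the horizontal slice at that
    height, then shrink the bracket until a point inside the slice is drawn."""
    x, out = float(x0), []
    for _ in range(n_samples):
        log_y = log_prob(x) + np.log(rng.uniform())   # slice level (log scale)
        lo = x - w * rng.uniform()                    # randomly placed bracket
        hi = lo + w
        while log_prob(lo) > log_y:                   # step out left
            lo -= w
        while log_prob(hi) > log_y:                   # step out right
            hi += w
        while True:                                   # shrink until inside the slice
            x_new = rng.uniform(lo, hi)
            if log_prob(x_new) > log_y:
                x = x_new
                break
            if x_new < x:
                lo = x_new
            else:
                hi = x_new
        out.append(x)
    return np.array(out)

# Example: sample from a standard Gaussian.
samples = slice_sample(lambda x: -0.5 * x * x, x0=0.0)
```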
...