Publications
Pattern Recognition and Machine Learning
  • R. Neal
  • Computer Science, Mathematics
  • Technometrics
  • 1 August 2007
This book covers a broad range of topics in pattern recognition and machine learning, presents the material in a rigorous mathematical fashion, and will surely become an invaluable resource for researchers and graduate students in the field.
Bayesian learning for neural networks
Bayesian Learning for Neural Networks shows that Bayesian methods allow complex neural network models to be used without fear of the “overfitting” that can occur with traditional neural network learning methods.
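To make the claim concrete, here is a minimal sketch of the idea, assuming a toy regression problem and a simple random-walk Metropolis sampler rather than the hybrid Monte Carlo implementation the book actually develops: predictions are averaged over posterior weight samples instead of relying on a single fitted network.

```python
# Minimal sketch: Bayesian averaging over neural-network weights via
# random-walk Metropolis (the book uses hybrid/Hamiltonian Monte Carlo;
# this toy sampler and the data are illustrative assumptions only).
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data.
x = np.linspace(-3, 3, 20)
y = np.sin(x) + 0.1 * rng.standard_normal(20)

H = 5  # hidden units

def predict(w, x):
    # One-hidden-layer tanh network; w packs all weights and biases.
    w1, b1, w2, b2 = w[:H], w[H:2*H], w[2*H:3*H], w[3*H]
    return np.tanh(np.outer(x, w1) + b1) @ w2 + b2

def log_post(w):
    # Gaussian prior on weights + Gaussian likelihood (noise sd assumed known).
    resid = y - predict(w, x)
    return -0.5 * np.sum(w**2) - 0.5 * np.sum(resid**2) / 0.1**2

w = rng.standard_normal(3*H + 1) * 0.1
lp = log_post(w)
samples = []
for i in range(20000):
    prop = w + 0.02 * rng.standard_normal(w.size)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        w, lp = prop, lp_prop
    if i % 100 == 0:
        samples.append(w.copy())

# Posterior predictive mean: average predictions over weight samples,
# rather than committing to one "best" network.
pred = np.mean([predict(s, x) for s in samples[50:]], axis=0)
print(np.round(pred[:5], 2))
```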
MCMC Using Hamiltonian Dynamics
  • R. Neal
  • Mathematics, Physics
  • 10 May 2011
Hamiltonian dynamics can be used to produce distant proposals for the Metropolis algorithm, thereby avoiding the slow exploration of the state space that results from the diffusive behaviour of simple random-walk proposals.
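A minimal sketch of the technique, assuming a standard 2-D Gaussian target and illustrative step-size and trajectory-length settings: leapfrog integration of the fictitious dynamics produces a distant proposal, and a Metropolis test on the total energy corrects for discretization error.

```python
# Minimal Hamiltonian Monte Carlo sketch: leapfrog steps on an auxiliary
# momentum give distant proposals, accepted or rejected by a Metropolis
# test. The target and tuning values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

def U(q):       # potential energy = -log target; standard 2-D Gaussian here
    return 0.5 * np.sum(q**2)

def grad_U(q):
    return q

def hmc_step(q, eps=0.1, L=20):
    p = rng.standard_normal(q.size)           # fresh momentum
    q_new, p_new = q.copy(), p.copy()
    p_new -= 0.5 * eps * grad_U(q_new)        # leapfrog: half step in p
    for _ in range(L - 1):
        q_new += eps * p_new                  # full step in q
        p_new -= eps * grad_U(q_new)          # full step in p
    q_new += eps * p_new
    p_new -= 0.5 * eps * grad_U(q_new)        # final half step in p
    # Metropolis accept/reject on the total energy H = U + K.
    dH = (U(q_new) + 0.5 * p_new @ p_new) - (U(q) + 0.5 * p @ p)
    return q_new if np.log(rng.uniform()) < -dH else q

q = np.zeros(2)
draws = []
for _ in range(5000):
    q = hmc_step(q)
    draws.append(q)
print(np.cov(np.array(draws).T))  # should approach the identity matrix
```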
A View of the EM Algorithm that Justifies Incremental, Sparse, and Other Variants
An incremental variant of the EM algorithm in which the distribution for only one of the unobserved variables is recalculated in each E step is shown empirically to give faster convergence in a mixture estimation problem.
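The incremental variant lends itself to a short sketch; the two-component Gaussian mixture with known unit variances below is an illustrative assumption, not an example from the paper. Only one point's responsibility is recomputed per E step, and the sufficient statistics are adjusted by the difference.

```python
# Hedged sketch of incremental EM: the E step updates the responsibility
# of a single data point, sufficient statistics are adjusted by the
# difference, and the M step reuses them immediately.
import numpy as np

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(-2, 1, 100), rng.normal(3, 1, 100)])
n = x.size

mu = np.array([-1.0, 1.0])   # component means (to be estimated)
pi = np.array([0.5, 0.5])    # mixing proportions
r = np.full((n, 2), 0.5)     # per-point responsibilities

# Sufficient statistics implied by the current responsibilities.
S0 = r.sum(axis=0)           # sums of responsibilities
S1 = r.T @ x                 # responsibility-weighted sums of x

for sweep in range(20):
    for i in range(n):       # incremental E step: one point at a time
        lik = pi * np.exp(-0.5 * (x[i] - mu)**2)
        r_new = lik / lik.sum()
        S0 += r_new - r[i]   # adjust statistics by the difference
        S1 += (r_new - r[i]) * x[i]
        r[i] = r_new
        mu = S1 / S0         # M step after every point update
        pi = S0 / n

print(np.round(mu, 2), np.round(pi, 2))
```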
Markov Chain Sampling Methods for Dirichlet Process Mixture Models
This article reviews Markov chain methods for sampling from the posterior distribution of a Dirichlet process mixture model and presents two new classes of methods. One new approach is to make Metropolis–Hastings updates of the indicators specifying which mixture component is associated with each observation.
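A minimal sketch of Gibbs sampling over the indicators for a conjugate model (normal likelihood, normal base measure); the hyperparameters and data are illustrative assumptions, and the paper's algorithms, especially for non-conjugate models, differ in the details.

```python
# Hedged sketch: Gibbs sampling of cluster indicators for a conjugate
# Dirichlet process mixture, with cluster means integrated out.
import numpy as np

rng = np.random.default_rng(3)
y = np.concatenate([rng.normal(-4, 1, 30), rng.normal(4, 1, 30)])
n, alpha, sigma2, tau2 = y.size, 1.0, 1.0, 25.0

z = np.zeros(n, dtype=int)   # cluster indicator for each observation

def predictive(yi, members):
    # Posterior predictive density of yi for a cluster (up to a shared
    # constant), with the cluster mean integrated out under a N(0, tau2)
    # base measure and known observation variance sigma2.
    m = members.size
    post_var = 1.0 / (m / sigma2 + 1.0 / tau2)
    post_mean = post_var * members.sum() / sigma2
    v = sigma2 + post_var
    return np.exp(-0.5 * (yi - post_mean)**2 / v) / np.sqrt(v)

for sweep in range(100):
    for i in range(n):
        z[i] = -1                          # remove point i from its cluster
        labels = [c for c in np.unique(z) if c >= 0]
        probs = []
        for c in labels:
            members = y[z == c]
            probs.append(members.size * predictive(y[i], members))
        probs.append(alpha * predictive(y[i], np.empty(0)))  # new cluster
        probs = np.array(probs) / np.sum(probs)
        k = rng.choice(len(probs), p=probs)
        z[i] = labels[k] if k < len(labels) else max(labels, default=-1) + 1

print("clusters found:", len(np.unique(z)))
```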
Annealed importance sampling
  • R. Neal
  • Mathematics, Physics
  • Stat. Comput.
  • 8 March 1998
It is shown how one can use the Markov chain transitions for such an annealing sequence to define an importance sampler, which can be seen as a generalization of a recently proposed variant of sequential importance sampling.
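A hedged sketch of the method, assuming a one-dimensional unnormalized target and an illustrative temperature ladder: each run anneals a point from the tractable start distribution toward the target, accumulating an importance weight whose average estimates the normalizing constant.

```python
# Minimal annealed importance sampling sketch: a ladder of intermediate
# distributions interpolates from a tractable start to the target, with
# Metropolis transitions at each level. Ladder size, step size, and the
# target are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(4)

def log_f0(x):   # normalized N(0, 1) start distribution
    return -0.5 * x**2 - 0.5 * np.log(2 * np.pi)

def log_fT(x):   # unnormalized target: N(3, 0.25) without its constant
    return -0.5 * (x - 3.0)**2 / 0.25

betas = np.linspace(0, 1, 100)

def log_fj(x, b):
    return (1 - b) * log_f0(x) + b * log_fT(x)

def ais_run():
    x = rng.standard_normal()
    logw = 0.0
    for b_prev, b in zip(betas[:-1], betas[1:]):
        logw += log_fj(x, b) - log_fj(x, b_prev)   # ratio f_j / f_{j-1}
        for _ in range(5):                         # Metropolis at level b
            prop = x + 0.5 * rng.standard_normal()
            if np.log(rng.uniform()) < log_fj(prop, b) - log_fj(x, b):
                x = prop
    return logw

logw = np.array([ais_run() for _ in range(500)])
# The mean weight estimates the target's normalizing constant,
# here sqrt(2*pi*0.25) ~= 1.2533.
print(np.exp(logw.max()) * np.mean(np.exp(logw - logw.max())))
```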
Probabilistic Inference Using Markov Chain Monte Carlo Methods
The role of probabilistic inference in artificial intelligence is outlined, the theory of Markov chains is presented, and various Markov chain Monte Carlo algorithms are described, along with a number of supporting techniques.
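As a minimal illustration of one such Markov chain Monte Carlo algorithm, assuming a standard bivariate normal target: a Gibbs sampler alternately draws each coordinate from its exact conditional, so the chain's stationary distribution is the joint.

```python
# Minimal Gibbs sampling sketch; the bivariate normal target with
# correlation rho is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(5)
rho = 0.8
x = y = 0.0
draws = []
for _ in range(10000):
    # Exact conditionals of a standard bivariate normal with correlation rho.
    x = rng.normal(rho * y, np.sqrt(1 - rho**2))
    y = rng.normal(rho * x, np.sqrt(1 - rho**2))
    draws.append((x, y))
print(np.corrcoef(np.array(draws).T)[0, 1])  # should approach rho
```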
Near Shannon limit performance of low density parity check codes
The authors report the empirical performance of Gallager's low density parity check codes on Gaussian channels. They show that performance substantially better than that of standard convolutional and concatenated codes can be achieved.
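A simplified sketch of parity-check decoding, assuming a binary symmetric channel, the small (7,4) Hamming matrix as a stand-in for a sparse low density one, and Gallager-style bit-flipping in place of the sum-product decoding used in the paper.

```python
# Hedged sketch: bit-flipping decoding against a tiny parity-check matrix.
import numpy as np

# Parity-check matrix H of the (7,4) Hamming code (a stand-in for a
# sparse low density matrix).
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def decode(r, iters=10):
    r = r.copy()
    for _ in range(iters):
        syndrome = H @ r % 2
        if not syndrome.any():
            return r                  # all parity checks satisfied
        # Flip the bit involved in the most unsatisfied checks.
        votes = syndrome @ H
        r[np.argmax(votes)] ^= 1
    return r

codeword = np.zeros(7, dtype=int)     # the all-zero codeword is valid
received = codeword.copy()
received[2] ^= 1                      # one channel bit-flip
print(decode(received))               # recovers the all-zero codeword
```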
Arithmetic coding for data compression
The state of the art in data compression is arithmetic coding, not the better-known Huffman method. Arithmetic coding gives greater compression, is faster for adaptive models, and clearly separates the model from the channel encoding.
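A minimal sketch of the interval-narrowing idea, using exact fractions and a fixed three-symbol model as illustrative assumptions; the paper's contribution is an incremental fixed-precision implementation that this sketch deliberately sidesteps.

```python
# Hedged sketch of arithmetic coding: each symbol narrows a subinterval
# of [0, 1) in proportion to its probability; any number in the final
# interval identifies the whole message.
from fractions import Fraction

probs = {"a": Fraction(1, 2), "b": Fraction(1, 4), "c": Fraction(1, 4)}

def cum(symbol):
    # Cumulative range [lo, hi) assigned to `symbol` by the model.
    lo = Fraction(0)
    for s, p in probs.items():
        if s == symbol:
            return lo, lo + p
        lo += p

def encode(msg):
    lo, hi = Fraction(0), Fraction(1)
    for s in msg:
        c_lo, c_hi = cum(s)
        lo, hi = lo + (hi - lo) * c_lo, lo + (hi - lo) * c_hi
    return (lo + hi) / 2       # any number inside the final interval

def decode(code, length):
    out, lo, hi = [], Fraction(0), Fraction(1)
    for _ in range(length):
        for s in probs:
            c_lo, c_hi = cum(s)
            s_lo, s_hi = lo + (hi - lo) * c_lo, lo + (hi - lo) * c_hi
            if s_lo <= code < s_hi:
                out.append(s)
                lo, hi = s_lo, s_hi
                break
    return "".join(out)

code = encode("abcab")
print(code, decode(code, 5))   # round-trips to "abcab"
```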
Connectionist Learning of Belief Networks
  • R. Neal
  • Computer Science
  • Artif. Intell.
  • 1 July 1992
The “Gibbs sampling” simulation procedure for “sigmoid” and “noisy-OR” varieties of probabilistic belief networks can support maximum-likelihood learning from empirical data through local gradient ascent.
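A minimal sketch of the Gibbs sampling procedure for a tiny sigmoid belief network, with randomly chosen weights as an illustrative assumption: each hidden unit is resampled from its conditional given its Markov blanket, yielding posterior marginals of the kind that could drive the gradient-ascent learning described.

```python
# Hedged sketch: Gibbs sampling of hidden units in a two-layer sigmoid
# belief network, conditioned on an observed visible vector.
import numpy as np

rng = np.random.default_rng(6)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

n_hidden, n_visible = 4, 6
W = rng.normal(0, 1, (n_visible, n_hidden))   # hidden -> visible weights
b = rng.normal(0, 1, n_hidden)                # hidden biases
c = rng.normal(0, 1, n_visible)               # visible biases

v = rng.integers(0, 2, n_visible)             # an observed visible vector
h = rng.integers(0, 2, n_hidden)              # initial hidden state

def log_lik_visible(h):
    # log p(v | h) under the sigmoid generative model.
    p = sigmoid(W @ h + c)
    return np.sum(v * np.log(p) + (1 - v) * np.log(1 - p))

counts = np.zeros(n_hidden)
for sweep in range(2000):
    for j in range(n_hidden):                 # Gibbs update of one unit
        logp = np.empty(2)
        for val in (0, 1):
            h[j] = val
            # log prior of h_j up to a constant shared by both values.
            logp[val] = val * b[j] + log_lik_visible(h)
        p1 = 1.0 / (1.0 + np.exp(logp[0] - logp[1]))
        h[j] = int(rng.uniform() < p1)
    counts += h
print(counts / 2000)    # estimated posterior marginals p(h_j = 1 | v)
```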