Bayesian learning for neural networks

@inproceedings{Hinton1995BayesianLF,
  title={Bayesian learning for neural networks},
  author={Geoffrey E. Hinton and Radford M. Neal},
  year={1995}
}
From the Publisher: Artificial "neural networks" are now widely used as flexible models for regression and classification applications, but questions remain regarding what these models mean, and how they can safely be used when training data is limited. Bayesian Learning for Neural Networks shows that Bayesian methods allow complex neural network models to be used without fear of the "overfitting" that can occur with traditional neural network learning methods. Insight into the nature of these…
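
The book's central move can be sketched in a few lines of code: rather than committing to a single trained weight vector, predictions are averaged over the posterior distribution of the weights. Below is a minimal illustration (not from the book) using a one-parameter model small enough to integrate the posterior exactly on a grid; all names and constants are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 5)                  # tiny training set
y = 2.0 * x + rng.normal(0, 0.3, 5)        # noisy targets from y = 2x

sigma_noise, sigma_prior = 0.3, 1.0
w_grid = np.linspace(-5, 5, 2001)
dw = w_grid[1] - w_grid[0]

# Unnormalized log posterior = log prior + log likelihood
log_prior = -0.5 * (w_grid / sigma_prior) ** 2
log_lik = np.array([-0.5 * np.sum((y - w * x) ** 2) / sigma_noise**2
                    for w in w_grid])
log_post = log_prior + log_lik
post = np.exp(log_post - log_post.max())
post /= post.sum() * dw                    # normalize on the grid

x_new = 0.5
w_map = w_grid[np.argmax(post)]            # the single "best" network
pred_map = w_map * x_new
pred_bayes = np.sum(post * w_grid * dw) * x_new   # average over the posterior

print(f"MAP prediction: {pred_map:.3f}  Bayes prediction: {pred_bayes:.3f}")

For a real network this integral is intractable, which is why the book and the papers listed below turn to Monte Carlo methods and Gaussian approximations.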
A position paper on statistical inference techniques which integrate neural network and Bayesian network models
  • W. Hsu
  • Computer Science
  • Proceedings of International Conference on Neural Networks (ICNN'97)
  • 1997
The Gibbs sampler is presented, both in its successful role as a convergence heuristic derived from statistical physics and under its probabilistic learning interpretation, and it is shown how the Bayesian network formalism informs the causal reasoning interpretation of some neural networks.
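
For readers who want the mechanics of the Gibbs sampler mentioned above, here is a minimal sketch (not from the paper): a correlated bivariate Gaussian sampled by alternately drawing each coordinate from its exact full conditional; the target and all constants are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(1)
rho = 0.8                 # target: zero-mean bivariate Gaussian, unit variances
n_samples = 5000
x, y = 0.0, 0.0
samples = np.empty((n_samples, 2))
for t in range(n_samples):
    # Alternate exact draws from each full conditional distribution
    x = rng.normal(rho * y, np.sqrt(1 - rho**2))
    y = rng.normal(rho * x, np.sqrt(1 - rho**2))
    samples[t] = x, y

print("empirical correlation:", np.corrcoef(samples.T)[0, 1])  # ~0.8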
Bayesian Regularization of Neural Networks
This chapter outlines the equations that define the BRANN method, plus a flowchart for producing a BRANN-QSAR model; results of the use of BRANNs on a number of data sets are illustrated and compared with other linear and nonlinear models.
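
The BRANN method referred to above rests on minimizing a regularized objective of the form F = beta*E_D + alpha*E_W. Below is a rough sketch with the hyperparameters held fixed (BRANN re-estimates them from the evidence, which is omitted here); the tiny tanh network and all constants are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, (50, 1))
y = np.sin(3 * X) + rng.normal(0, 0.1, (50, 1))

H = 8                                      # hidden tanh units
W1 = rng.normal(0, 0.5, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.5, (H, 1)); b2 = np.zeros(1)
alpha, beta = 1e-2, 1.0                    # weight / noise precisions, held fixed

lr = 0.05
for step in range(2000):
    h = np.tanh(X @ W1 + b1)               # forward pass
    pred = h @ W2 + b2
    err = pred - y
    # Gradient of F = beta*E_D + alpha*E_W, with E_D = mean squared error / 2
    # and E_W = sum of squared weights / 2 (biases left unpenalized)
    g_pred = beta * err / len(X)
    gW2 = h.T @ g_pred + alpha * W2
    gb2 = g_pred.sum(0)
    g_h = (g_pred @ W2.T) * (1 - h**2)
    gW1 = X.T @ g_h + alpha * W1
    gb1 = g_h.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

pred = np.tanh(X @ W1 + b1) @ W2 + b2
print("train RMSE:", float(np.sqrt(np.mean((pred - y) ** 2))))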
Hamiltonian Monte Carlo based on evidence framework for Bayesian learning to neural network
This paper trains the network weights by means of Hamiltonian Monte Carlo (HMC) and proposes to sample from the posterior distribution using HMC in order to approximate the derivative of the evidence, which allows the hyperparameters to be re-estimated.
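
A compact sketch of the HMC kernel such methods build on: leapfrog integration of Hamiltonian dynamics followed by a Metropolis accept/reject on the total energy. The toy Gaussian target here stands in for a network's log posterior; step size and trajectory length are illustrative.

import numpy as np

rng = np.random.default_rng(3)
P = np.array([[2.0, 0.6], [0.6, 1.0]])    # toy target: Gaussian precision matrix

def log_prob(w):
    return -0.5 * w @ P @ w

def grad_log_prob(w):
    return -P @ w

def hmc_step(w, step_size=0.15, n_leapfrog=20):
    p = rng.normal(size=w.shape)                        # fresh momentum
    w_new, p_new = w.copy(), p.copy()
    p_new += 0.5 * step_size * grad_log_prob(w_new)     # half momentum step
    for _ in range(n_leapfrog - 1):
        w_new += step_size * p_new
        p_new += step_size * grad_log_prob(w_new)
    w_new += step_size * p_new
    p_new += 0.5 * step_size * grad_log_prob(w_new)     # final half step
    # Metropolis test on the total energy (potential + kinetic)
    h_old = -log_prob(w) + 0.5 * p @ p
    h_new = -log_prob(w_new) + 0.5 * p_new @ p_new
    return w_new if np.log(rng.uniform()) < h_old - h_new else w

w = np.zeros(2)
samples = np.empty((2000, 2))
for t in range(2000):
    w = hmc_step(w)
    samples[t] = w
print("sample covariance:\n", np.cov(samples.T))        # should approach inv(P)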
Bayesian approach for neural networks--review and case studies
In the most thoroughly analyzed regression problem, the best models were those with less restrictive priors, which underlines a major advantage of the Bayesian approach: one is not forced to guess attributes that are unknown, such as the number of degrees of freedom in the model.
Evolution programs for Bayesian training of neural networks
It is shown that Evolution Programs can be used to search the weight space for Bayesian training of a neural network used as a classifier, and that the generalization to regression problems is straightforward.
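
As a rough illustration of searching weight space with an evolutionary method, the sketch below runs a simple mutate-and-select loop that climbs a log posterior (a MAP search, not the paper's full Bayesian scheme); the logistic model, population size and mutation scale are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(100, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)     # linearly separable labels

def log_posterior(w):
    # Bernoulli likelihood of a logistic classifier + Gaussian prior on w
    logits = X @ w
    log_lik = np.sum(y * logits - np.logaddexp(0, logits))
    return log_lik - 0.5 * np.sum(w**2)

pop = rng.normal(size=(20, 2))                # initial population of weight vectors
for gen in range(100):
    fitness = np.array([log_posterior(w) for w in pop])
    parents = pop[np.argsort(fitness)[-5:]]   # keep the 5 fittest
    pop = np.repeat(parents, 4, axis=0) + rng.normal(0, 0.1, (20, 2))

best = pop[np.argmax([log_posterior(w) for w in pop])]
print("best weights:", best)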
The practicalities of scaling Bayesian neural networks to real-world applications
The emphasis is on how to achieve calibrated uncertainty estimates without compromising scalability, and a new method for implementing Bayesian neural networks within the framework of Bayesian decision theory is offered.
Bayesian Inference In Neural Networks
Some aspects of the Bayesian method are illustrated in a 2-group classification problem and then applied, at a rudimentary level, to the development of a neural network for the prediction of tornados.
Can Statistical Theory Help Us Use Neural Networks Better
If neural nets are viewed as a class of statistical models with high-dimensional parameters, the ideas of statistical theory, in particular ideas for model choice and the concepts of predictive Bayesian inference, give considerable insight and enable more powerful solutions with reduced computational load.
On the Use of Bayesian Methods for Evaluating Compartmental Neural Models
While the Bayesian methodology can be applied to any type of model, its use is outlined here, as an example, for an important, and increasingly standard, class of models in computational neuroscience: compartmental models of single neurons.
Probable networks and plausible predictions - a review of practical Bayesian methods for supervised neural networks
Practical techniques based on Gaussian approximations for implementing these powerful methods for controlling, comparing and using adaptive networks are described.
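
The Gaussian approximations this review describes are Laplace-style: a Gaussian centred at the most probable weights whose covariance is the inverse Hessian of the negative log posterior. For the linear-Gaussian toy below the approximation happens to be exact; for a network the same formulas would be applied at a trained weight vector. All constants are illustrative.

import numpy as np

rng = np.random.default_rng(5)
X = np.column_stack([np.ones(30), rng.uniform(-1, 1, 30)])   # bias + input
w_true = np.array([0.5, -1.5])
y = X @ w_true + rng.normal(0, 0.2, 30)
alpha, beta = 1.0, 25.0          # prior precision, noise precision (1/0.2**2)

# MAP estimate and Hessian of the negative log posterior
H = alpha * np.eye(2) + beta * X.T @ X
w_map = beta * np.linalg.solve(H, X.T @ y)
cov = np.linalg.inv(H)           # Gaussian approximation: N(w_map, cov)

x_new = np.array([1.0, 0.3])
mean = x_new @ w_map
var = 1 / beta + x_new @ cov @ x_new   # noise term + weight-uncertainty term
print(f"prediction {mean:.3f} +/- {np.sqrt(var):.3f}")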

References

Showing 1-10 of 93 references
Bayesian Training of Backpropagation Networks by the Hybrid Monte Carlo Method
It is shown that Bayesian training of backpropagation neural networks can feasibly be performed by the Hybrid Monte Carlo method, and the method has been applied to a test problem, demonstrating that it can produce good predictions as well as an indication of the uncertainty of these predictions.
A Practical Bayesian Framework for Backpropagation Networks
  • D. MacKay
  • Mathematics, Computer Science
  • Neural Computation
  • 1992
A quantitative and practical Bayesian framework is described for learning mappings in feedforward networks; it automatically embodies "Occam's razor," penalizing overflexible and overcomplex models.
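
MacKay's framework also re-estimates hyperparameters by maximizing the evidence. Below is a sketch of the re-estimation loop for a linear model, where the MAP weights have a closed form (a network would be re-trained at each step), using the standard updates alpha <- gamma/(2*E_W) and beta <- (N-gamma)/(2*E_D), with gamma the effective number of well-determined parameters; the data and initial values are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(6)
X = rng.uniform(-1, 1, (40, 3))
w_true = np.array([2.0, 0.0, -1.0])
y = X @ w_true + rng.normal(0, 0.1, 40)
N, k = X.shape
alpha, beta = 1.0, 1.0                      # crude initial guesses

for it in range(50):
    # MAP weights for the current hyperparameters (linear model: closed form)
    A = alpha * np.eye(k) + beta * X.T @ X  # Hessian of the neg. log posterior
    w = beta * np.linalg.solve(A, X.T @ y)
    lam = np.linalg.eigvalsh(beta * X.T @ X)
    gamma = np.sum(lam / (lam + alpha))     # effective number of parameters
    E_W = 0.5 * w @ w
    E_D = 0.5 * np.sum((y - X @ w) ** 2)
    alpha = gamma / (2 * E_W)               # MacKay's re-estimation formulas
    beta = (N - gamma) / (2 * E_D)

print(f"alpha={alpha:.3f}, beta={beta:.1f} (true noise precision = 100)")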
Ace of Bayes: Application of Neural Networks with Pruning
MacKay's Bayesian framework for backpropagation is a practical and powerful means of improving the generalisation ability of neural networks. The framework is reviewed and extended in a pedagogical…
On the Use of Evidence in Neural Networks
It turns out that the evidence procedure's MAP estimate for neural nets is, in toto, approximation error, and the exact result neither has to be re-calculated for every new data set nor requires the running of computer code.
The Evidence Framework Applied to Classification Networks
  • D. MacKay
  • Mathematics, Computer Science
  • Neural Computation
  • 1992
It is demonstrated that the Bayesian framework for model comparison described for regression models in MacKay (1992a,b) can also be applied to classification problems, and an information-based data selection criterion is derived and demonstrated within this framework.
Keeping the neural networks simple by minimizing the description length of the weights
A method is described for computing the derivatives of the expected squared error and of the amount of information in the noisy weights in a network that contains a layer of non-linear hidden units, without time-consuming Monte Carlo simulations.
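
The description length this paper minimizes has two parts: the expected misfit under noisy weights plus the information the weights carry. The sketch below estimates the first term by plain Monte Carlo (the paper's contribution, exact derivatives for one hidden layer, is not reproduced) and uses the closed-form Gaussian KL divergence for the second; the linear model and all constants are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(7)
X = rng.uniform(-1, 1, (30, 2))
y = X @ np.array([1.0, -2.0]) + rng.normal(0, 0.1, 30)
sigma_p = 1.0                     # Gaussian prior std on each weight

def description_length(mu, s, n_mc=500):
    # Expected squared error under noisy weights (Monte Carlo estimate)
    w = mu + s * rng.normal(size=(n_mc, 2))
    misfit = np.mean(np.sum((X @ w.T - y[:, None]) ** 2, axis=0))
    # Closed-form KL(N(mu, s^2) || N(0, sigma_p^2)), summed over weights:
    # the "bits" paid to communicate the weights more precisely
    kl = np.sum(np.log(sigma_p / s) + (s**2 + mu**2) / (2 * sigma_p**2) - 0.5)
    return misfit + kl

mu_fit = np.linalg.lstsq(X, y, rcond=None)[0]        # least-squares means
for s in (0.5, 0.1, 0.01):
    print(f"s={s}: DL={description_length(mu_fit, np.full(2, s)):.2f}")

Making the weight noise s too small buys accuracy at a high information cost, while making it too large ruins the fit; this trade-off is exactly what the description-length view formalizes.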
Neural Networks and the Bias/Variance Dilemma
It is suggested that current-generation feedforward neural networks are largely inadequate for difficult problems in machine perception and machine learning, regardless of parallel-versus-serial hardware or other implementation issues.
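
The dilemma is easy to demonstrate numerically: fit many independent training sets and split the test error into squared bias plus variance. In the sketch below polynomial fits stand in for networks of different capacity; degrees, noise level and sample sizes are arbitrary choices.

import numpy as np

rng = np.random.default_rng(8)
f = lambda x: np.sin(2 * np.pi * x)          # true function
x_test = np.linspace(0, 1, 50)

for degree in (1, 9):                        # underfit vs overfit
    preds = []
    for trial in range(200):                 # many independent training sets
        x = rng.uniform(0, 1, 20)
        y = f(x) + rng.normal(0, 0.3, 20)
        coef = np.polyfit(x, y, degree)
        preds.append(np.polyval(coef, x_test))
    preds = np.array(preds)
    bias2 = np.mean((preds.mean(axis=0) - f(x_test)) ** 2)
    variance = np.mean(preds.var(axis=0))
    print(f"degree {degree}: bias^2={bias2:.3f}, variance={variance:.3f}")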
A Learning Algorithm for Boltzmann Machines
A general parallel search method based on statistical mechanics is described, and it is shown how it leads to a general learning rule for modifying the connection strengths so as to incorporate knowledge about a task domain in an efficient way.
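
The learning rule in question changes each weight in proportion to the difference between pairwise co-activation statistics with the data clamped and under the model's own distribution. Below is a minimal sketch for a fully visible Boltzmann machine (the paper's hidden units and simulated annealing are omitted); patterns, learning rate and chain lengths are illustrative.

import numpy as np

rng = np.random.default_rng(9)
n = 4                                         # fully visible units, states in {0,1}
# Toy data: two patterns the machine should assign high probability
data = np.array([[1, 1, 0, 0], [0, 0, 1, 1]] * 50, dtype=float)

W = np.zeros((n, n))                          # symmetric weights, zero diagonal
b = np.zeros(n)

def gibbs(s, steps=10):
    for _ in range(steps):
        for i in range(n):                    # update each unit given the rest
            p_on = 1 / (1 + np.exp(-(W[i] @ s + b[i])))
            s[i] = rng.uniform() < p_on
    return s

eps = 0.05
for epoch in range(200):
    # Positive phase: co-activations with units clamped to the data
    pos = data.T @ data / len(data)
    # Negative phase: co-activations under the model, via Gibbs sampling
    samples = np.array([gibbs(rng.integers(0, 2, n).astype(float))
                        for _ in range(50)])
    neg = samples.T @ samples / len(samples)
    dW = eps * (pos - neg)
    np.fill_diagonal(dW, 0)
    W += dW
    b += eps * (data.mean(0) - samples.mean(0))

print("samples after training:\n",
      np.array([gibbs(rng.integers(0, 2, n).astype(float)) for _ in range(5)]))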
Bayesian Mixture Modeling
It is shown that Bayesian inference from data modeled by a mixture distribution can feasibly be performed via Monte Carlo simulation. This method exhibits the true Bayesian predictive distribution…
Bayesian Learning via Stochastic Dynamics
  • R. Neal
  • Mathematics, Computer Science
  • NIPS
  • 1992
Bayesian methods avoid overfitting and poor generalization by averaging the outputs of many networks with weights sampled from the posterior distribution given the training data; this is done by simulating a stochastic dynamical system that has the posterior as its stationary distribution.
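
End to end, the recipe reads: draw weight vectors from the posterior, run the network with each, and average the outputs. The sketch below uses random-walk Metropolis as a simple stand-in for the paper's stochastic dynamics, applied to a tiny tanh network; the architecture and all constants are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(10)
X = rng.uniform(-2, 2, (25, 1))
y = np.tanh(2 * X) + rng.normal(0, 0.1, (25, 1))

def net(w, X):                                # tiny net: 3 hidden tanh units
    W1, b1, W2 = w[:3].reshape(1, 3), w[3:6], w[6:9].reshape(3, 1)
    return np.tanh(X @ W1 + b1) @ W2

def log_post(w, sigma=0.1):
    return (-0.5 * np.sum((y - net(w, X)) ** 2) / sigma**2   # likelihood
            - 0.5 * np.sum(w**2))                            # N(0,1) prior

# Random-walk Metropolis as a stand-in for the paper's stochastic dynamics
w = rng.normal(0, 0.5, 9)
lp = log_post(w)
keep = []
for t in range(20000):
    prop = w + rng.normal(0, 0.05, 9)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        w, lp = prop, lp_prop
    if t > 5000 and t % 100 == 0:             # thin after burn-in
        keep.append(w.copy())

x_new = np.array([[0.7]])
preds = np.array([net(ws, x_new).item() for ws in keep])
print(f"predictive mean {preds.mean():.3f} +/- {preds.std():.3f}")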