# Bayesian Learning for Neural Networks

@inproceedings{Neal1995BayesianLF, title={Bayesian Learning for Neural Networks}, author={Radford M. Neal}, year={1995} }

Artificial "neural networks" are widely used as flexible models for classification and regression applications, but questions remain about how the power of these models can be safely exploited when training data is limited. This book demonstrates how Bayesian methods allow complex neural network models to be used without fear of the "overfitting" that can occur with traditional training methods. Insight into the nature of these complex Bayesian models is provided by a theoretical investigation…

## 3,632 Citations

Probable networks and plausible predictions - a review of practical Bayesian methods for supervised neural networks

- Computer Science
- 1995

Practical techniques based on Gaussian approximations for implementation of these powerful methods for controlling, comparing and using adaptive networks are described.

Bayesian Neural Networks and GLM

- Computer Science, Springer Actuarial
- 2019

This work presents a probabilistic treatment of the authors' a priori knowledge about parameters, using simulation-based Markov Chain Monte Carlo methods, which results in a powerful framework that can be used for estimating the density of predictors.

A Framework for Nonparametric Regression Using Neural Networks

- Computer Science

This paper develops a methodology for nonparametric regression within the Bayesian paradigm, and presents results on the asymptotic consistency of the posterior for neural network regression.

Hamiltonian Monte Carlo based on evidence framework for Bayesian learning to neural network

- Computer Science, Mathematics, Soft Comput.
- 2019

This paper trains the network weights by means of Hamiltonian Monte Carlo (HMC) and proposes sampling from the posterior distribution using HMC in order to approximate the derivative of the evidence, which allows the hyperparameters to be re-estimated.
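The HMC mechanics referred to above can be illustrated on a toy target. The sketch below is a minimal, self-contained HMC sampler for a one-dimensional standard normal distribution, not the network-weight posterior from the paper; function names and tuning values (`step`, `n_leapfrog`) are illustrative choices.

```python
import math
import random

# Minimal Hamiltonian Monte Carlo sketch for a 1-D standard normal target.
# Toy example: the papers above sample network weights; here the target is
# a Gaussian so the leapfrog + Metropolis machinery is easy to verify.

def neg_log_prob(q):   # U(q) = -log p(q) for N(0, 1), up to a constant
    return 0.5 * q * q

def grad_u(q):         # dU/dq
    return q

def hmc_sample(n_samples, step=0.2, n_leapfrog=20, seed=0):
    rng = random.Random(seed)
    q, samples = 0.0, []
    for _ in range(n_samples):
        p = rng.gauss(0.0, 1.0)               # fresh momentum each iteration
        q_new, p_new = q, p
        p_new -= 0.5 * step * grad_u(q_new)   # leapfrog: opening half step
        for _ in range(n_leapfrog - 1):
            q_new += step * p_new
            p_new -= step * grad_u(q_new)
        q_new += step * p_new
        p_new -= 0.5 * step * grad_u(q_new)   # leapfrog: closing half step
        h_old = neg_log_prob(q) + 0.5 * p * p
        h_new = neg_log_prob(q_new) + 0.5 * p_new * p_new
        if rng.random() < math.exp(min(0.0, h_old - h_new)):
            q = q_new                         # Metropolis accept/reject
        samples.append(q)
    return samples

samples = hmc_sample(5000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

Because the leapfrog integrator nearly conserves the Hamiltonian, acceptance rates stay high even with long trajectories, which is what makes HMC practical for high-dimensional weight posteriors.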

Bayesian Regularization of Neural Networks

- Computer Science, Artificial Neural Networks
- 2009

This chapter outlines the equations that define the BRANN method plus a flowchart for producing a BRANN-QSAR model, and some results of the use of BRANNs on a number of data sets are illustrated and compared with other linear and nonlinear models.

Model selection and model averaging for neural networks

- Computer Science
- 1998

This thesis develops a methodology for doing nonparametric regression within the Bayesian framework, and demonstrates how to use a noninformative prior for a neural network, which is useful because of the difficulty in interpreting the parameters.

Classification using Bayesian neural nets

- Computer Science, Proceedings of International Conference on Neural Networks (ICNN'96)
- 1996

This paper demonstrates the effects of this approach by an implementation of the full Bayesian framework applied to two real world classification problems and discusses the idea of calibration to measure the predictive performance.

Bayesian neural networks for classification: how useful is the evidence framework?

- Computer Science, Neural Networks
- 1999

A position paper on statistical inference techniques which integrate neural network and Bayesian network models

- Computer Science, Proceedings of International Conference on Neural Networks (ICNN'97)
- 1997

The Gibbs sampler is presented, both in its successful role as a convergence heuristic derived from statistical physics and under its probabilistic learning interpretation, and how the Bayesian network formalism informs the causal reasoning interpretation of some neural networks.

Bayesian techniques for neural networks — Review and case studies

- Computer Science, Mathematics, 2000 10th European Signal Processing Conference
- 2000

This contribution gives a short review on Bayesian techniques for neural networks and presents comparison results from several case studies that include regression, classification, and inverse problems.

## References

Showing 1-10 of 94 references

Bayesian Training of Backpropagation Networks by the Hybrid Monte Carlo Method

- Computer Science
- 1993

It is shown that Bayesian training of backpropagation neural networks can feasibly be performed by the Hybrid Monte Carlo method, and the method has been applied to a test problem, demonstrating that it can produce good predictions, as well as an indication of the uncertainty of these predictions.

A Practical Bayesian Framework for Backpropagation Networks

- Computer Science, Neural Computation
- 1992

A quantitative and practical Bayesian framework is described for learning of mappings in feedforward networks that automatically embodies "Occam's razor," penalizing overflexible and overcomplex models.

Ace of Bayes: Application of Neural

- Computer Science
- 1993

Bayesian backprop is applied in the prediction of fat content in minced meat from near-infrared spectra and outperforms "early stopping" as well as quadratic regression.

The Evidence Framework Applied to Classification Networks

- Computer Science, Neural Computation
- 1992

It is demonstrated that the Bayesian framework for model comparison described for regression models in MacKay (1992a,b) can also be applied to classification problems and an information-based data selection criterion is derived and demonstrated within this framework.

On the Use of Evidence in Neural Networks

- Computer Science, NIPS
- 1992

It turns out that the evidence procedure's MAP estimate for neural nets is, in toto, approximation error; the exact result neither has to be recalculated for every new data set nor requires running computer code.

Neural Networks and the Bias/Variance Dilemma

- Computer Science, Psychology, Neural Computation
- 1992

It is suggested that current-generation feedforward neural networks are largely inadequate for difficult problems in machine perception and machine learning, regardless of parallel-versus-serial hardware or other implementation issues.

Bayesian Learning via Stochastic Dynamics

- Computer Science, NIPS
- 1992

Bayesian methods avoid overfitting and poor generalization by averaging the outputs of many networks with weights sampled from the posterior distribution given the training data, by simulating a stochastic dynamical system that has the posterior as its stationary distribution.
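The averaging step described above can be sketched directly: predictions come from averaging network outputs over sampled weight vectors, not from a single trained network. The weight samples below are hypothetical stand-ins for draws that a sampler such as HMC or stochastic dynamics would produce; the tiny network and its parameter names are illustrative only.

```python
import math

# Sketch of Bayesian prediction by averaging over posterior weight samples.
# The samples below are hypothetical placeholders for draws from a sampler.

def net(w, x):
    # a tiny one-hidden-unit network: w = (w1, b1, w2, b2)
    w1, b1, w2, b2 = w
    return w2 * math.tanh(w1 * x + b1) + b2

posterior_samples = [
    (1.0, 0.1, 0.9, 0.0),
    (1.2, -0.1, 1.1, 0.05),
    (0.8, 0.0, 1.0, -0.05),
]

def predict(x):
    # posterior predictive mean: average the *outputs*, not the weights
    return sum(net(w, x) for w in posterior_samples) / len(posterior_samples)

print(predict(0.5))
```

Averaging outputs rather than weights is what gives the Bayesian approach its resistance to overfitting: no single weight vector is trusted, so overconfident fits are diluted by the rest of the posterior.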

Bayesian Mixture Modeling

- Computer Science
- 1992

It is shown that Bayesian inference from data modeled by a mixture distribution can feasibly be performed via Monte Carlo simulation. This method exhibits the true Bayesian predictive distribution…

Keeping the neural networks simple by minimizing the description length of the weights

- Computer Science, COLT '93
- 1993

A method is described for computing, without time-consuming Monte Carlo simulations, the derivatives of the expected squared error and of the amount of information in the noisy weights of a network that contains a layer of non-linear hidden units.
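The "amount of information in the noisy weights" can be made concrete with a standard Gaussian description-length penalty: a weight with posterior N(mu, sigma^2) costs its KL divergence from a Gaussian prior N(0, sigma0^2) in nats. This is a sketch of the general idea, not the paper's exact derivation; the function names and example numbers are illustrative.

```python
import math

# Description-length sketch: each noisy weight with posterior N(mu, sigma^2)
# costs KL(N(mu, sigma^2) || N(0, sigma0^2)) nats to communicate.

def weight_cost(mu, sigma, sigma0=1.0):
    # closed-form KL divergence between two univariate Gaussians
    return math.log(sigma0 / sigma) + (sigma**2 + mu**2) / (2 * sigma0**2) - 0.5

def mdl_objective(squared_error, weights):
    # total description length = data misfit + information in the weights
    return squared_error + sum(weight_cost(mu, s) for mu, s in weights)

# a weight whose posterior equals the prior costs nothing to send
print(weight_cost(0.0, 1.0))   # 0.0
print(mdl_objective(0.3, [(0.5, 0.2), (-1.0, 0.5)]))
```

Minimizing this objective trades data fit against weight precision: wide posteriors (large sigma) are cheap to describe but noisy, which is exactly the simplicity pressure the title refers to.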

Robust Parameter Estimation and Model Selection for Neural Network Regression

- Computer Science, NIPS
- 1993

It is shown that the conventional back-propagation (BPP) algorithm for neural network regression is robust to leverage points but not to outliers, and a robust alternative is to model the error as a mixture of normal distributions.