Issues in Bayesian Analysis of Neural Network Models
@article{Mller1998IssuesIB,
  title={Issues in Bayesian Analysis of Neural Network Models},
  author={Peter M{\"u}ller and David R{\'i}os Insua},
  journal={Neural Computation},
  year={1998},
  volume={10},
  pages={749--770}
}
Stemming from work by Buntine and Weigend (1991) and MacKay (1992), there is a growing interest in Bayesian analysis of neural network models. A Markov chain Monte Carlo scheme for posterior inference in feed-forward neural network models is developed. Key Method: the scheme is then extended to the variable architecture case, providing a data-driven procedure to identify sensible architectures.
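The fixed-architecture part of such a scheme can be illustrated with a minimal sketch: simulate the weight posterior of a small feed-forward network by Markov chain Monte Carlo and average the sampled networks' outputs for prediction. Everything below (the toy data, the network size, the Gaussian prior and noise settings, and the plain random-walk Metropolis proposal) is an illustrative assumption, not the authors' exact sampler.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: noisy one-dimensional regression problem.
X = rng.uniform(-2.0, 2.0, size=(40, 1))
y = np.sin(2.0 * X[:, 0]) + 0.1 * rng.standard_normal(40)

H = 3  # number of hidden units (fixed architecture)

def unpack(theta):
    """Split the flat parameter vector into the network's weights."""
    w1 = theta[:H].reshape(1, H)   # input-to-hidden weights
    b1 = theta[H:2 * H]            # hidden biases
    w2 = theta[2 * H:3 * H]        # hidden-to-output weights
    b2 = theta[3 * H]              # output bias
    return w1, b1, w2, b2

def predict(theta, X):
    w1, b1, w2, b2 = unpack(theta)
    return np.tanh(X @ w1 + b1) @ w2 + b2

def log_post(theta, sigma=0.1, tau=1.0):
    """Unnormalized log posterior: Gaussian likelihood, Gaussian prior."""
    resid = y - predict(theta, X)
    return -0.5 * resid @ resid / sigma**2 - 0.5 * theta @ theta / tau**2

# Random-walk Metropolis over the flat weight vector.
dim = 3 * H + 1
theta = 0.1 * rng.standard_normal(dim)
lp = log_post(theta)
samples = []
for it in range(30000):
    prop = theta + 0.05 * rng.standard_normal(dim)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept/reject
        theta, lp = prop, lp_prop
    if it >= 15000 and it % 10 == 0:          # thin after burn-in
        samples.append(theta.copy())

# Posterior predictive mean: average the outputs of the sampled networks.
X_new = np.linspace(-2.0, 2.0, 5).reshape(-1, 1)
print(np.mean([predict(s, X_new) for s in samples], axis=0))
```

A variable-architecture extension would additionally propose moves that add or delete hidden units, accepted with a dimension-matching correction; that machinery is omitted here.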
141 Citations
A Noninformative Prior for Neural Networks
- Computer Science, Machine Learning
- 2004
A noninformative prior for feed-forward neural networks is introduced, and several theoretical and practical advantages of this approach are described; in particular, the simpler prior allows for a simpler Markov chain Monte Carlo algorithm.
Bayesian Methods for Neural Networks and Related Models
- Computer Science
- 2004
The paper reviews the various approaches taken to overcome the difficulty of closed-form Bayesian analysis in feed-forward neural networks, involving the use of Gaussian approximations, Markov chain Monte Carlo simulation routines, and a class of non-Gaussian but deterministic approximations called variational approximations.
Variable Architecture Bayesian Neural Networks: Model Selection Based on EMC
- Computer Science
- 2006
A variable architecture model is proposed where the number of hidden units is selected by using a variant of the real-coded Evolutionary Monte Carlo algorithm developed by Liang and Wong (2001) for inference and prediction in fixed-architecture Bayesian neural networks.
Evolutionary Model Selection in Bayesian Neural Networks
- Computer Science
- 2003
A variable architecture model is proposed where input-to-hidden connections, and therefore hidden units, are selected by using a variant of the Evolutionary Monte Carlo algorithm developed by Liang and Wong (2000).
On MCMC sampling in Bayesian MLP neural networks
- Computer Science, Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks. IJCNN 2000. Neural Computing: New Challenges and Perspectives for the New Millennium
- 2000
This article proposes a new method for choosing the starting values based on early stopping and demonstrates the benefits of using several independent chains in Bayesian MLP models.
Model selection and model averaging for neural networks
- Computer Science
- 1998
This thesis develops a methodology for nonparametric regression within the Bayesian framework and demonstrates how to use a noninformative prior for a neural network, which is useful given the difficulty of interpreting the network's parameters.
Neural Network Models for Conditional Distribution Under Bayesian Analysis
- Computer Science, Neural Computation
- 2008
This work uses neural networks as a tool for a nonlinear autoregression to predict the second moment of the conditional density of return series and estimates the models in a Bayesian framework using Markov chain Monte Carlo posterior simulations.
A Framework for Nonparametric Regression Using Neural Networks
- Computer Science
This paper develops a methodology for nonparametric regression within the Bayesian paradigm, and presents results on the asymptotic consistency of the posterior for neural network regression.
Default Priors for Neural Network Classification
- Computer Science, J. Classif.
- 2007
This paper looks at the underlying probability model, so as to understand statistically what is going on in order to facilitate an intelligent choice of prior for a fully Bayesian analysis.
Bayesian learning in neural networks for sequence processing
- Computer Science
- 2007
An extension of the Bayesian framework to the modelling of multivariate time-dependent data with feedforward and recurrent neural networks is proposed, and a general, albeit computationally expensive, procedure is suggested.
References
Showing 1-10 of 80 references
Consistency of posterior distributions for neural networks
- Computer Science, Mathematics, Neural Networks
- 2000
Bayesian Learning for Neural Networks
- Computer Science
- 1995
Bayesian Learning for Neural Networks shows that Bayesian methods allow complex neural network models to be used without fear of the "overfitting" that can occur with traditional neural network learning methods.
A Practical Bayesian Framework for Backpropagation Networks
- Computer Science, Neural Computation
- 1992
A quantitative and practical Bayesian framework is described for learning of mappings in feedforward networks that automatically embodies "Occam's razor," penalizing overflexible and overcomplex models.
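The "Occam's razor" effect has a compact quantitative form in this framework. With \(w_{\mathrm{MP}}\) the most probable weights and \(\sigma_{w \mid D}\) the posterior width (notation assumed here, following MacKay's evidence approximation):

```latex
P(D \mid \mathcal{H}_i)
  = \int P(D \mid w, \mathcal{H}_i)\, P(w \mid \mathcal{H}_i)\, dw
  \;\approx\;
  P(D \mid w_{\mathrm{MP}}, \mathcal{H}_i)\,
  \underbrace{P(w_{\mathrm{MP}} \mid \mathcal{H}_i)\, \sigma_{w \mid D}}_{\text{Occam factor}}
```

Overflexible models spread prior mass over a large weight volume, so their Occam factor (roughly the ratio of posterior to prior volume) is small, and the evidence penalizes them automatically.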
Bayesian Learning via Stochastic Dynamics
- Computer Science, NIPS
- 1992
Bayesian methods avoid overfitting and poor generalization by averaging the outputs of many networks whose weights are sampled from the posterior distribution given the training data; the samples are obtained by simulating a stochastic dynamical system that has the posterior as its stationary distribution.
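The "stochastic dynamical system" here is hybrid (Hamiltonian) Monte Carlo: the weights are augmented with fictitious momenta, the dynamics are simulated with a leapfrog integrator, and a Metropolis test corrects the discretization error. A self-contained sketch of one transition; the interface (`log_post`, `grad_log_post`) and the step-size and trajectory-length settings are assumptions for illustration:

```python
import numpy as np

def hmc_step(theta, log_post, grad_log_post, eps=0.05, n_leapfrog=20,
             rng=np.random.default_rng()):
    """One hybrid (Hamiltonian) Monte Carlo transition.

    Fresh momentum is drawn, a leapfrog integrator simulates the
    fictitious dynamics, and a Metropolis test corrects the
    discretization error, so the posterior is the stationary
    distribution of the resulting Markov chain.
    """
    p = rng.standard_normal(theta.shape)          # resample momentum
    th, pn = theta.copy(), p.copy()
    pn += 0.5 * eps * grad_log_post(th)           # half step, momentum
    for _ in range(n_leapfrog - 1):
        th += eps * pn                            # full step, position
        pn += eps * grad_log_post(th)             # full step, momentum
    th += eps * pn
    pn += 0.5 * eps * grad_log_post(th)           # final half step
    h_old = -log_post(theta) + 0.5 * p @ p        # total "energy" before
    h_new = -log_post(th) + 0.5 * pn @ pn         # and after the trajectory
    return th if np.log(rng.uniform()) < h_old - h_new else theta

# Usage on a toy 2-D Gaussian "posterior":
lp = lambda t: -0.5 * t @ t
glp = lambda t: -t
theta, draws = np.zeros(2), []
for _ in range(1000):
    theta = hmc_step(theta, lp, glp)
    draws.append(theta)
print(np.mean(draws, axis=0), np.var(draws, axis=0))  # roughly 0 and 1
```

Because proposals follow the simulated dynamics rather than a blind random walk, HMC can make large moves through the weight posterior while keeping the acceptance rate high.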
A Practical Bayesian Framework for Backprop Networks
- Computer Science
- 1991
A quantitative and practical Bayesian framework is described for learning of mappings in feedforward networks and a good correlation between generalisation ability and the Bayesian evidence is obtained.
Ace of Bayes: Application of Neural Networks with Pruning
- Computer Science
- 1993
Bayesian backprop is applied to the prediction of fat content in minced meat from near-infrared spectra and outperforms "early stopping" as well as quadratic regression.
Neural networks for pattern recognition
- Computer Science
- 1995
This is the first comprehensive treatment of feed-forward neural networks from the perspective of statistical pattern recognition, and is designed as a text, with over 100 exercises, to benefit anyone involved in the fields of neural computation and pattern recognition.
Neural Networks: A Review from a Statistical Perspective
- Computer Science
- 1994
This paper informs a statistical readership about Artificial Neural Networks (ANNs), points out some of the links with statistical methodology and encourages cross-disciplinary research in the…
Bayesian training of backpropagation networks by the hybrid Monte-Carlo method
- Computer Science
- 1992
It is shown that Bayesian training of backpropagation neural networks can feasibly be performed by the Hybrid Monte Carlo method, and the method has been applied to a test problem, demonstrating that it can produce good predictions, as well as an indication of the uncertainty of these predictions.
Neural networks in applied statistics
- Computer Science
- 1996
The principles of the multilayer feedforward network that is among the most commonly used neural networks in practical problems are introduced and the accuracy of neural network models relative to the accuracy obtained using other computer-intensive, nonmodel-based techniques is evaluated.