Reversible Jump MCMC Simulated Annealing for Neural Networks

Christophe Andrieu, Nando de Freitas, Arnaud Doucet
We propose a novel reversible jump Markov chain Monte Carlo (MCMC) simulated annealing algorithm to optimize radial basis function (RBF) networks. This algorithm enables us to maximize the joint posterior distribution of the network parameters and the number of basis functions. It performs a global search in the joint space of the parameters and number of parameters, thereby surmounting the problem of local minima. We also show that by calibrating a Bayesian model, we can obtain the…
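The abstract describes a sampler that jumps between model dimensions (birth/death of basis functions) while an annealing schedule drives the chain toward the joint MAP estimate. The following is a minimal illustrative sketch of that idea on toy 1D data, not the paper's algorithm: the data, priors, cooling schedule, and move proposals are all assumptions, and the dimension-matching proposal-density terms of a full reversible jump acceptance ratio are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1D regression data (a hypothetical stand-in for the paper's benchmarks).
x = np.linspace(-3.0, 3.0, 60)
y = np.sin(x) + 0.1 * rng.standard_normal(x.size)

def rbf_predict(xs, centers, weights, width=1.0):
    """Output of a Gaussian RBF network at inputs xs."""
    phi = np.exp(-0.5 * ((xs[:, None] - centers[None, :]) / width) ** 2)
    return phi @ weights

def log_posterior(centers, weights, noise_var=0.01, lam=2.0):
    """Unnormalized log joint posterior over (k, centers, weights):
    Gaussian likelihood, N(0,1) prior on weights, Poisson(lam) prior on k.
    (Assumed priors for illustration only.)"""
    resid = y - rbf_predict(x, centers, weights)
    k = centers.size
    log_k_prior = k * np.log(lam) - np.sum(np.log(np.arange(1, k + 1)))
    return -0.5 * resid @ resid / noise_var - 0.5 * weights @ weights + log_k_prior

def rjmcmc_sa(n_iter=3000, k_max=12, t0=1.0, t_end=0.01):
    """Reversible-jump-style simulated annealing sketch: birth/death moves
    change the number of basis functions, update moves perturb parameters,
    and a cooling temperature concentrates the chain near the joint MAP."""
    centers = np.array([0.0])
    weights = np.array([0.0])
    logp = log_posterior(centers, weights)
    best = (centers, weights, logp)
    for i in range(n_iter):
        temp = t0 * (t_end / t0) ** (i / (n_iter - 1))  # geometric cooling
        move = rng.choice(["birth", "death", "update"])
        if move == "birth" and centers.size < k_max:
            # Birth: add a basis function at a random location.
            c_new = np.append(centers, rng.uniform(-3.0, 3.0))
            w_new = np.append(weights, rng.standard_normal())
        elif move == "death" and centers.size > 1:
            # Death: delete a randomly chosen basis function.
            j = rng.integers(centers.size)
            c_new = np.delete(centers, j)
            w_new = np.delete(weights, j)
        else:
            # Fixed-dimension update: jitter centers and weights.
            c_new = centers + 0.1 * rng.standard_normal(centers.size)
            w_new = weights + 0.1 * rng.standard_normal(weights.size)
        logp_new = log_posterior(c_new, w_new)
        # Annealed Metropolis acceptance (the dimension-matching Jacobian and
        # proposal-density corrections of true RJ-MCMC are omitted here).
        if np.log(rng.uniform()) < (logp_new - logp) / temp:
            centers, weights, logp = c_new, w_new, logp_new
            if logp > best[2]:
                best = (centers, weights, logp)
    return best

centers, weights, logp = rjmcmc_sa()
```

Because the acceptance test is divided by a shrinking temperature, uphill moves in the joint space of parameters and model size are always kept while downhill moves become increasingly rare, which is how the method escapes local minima that trap fixed-dimension optimizers.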
This paper has 31 citations.


