# Bayesian Interpolation

@article{Mackay1992BayesianI, title={Bayesian Interpolation}, author={David J. C. MacKay}, journal={Neural Computation}, year={1992}, volume={4}, pages={415-447} }

Although Bayesian analysis has been in use since Laplace, the Bayesian method of model comparison has only recently been developed in depth. [...] Regularizing constants are set by examining their posterior probability distribution. Alternative regularizers (priors) and alternative basis sets are objectively compared by evaluating the evidence for them. Occam's razor is automatically embodied by this process. The way in which Bayes infers the values of regularizing constants and noise levels has an…
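The abstract's central idea — setting regularizing constants and noise levels by maximizing their evidence — can be sketched with the fixed-point updates MacKay popularized for Bayesian linear regression. This is a minimal illustration, not the paper's own code; the function name and toy data are invented for the example, and `alpha` (weight-decay precision) and `beta` (noise precision) follow the standard evidence-framework update rules.

```python
import numpy as np

def evidence_hyperparams(Phi, y, n_iter=100):
    """Evidence-framework (type-II ML) updates for Bayesian linear regression.

    Phi: (N, M) design matrix; y: (N,) targets.
    Returns the optimized prior precision alpha, noise precision beta,
    and the posterior mean weights m.
    """
    N, M = Phi.shape
    alpha, beta = 1.0, 1.0
    for _ in range(n_iter):
        A = alpha * np.eye(M) + beta * Phi.T @ Phi       # posterior precision
        m = beta * np.linalg.solve(A, Phi.T @ y)         # posterior mean weights
        gamma = M - alpha * np.trace(np.linalg.inv(A))   # well-determined parameters
        alpha = gamma / (m @ m)                          # re-estimate prior precision
        beta = (N - gamma) / np.sum((y - Phi @ m) ** 2)  # re-estimate noise precision
    return alpha, beta, m

# Toy data: a noisy line, fit with basis functions [1, x].
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 0.1 * rng.standard_normal(50)
Phi = np.column_stack([np.ones_like(x), x])
alpha, beta, m = evidence_hyperparams(Phi, y)
```

Because both hyperparameters are inferred from the data, no cross-validation set is needed: the evidence itself penalizes over-flexible settings, which is the "automatic Occam's razor" the abstract refers to.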


## 3,915 Citations

Bayesian Methods for Backpropagation Networks

- Computer Science
- 1996

This chapter describes numerical techniques based on Gaussian approximations for implementation of powerful and practical methods for controlling, comparing, and using adaptive network models.

Backward specification of prior in Bayesian inference as an inverse problem

- Mathematics
- 2004

Specification of the prior distribution is one of the most important methodological as well as practical problems in Bayesian inference. Although a number of approaches have been proposed, none of them is…

Extended Bayesian learning

- Computer Science
- ESANN
- 1997

The extended Bayesian learning (EBL) approach consists of considering a more general form of priors by using several weight classes and by considering the mean of the Gaussian distribution to be another hyperparameter.

Bayesian methods for adaptive models

- Mathematics
- 1992

The Bayesian framework for model comparison and regularisation is demonstrated by studying interpolation and classification problems modelled with both linear and non-linear models. This framework…

Bayesian regression filters and the issue of priors

- Mathematics, Computer Science
- Neural Computing & Applications
- 2005

An online learning algorithm is derived that solves regression problems with a Kalman filter without the risk of over-fitting, and that approaches the true Bayesian posterior in the infinite-dimension limit.

Comparison of Approximate Methods for Handling Hyperparameters

- Computer Science, Mathematics
- Neural Computation
- 1999

Two approximate methods for computational implementation of Bayesian hierarchical models that include unknown hyperparameters such as regularization constants and noise levels are examined, and the evidence framework is shown to introduce negligible predictive error under straightforward conditions.

Priors on the Variance in Sparse Bayesian Learning; the demi-Bayesian Lasso

- Computer Science
- 2008

This work outlines simple modifications of existing algorithms to solve this new variant which essentially uses type-II maximum likelihood to fit the Bayesian Lasso model and proposes an Elastic-net heuristic to help with modeling correlated inputs.

Bayesian Integration of Rule Models

Although Bayesian model averaging (BMA) is in principle the optimal method for combining learned models, it has received relatively little attention in the machine learning literature. This article…

Bayesian Function Learning Using MCMC Methods

- Computer Science
- IEEE Trans. Pattern Anal. Mach. Intell.
- 1998

It is shown that a rigorous Bayesian solution can be efficiently implemented by resorting to a Markov chain Monte Carlo (MCMC) simulation scheme.

Evaluation of gaussian processes and other methods for non-linear regression

- Computer Science
- 1997

It is shown that a Bayesian approach to learning in multi-layer perceptron neural networks achieves better performance than the commonly used early stopping procedure, even for reasonably short amounts of computation time.

## References

SHOWING 1-10 OF 74 REFERENCES

Bayesian Inductive Inference and Maximum Entropy

- Mathematics
- 1988

The principles of Bayesian reasoning are reviewed and applied to problems of inference from data sampled from Poisson, Gaussian and Cauchy distributions. Probability distributions (priors and…

Bayesian Mixture Modeling by Monte Carlo Simulation

- Mathematics
- 1991

It is shown that Bayesian inference from data modeled by a mixture distribution can feasibly be performed via Monte Carlo simulation. This method exhibits the true Bayesian predictive distribution,…

Bayesian analysis. I. Parameter estimation using quadrature NMR models

- Mathematics
- 1990

Abstract In the analysis of magnetic resonance data, a great deal of prior information is available which is ordinarily not used. For example, considering high-resolution NMR spectroscopy, one knows…

A Bayesian comparison of different classes of dynamic models using empirical data

- Mathematics
- 1977

This paper deals with the Bayesian methods of comparing different types of dynamical structures for representing a given set of observations. Specifically, given that a process y(·) obeys…

A Practical Bayesian Framework for Backpropagation Networks

- Mathematics, Computer Science
- Neural Computation
- 1992

A quantitative and practical Bayesian framework is described for learning of mappings in feedforward networks that automatically embodies "Occam's razor," penalizing overflexible and overcomplex models.

Maximum Entropy and Bayesian Methods in Applied Statistics: Bayesian Methods: General Background

- Computer Science, Mathematics
- 1986

We note the main points of history, as a framework on which to hang many background remarks concerning the nature and motivation of Bayesian/Maximum Entropy methods. Experience has shown that these…

A Practical Bayesian Framework for Backprop Networks

- Mathematics
- 1991

A quantitative and practical Bayesian framework is described for learning of mappings in feedforward networks. The framework makes possible: (1) objective comparisons between solutions using…

The Evidence Framework Applied to Classification Networks

- Mathematics, Computer Science
- Neural Computation
- 1992

It is demonstrated that the Bayesian framework for model comparison described for regression models in MacKay (1992a,b) can also be applied to classification problems, and an information-based data selection criterion is derived and demonstrated within this framework.

Developments in Maximum Entropy Data Analysis

- Mathematics
- 1989

The Bayesian derivation of “Classic” MaxEnt image processing (Skilling 1989a) shows that exp(αS(f,m)), where S(f,m) is the entropy of image f relative to model m, is the only consistent prior…

Regularization Uses Fractal Priors

- Computer Science
- AAAI
- 1987

This paper shows that regularization is an example of Bayesian modeling, and that using the regularization energy function for the surface interpolation problem results in a prior model that is fractal (self-affine over a range of scales).