Bayesian Interpolation

@article{Mackay1992BayesianI,
  title={Bayesian Interpolation},
  author={David J. C. MacKay},
  journal={Neural Computation},
  year={1992},
  volume={4},
  pages={415-447}
}
  • D. MacKay
  • Published 1 May 1992
  • Computer Science
  • Neural Computation
Although Bayesian analysis has been in use since Laplace, the Bayesian method of model-comparison has only recently been developed in depth. [...] Regularizing constants are set by examining their posterior probability distribution. Alternative regularizers (priors) and alternative basis sets are objectively compared by evaluating the evidence for them. Occam's razor is automatically embodied by this process. The way in which Bayes infers the values of regularizing constants and noise levels has an…
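The evidence framework sketched in the abstract can be made concrete for linear interpolation models. Below is a minimal sketch of the fixed-point re-estimation of the regularizing constant alpha and the noise precision beta via the number of well-determined parameters gamma, following the formulae in the paper; the code itself and its names are illustrative, not MacKay's.

import numpy as np

def evidence_updates(Phi, y, alpha=1.0, beta=1.0, iters=50):
    # Phi: N x M design matrix of basis functions; y: N noisy targets.
    N, M = Phi.shape
    eigvals = np.linalg.eigvalsh(Phi.T @ Phi)  # eigenvalues of Phi^T Phi
    for _ in range(iters):
        # Gaussian posterior over weights: mean m, covariance A^{-1}.
        A = alpha * np.eye(M) + beta * (Phi.T @ Phi)
        m = beta * np.linalg.solve(A, Phi.T @ y)
        # gamma = effective number of well-determined parameters.
        lam = beta * eigvals
        gamma = np.sum(lam / (lam + alpha))
        # Re-estimation: alpha = gamma / (2 E_W), beta = (N - gamma) / (2 E_D).
        alpha = gamma / (m @ m)
        beta = (N - gamma) / np.sum((y - Phi @ m) ** 2)
    return alpha, beta, m

Citations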
Bayesian Methods for Backpropagation Networks
TLDR
This chapter describes numerical techniques based on Gaussian approximations for implementation of powerful and practical methods for controlling, comparing, and using adaptive network models.
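For orientation, the Gaussian (Laplace) approximation underlying these techniques yields the standard estimate of the log evidence for a model \(\mathcal{H}\) with \(k\) parameters, expanded around the most probable weights \(\mathbf{w}_{\mathrm{MP}}\) with Hessian \(\mathbf{A} = -\nabla\nabla \ln p(\mathbf{w} \mid D, \mathcal{H})\) (a textbook form, not quoted from the chapter):

\[ \ln p(D \mid \mathcal{H}) \simeq \ln p(D \mid \mathbf{w}_{\mathrm{MP}}, \mathcal{H}) + \ln p(\mathbf{w}_{\mathrm{MP}} \mid \mathcal{H}) + \frac{k}{2} \ln 2\pi - \frac{1}{2} \ln \det \mathbf{A} \]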
Backward specification of prior in bayesian inference as an inverse problem
Specification of the prior distribution is one of the most important methodological as well as practical problems in Bayesian inference. Although a number of approaches have been proposed, none of them is…
Extended Bayesian learning
TLDR
The extended Bayesian learning (EBL) approach consists of considering a more general form of priors by using several weight classes and by considering the mean of the Gaussian distribution to be another hyperparameter.
Bayesian methods for adaptive models
The Bayesian framework for model comparison and regularisation is demonstrated by studying interpolation and classification problems modelled with both linear and non-linear models. This framework…
Bayesian regression filters and the issue of priors
TLDR
An online learning algorithm is derived that solves regression problems with a Kalman filter without the risk of over-fitting, and that approaches the true Bayesian posterior in the infinite-dimension limit.
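For intuition, online Bayesian linear regression can be phrased as a Kalman filter whose hidden state is the (static) weight vector; the sketch below is a hypothetical illustration of that construction under a Gaussian prior and known noise variance, not the paper's algorithm, and all names are invented.

import numpy as np

class BayesRegressionFilter:
    def __init__(self, dim, prior_var=10.0, noise_var=0.1):
        self.mean = np.zeros(dim)           # posterior mean of the weights
        self.cov = prior_var * np.eye(dim)  # posterior covariance
        self.noise_var = noise_var          # observation noise variance

    def update(self, x, y):
        # One Kalman update for the scalar observation y = x @ w + noise.
        s = x @ self.cov @ x + self.noise_var  # innovation variance
        k = self.cov @ x / s                   # Kalman gain
        self.mean = self.mean + k * (y - x @ self.mean)
        self.cov = self.cov - np.outer(k, x @ self.cov)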
Comparison of Approximate Methods for Handling Hyperparameters
  • D. MacKay
  • Computer Science, Mathematics
  • Neural Computation
  • 1999
TLDR
Two approximate methods for computational implementation of Bayesian hierarchical models that include unknown hyperparameters such as regularization constants and noise levels are examined, and the evidence framework is shown to introduce negligible predictive error under straightforward conditions.
Priors on the Variance in Sparse Bayesian Learning; the demi-Bayesian Lasso
TLDR
This work outlines simple modifications of existing algorithms to solve this new variant, which essentially uses type-II maximum likelihood to fit the Bayesian Lasso model, and proposes an Elastic-net heuristic to help with modeling correlated inputs.
Bayesian Integration of Rule Models
Although Bayesian model averaging (BMA) is in principle the optimal method for combining learned models, it has received relatively little attention in the machine learning literature. This article…
Bayesian Function Learning Using MCMC Methods
TLDR
It is shown that a rigorous Bayesian solution can be efficiently implemented by resorting to a Markov chain Monte Carlo (MCMC) simulation scheme.
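As a toy illustration of the MCMC route (a generic random-walk Metropolis sampler over basis-function weights with a Gaussian prior and likelihood; this is not the paper's scheme, and all names and constants are illustrative):

import numpy as np

def log_posterior(w, Phi, y, noise_var=0.1, prior_var=1.0):
    # Gaussian likelihood and zero-mean Gaussian prior, up to a constant.
    resid = y - Phi @ w
    return -0.5 * (resid @ resid) / noise_var - 0.5 * (w @ w) / prior_var

def metropolis(Phi, y, n_samples=5000, step=0.05, seed=0):
    rng = np.random.default_rng(seed)
    w = np.zeros(Phi.shape[1])
    lp = log_posterior(w, Phi, y)
    samples = []
    for _ in range(n_samples):
        w_prop = w + step * rng.standard_normal(w.shape)
        lp_prop = log_posterior(w_prop, Phi, y)
        if np.log(rng.random()) < lp_prop - lp:  # accept/reject step
            w, lp = w_prop, lp_prop
        samples.append(w.copy())
    return np.array(samples)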
Evaluation of gaussian processes and other methods for non-linear regression
TLDR
It is shown that a Bayesian approach to learning in multi-layer perceptron neural networks achieves better performance than the commonly used early stopping procedure, even for reasonably short amounts of computation time.

References

Showing 1-10 of 74 references.
Bayesian Inductive Inference and Maximum Entropy
The principles of Bayesian reasoning are reviewed and applied to problems of inference from data sampled from Poisson, Gaussian and Cauchy distributions. Probability distributions (priors and…
Bayesian Mixture Modeling by Monte Carlo Simulation
It is shown that Bayesian inference from data modeled by a mixture distribution can feasibly be performed via Monte Carlo simulation. This method exhibits the true Bayesian predictive distribution,…
Bayesian analysis. I. Parameter estimation using quadrature NMR models
In the analysis of magnetic resonance data, a great deal of prior information is available which is ordinarily not used. For example, considering high-resolution NMR spectroscopy, one knows…
A Bayesian comparison of different classes of dynamic models using empirical data
This paper deals with the Bayesian methods of comparing different types of dynamical structures for representing a given set of observations. Specifically, given that a process y(·) obeys…
A Practical Bayesian Framework for Backpropagation Networks
  • D. MacKay
  • Mathematics, Computer Science
  • Neural Computation
  • 1992
TLDR
A quantitative and practical Bayesian framework is described for learning of mappings in feedforward networks that automatically embodies "Occam's razor," penalizing overflexible and overcomplex models.
Maximum Entropy and Bayesian Methods in Applied Statistics: Bayesian Methods: General Background
We note the main points of history, as a framework on which to hang many background remarks concerning the nature and motivation of Bayesian/Maximum Entropy methods. Experience has shown that these…
A Practical Bayesian Framework for Backprop Networks
A quantitative and practical Bayesian framework is described for learning of mappings in feedforward networks. The framework makes possible: (1) objective comparisons between solutions using…
The Evidence Framework Applied to Classification Networks
  • D. MacKay
  • Mathematics, Computer Science
  • Neural Computation
  • 1992
TLDR
It is demonstrated that the Bayesian framework for model comparison described for regression models in MacKay (1992a,b) can also be applied to classification problems and an information-based data selection criterion is derived and demonstrated within this framework.
Developments in Maximum Entropy Data Analysis
The Bayesian derivation of "Classic" MaxEnt image processing (Skilling 1989a) shows that exp(αS(f,m)), where S(f,m) is the entropy of image f relative to model m, is the only consistent prior…
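For context, the entropy of a positive image f relative to a model m in this "Classic" MaxEnt setting is usually taken in Skilling's form (stated here for orientation, not quoted from the paper):

\[ S(f, m) = \sum_i \left[ f_i - m_i - f_i \ln \frac{f_i}{m_i} \right] \]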
Regularization Uses Fractal Priors
TLDR
This paper shows that regularization is an example of Bayesian modeling, and that using the regularization energy function for the surface interpolation problem results in a prior model that is fractal (self-affine over a range of scales).
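The correspondence the paper exploits is the standard identity between energy minimization and MAP estimation: minimizing a data-misfit energy plus \(\lambda\) times a regularization energy is the same as maximizing a posterior whose prior is exponential in that energy (a textbook identity, stated here for orientation):

\[ p(f \mid d) \propto e^{-E_{\mathrm{data}}(d, f)}\, e^{-\lambda E_{\mathrm{reg}}(f)}, \qquad \arg\min_f \left[ E_{\mathrm{data}}(d, f) + \lambda E_{\mathrm{reg}}(f) \right] = \arg\max_f p(f \mid d) \]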