Corpus ID: 229363479

Testing whether a Learning Procedure is Calibrated

@inproceedings{Cockayne2020TestingWA,
  title={Testing whether a Learning Procedure is Calibrated},
  author={Jon Cockayne and M. Graham and Chris J. Oates and T. J. Sullivan},
  year={2020}
}
A learning procedure takes as input a dataset and performs inference for the parameters θ of a model that is assumed to have given rise to the dataset. Here we consider learning procedures whose output is a probability distribution, representing uncertainty about θ after seeing the dataset. Bayesian inference is a prime example of such a procedure, but one can also construct other learning procedures that return distributional output. This paper studies conditions for a learning procedure to be…
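A minimal sketch of the calibration idea described in the abstract (this is an illustrative example, not the test proposed in the paper): for a conjugate Normal–Normal learning procedure, a draw of θ from the prior followed by simulated data should land inside the procedure's central 90% credible interval about 90% of the time. All names and the toy model here are assumptions chosen for illustration.

```python
import random

def posterior(y, sigma2=1.0):
    """Exact posterior for prior theta ~ N(0, 1), likelihood y | theta ~ N(theta, sigma2).

    Returns the posterior mean m and variance v:
        v = 1 / (1 + 1/sigma2),  m = v * y / sigma2.
    """
    v = 1.0 / (1.0 + 1.0 / sigma2)
    m = v * y / sigma2
    return m, v

def coverage(n_trials=20000, seed=0):
    """Empirical coverage of the central 90% credible interval.

    Repeatedly draw a ground-truth theta from the prior, simulate one
    observation, compute the posterior, and record whether the interval
    covers theta. A calibrated procedure should give coverage near 0.90.
    """
    rng = random.Random(seed)
    z = 1.6449  # ~95th percentile of N(0, 1), giving a central 90% interval
    hits = 0
    for _ in range(n_trials):
        theta = rng.gauss(0.0, 1.0)   # ground truth drawn from the prior
        y = rng.gauss(theta, 1.0)     # one simulated observation
        m, v = posterior(y)
        half = z * v ** 0.5
        if m - half <= theta <= m + half:
            hits += 1
    return hits / n_trials

print(round(coverage(), 3))  # should be close to 0.90 for this calibrated procedure
```

Because the posterior here is exact, the empirical coverage converges to the nominal level; a miscalibrated procedure (e.g. an overconfident approximate posterior with shrunken variance) would show coverage noticeably below 0.90.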
Cited by 1: Probabilistic Iterative Methods for Linear Systems (2020)