Bayes not Bust! Why Simplicity is no Problem for Bayesians

  @article{dowe2007bayes,
    title={Bayes not Bust! Why Simplicity is no Problem for Bayesians},
    author={David L. Dowe and Steve Gardner and Graham Oppy},
    journal={The British Journal for the Philosophy of Science},
    pages={709--754},
    year={2007}
  }
The advent of formal definitions of the simplicity of a theory has important implications for model selection. But what is the best way to define simplicity? Forster and Sober ([1994]) advocate the use of Akaike's Information Criterion (AIC), a non-Bayesian formalisation of the notion of simplicity. This forms an important part of their wider attack on Bayesianism in the philosophy of science. We defend a Bayesian alternative: the simplicity of a theory is to be characterised in terms of… 
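For reference, the criterion the abstract contrasts with the Bayesian alternative has a simple closed form: Akaike's Information Criterion penalises a model's maximised likelihood \(\hat{L}\) by its number of adjustable parameters \(k\),

```latex
\mathrm{AIC} = 2k - 2\ln\hat{L},
```

with the lower-AIC model preferred. The \(2k\) term is the formal weight AIC gives to simplicity understood as paucity of adjustable parameters, which is precisely the characterisation of simplicity at issue in the debate.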
What is a Bayesian Model Selection Procedure

This work argues that instead of the orthodox Bayesian approach, instrumental Bayesianism prevails in practice: the Bayesian framework is understood as a convenient mathematical machinery and conceptual toolbox, not as a philosophy of inductive inference opposed to frequentist reasoning.
MML Is Not Consistent for Neyman-Scott
  • M. Brand
  • Computer Science
    IEEE Transactions on Information Theory
  • 2020
This work investigates the Neyman-Scott estimation problem, an oft-cited showcase for the consistency of MML, and shows that even with a natural choice of prior neither SMML nor its popular approximations are consistent for it, thereby providing a counterexample to the general claim.
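The Neyman–Scott problem the entry refers to is the classic setting where the number of nuisance parameters grows with the data: each group has its own mean, and only the variance is shared. A short simulation shows why consistency is hard there; the sketch below demonstrates the well-known inconsistency of maximum likelihood in this setup (the sample sizes, seed, and distributions are illustrative assumptions), not the MML analysis of the paper itself.

```python
import random

# Neyman-Scott setup: n groups, each with its own unknown mean mu_i,
# and a shared variance sigma^2. With a fixed m observations per group,
# the ML estimate of sigma^2 converges to sigma^2 * (m-1)/m, not sigma^2:
# the inconsistency that motivates using this problem as a test case.
random.seed(0)
sigma2 = 4.0
n, m = 20000, 2           # many groups, only two observations each
ss = 0.0
for _ in range(n):
    mu = random.gauss(0, 10)                       # nuisance parameter
    xs = [random.gauss(mu, sigma2 ** 0.5) for _ in range(m)]
    xbar = sum(xs) / m
    ss += sum((x - xbar) ** 2 for x in xs)
ml_est = ss / (n * m)     # ML estimate of the shared variance
print(round(ml_est, 2))   # near sigma^2/2 = 2.0, not the true 4.0
```

With m = 2 the ML estimate settles near half the true variance, however many groups are added, which is why estimators claimed to handle this problem consistently attract scrutiny.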
MML, hybrid Bayesian network graphical models, statistical consistency, invariance
Bayes and the simplicity principle in perception.
  • J. Feldman
  • Computer Science
    Psychological review
  • 2009
The algebraic approach brings out the compositional structure inherent in such spaces, showing how perceptual interpretations are composed from a lexicon of primitive perceptual descriptors.
AIC and the challenge of complexity: A case study from ecology.
The role of Bayesian philosophy within Bayesian model selection
It is argued that Bayesian model selection procedures are very diverse in their inferential target and their justification, and substantiates this claim by means of case studies on three selected procedures: MML, BIC and DIC.
The Weight of Simplicity in Statistical Model Comparison
  • Mathematics
  • 2010
The epistemic weight of simplicity in science has, in the last fifteen years, been extensively discussed in the framework of statistical model comparison. This paper defends three theses: First, it…
Coherence, Explanation, and Hypothesis Selection
  • D. H. Glass
  • Computer Science
    The British Journal for the Philosophy of Science
  • 2021
By overcoming some of the problems with the previous approach, this work provides a more adequate defence of IBE and suggests that IBE not only tracks truth but also has practical advantages over the previous approaches.
Bayesian naturalness, simplicity, and testability applied to the $B-L$ MSSM GUT using GPU Monte Carlo
The lack of decisive experimental results to settle fundamental questions in physics has led to increasing reliance upon intuitive criteria such as naturalness, simplicity, and testability. We argue…
Evidentiary inference in evolutionary biology
Evidence and Evolution compiles and integrates many of Sober’s recent publications and functions as an excellent detachable introduction to central topics in formal epistemology.
Simplicity, Inference and Modelling
The idea that simplicity matters in science is as old as science itself, with the much cited example of Ockham's Razor, 'entia non sunt multiplicanda praeter necessitatem': entities are not to be multiplied beyond necessity.
Model Selection in Science: The Problem of Language Variance
  • M. Forster
  • Philosophy
    The British Journal for the Philosophy of Science
  • 1999
Recent solutions to the curve-fitting problem, described in Forster and Sober ([1995]), trade off the simplicity and fit of hypotheses by defining simplicity as the paucity of adjustable parameters.
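The trade-off this entry describes, between fit and the paucity of adjustable parameters, can be made concrete with a toy comparison: a model with more parameters always fits at least as well, so a penalty term decides when the extra parameter is worth it. The sketch below uses the standard Gaussian-likelihood AIC; the data, models, and sample size are illustrative assumptions.

```python
import math
import random

# Compare a 1-parameter model (constant) with a 2-parameter model (line)
# on data that is truly linear plus noise, trading fit against simplicity
# via AIC = 2k - 2 ln(L-hat).
random.seed(1)
xs = [i / 10 for i in range(50)]
ys = [2.0 + 1.5 * x + random.gauss(0, 1.0) for x in xs]

def gaussian_aic(residuals, k):
    """AIC for a Gaussian model with ML variance; +1 counts the variance."""
    n = len(residuals)
    rss = sum(r * r for r in residuals)
    log_lik = -0.5 * n * (math.log(2 * math.pi * rss / n) + 1)
    return 2 * (k + 1) - 2 * log_lik

# Model 1: constant mean (one adjustable parameter).
mean_y = sum(ys) / len(ys)
aic_const = gaussian_aic([y - mean_y for y in ys], k=1)

# Model 2: straight line via closed-form least squares (two parameters).
mean_x = sum(xs) / len(xs)
sxx = sum((x - mean_x) ** 2 for x in xs)
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
b = sxy / sxx
a = mean_y - b * mean_x
aic_line = gaussian_aic([y - (a + b * x) for x, y in zip(xs, ys)], k=2)

print(aic_line < aic_const)  # here the extra parameter earns its penalty
```

On data with a genuine trend the line wins despite its extra parameter; on pure noise the constant model's smaller penalty would carry the comparison, which is the curve-fitting trade-off in miniature.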
Bayes and Bust: Simplicity as a Problem for a Probabilist's Approach to Confirmation
  • M. Forster
  • Philosophy
    The British Journal for the Philosophy of Science
  • 1995
The central problem with Bayesian philosophy of science is that it cannot take account of the relevance of simplicity and unification to confirmation, induction, and scientific inference. The…
MML mixture modelling of multi-state, Poisson, von Mises circular and Gaussian distributions
The MML theory can be regarded as the theory with the highest posterior probability, and the MML mixture modelling program, Snob, uses the message lengths from various parameter estimates to enable it to combine parameter estimation with selection of the number of components.
How to Tell When Simpler, More Unified, or Less Ad Hoc Theories will Provide More Accurate Predictions
It is argued that this approach throws light on the theoretical virtues of parsimoniousness, unification, and non ad hocness, on the dispute about Bayesianism, and on empiricism and scientific realism.
Key Concepts in Model Selection: Performance and Generalizability.
  • M. Forster
  • Biology
    Journal of mathematical psychology
  • 2000
It seems that simplicity and parsimony may be an additional factor in managing these errors, in which case the standard methods of model selection are incomplete implementations of Occam's razor.
MML clustering of multi-state, Poisson, von Mises circular and Gaussian distributions
This work outlines how MML is used for statistical parameter estimation, and how the MML mixture modelling program, Snob, uses the message lengths from various parameter estimates to enable it to combine parameter estimation with selection of the number of components and estimation of the relative abundances of the components.
Refinements of MDL and MML Coding
We discuss Rissanen’s scheme of ‘complete coding’ in which a two-part data code is further shortened by conditioning the second part not only on the estimates, but also on the fact that these
MML Estimation of the Parameters of the Spherical Fisher Distribution
This work applies the information-theoretic Minimum Message Length principle to the problem of estimating the concentration parameter, κ, of spherical Fisher distributions, and shows that the MML estimator compares quite favourably against alternative Bayesian methods.
Minimum Message Length and Kolmogorov Complexity
This work attempts to establish a parallel between a restricted (two-part) version of the Kolmogorov model and the minimum message length approach to statistical inference and machine learning of Wallace and Boulton (1968), in which an ‘explanation’ of a data string is modelled as a two-part message.
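The two-part message idea described in this entry can be sketched numerically: the first part of the message states a model (a parameter to finite precision), the second encodes the data given that model, and the hypothesis with the shortest total message is preferred. The toy code below illustrates this for Bernoulli data; the fixed-precision grid and the codelength bookkeeping are simplifying assumptions, not Wallace and Boulton's exact construction.

```python
import math

# Toy two-part message length for Bernoulli data:
#   total = bits to state the parameter (part 1)
#         + bits to encode the data given it (part 2).
# A simpler model states less in part 1 but may pay more in part 2.
def two_part_length(data, p, precision_bits=6):
    part1 = precision_bits                   # stating p on a 2^bits grid
    ones = sum(data)
    zeros = len(data) - ones
    eps = 1e-12                              # guard against log2(0)
    part2 = -(ones * math.log2(p + eps) + zeros * math.log2(1 - p + eps))
    return part1 + part2

data = [1] * 70 + [0] * 30                   # 70 heads, 30 tails
grid = [i / 64 for i in range(1, 64)]        # 6-bit parameter grid
best_p = min(grid, key=lambda p: two_part_length(data, p))

# "Fair coin" hypothesis: p = 1/2 fixed in advance, nothing to state.
fair = two_part_length(data, 0.5, precision_bits=0)

print(round(best_p, 3), two_part_length(data, best_p) < fair)
```

Here the data are biased enough that paying six bits to state a parameter near 0.7 yields a shorter total message than the parameter-free fair-coin model, which is the two-part trade-off between model complexity and fit in its simplest form.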