The Brandeis Dice Problem and Statistical Mechanics

@article{Enk2014TheBD,
  title={The Brandeis Dice Problem and Statistical Mechanics},
  author={Steven J. van Enk},
  journal={Studies in History and Philosophy of Modern Physics},
  year={2014},
  volume={48},
  pages={1-6}
}
  • S. J. van Enk
  • Published 28 August 2014
  • Physics
  • Studies in History and Philosophy of Modern Physics
5 Citations

Exploring Biased Probability Using Loaded Dice: An Active Learning Exercise with Analogy to Entropic and Energetic Determinants of Equilibria in Chemical Systems

The equilibrium position of a chemical system is based on two distinct considerations, namely, the entropy and enthalpy differences between the states. Loaded dice provide a physical system for …
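
The entropy/enthalpy balance behind that equilibrium position can be summarized, as a standard textbook sketch rather than anything taken from the exercise itself, by the Gibbs relation

  \Delta G^{\circ} = \Delta H^{\circ} - T\,\Delta S^{\circ}, \qquad K = \exp\!\left(-\Delta G^{\circ}/RT\right),

so a state can be favored either energetically (enthalpy) or combinatorially (entropy), which is the trade-off the loaded dice are meant to make tangible.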

Bargaining with entropy and energy

TLDR
This work predicts thermalization of a non-equilibrium statistical system by employing the axiom of affine covariance, related to the freedom of changing initial points and dimensions for entropy and energy, together with the contraction invariance of the entropy-energy diagram.

Geometric probability theory and Jaynes’s methodology

We provide a generalization of the approach to geometric probability advanced by the great mathematician Gian Carlo Rota, in order to apply it to generalized probabilistic physical theories. …

References

Showing 1-10 of 16 references

What is Probability

Probabilities may be subjective or objective; we are concerned with both kinds of probability, and the relationship between them. The fundamental theory of objective probability is quantum mechanics: …

Updating, supposing, and maxent

Conclusion: The philosophical controversy concerning the logical status of MAXENT may be in large measure due to the conflation of two distinct logical roles: (1) a general inductive principle for …

Maximum entropy and Bayesian data analysis: Entropic prior distributions.

  • A. Caticha, R. Preuss
  • Computer Science
    Physical Review E: Statistical, Nonlinear, and Soft Matter Physics
  • 2004
TLDR
The method of maximum (relative) entropy (ME) is used to translate the information contained in the known form of the likelihood into a prior distribution for Bayesian inference.
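
For orientation, and as a generic statement of the ME formalism rather than the paper's specific construction, the quantity being maximized is the relative entropy of a candidate distribution p with respect to a prior q,

  S[p \mid q] = -\int \mathrm{d}x\; p(x)\,\ln\frac{p(x)}{q(x)},

subject to constraints encoding the new information; with no constraints the maximum sits at p = q, so the prior is left unchanged.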

Some random observations

Conclusion: Of course, the rationale of PME is so different from what has been taught in “orthodox” statistics courses for fifty years that it causes conceptual hangups for many with conventional …

Jaynes's maximum entropy prescription and probability theory

Jaynes's prescription of maximizing the information-theoretic entropy is applied in a special situation to determine a certain set of posterior probabilities (when evidence fixing the expected value …
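
The "evidence fixing the expected value" case is exactly the Brandeis dice setup of van Enk's title: maximize the Shannon entropy of p_1, ..., p_6 subject to a prescribed mean (Jaynes used 4.5). A minimal numerical sketch, written here for illustration only (it is not code from either paper, and it assumes NumPy and SciPy are available):

# Brandeis dice: maximize -sum(p*log(p)) over p_1..p_6 subject to <n> = 4.5.
# The MaxEnt solution is exponential, p_n = exp(-lam*n)/Z(lam); we solve
# numerically for the Lagrange multiplier lam that meets the constraint.
import numpy as np
from scipy.optimize import brentq

faces = np.arange(1, 7)      # die faces 1..6
target_mean = 4.5            # Jaynes's constraint (a fair die gives 3.5)

def mean_of(lam):
    w = np.exp(-lam * faces)             # unnormalized exponential weights
    return np.dot(faces, w) / w.sum()    # mean of p_n = w_n / Z(lam)

# mean_of is monotone in lam and spans (1, 6), so a wide bracket suffices.
lam = brentq(lambda l: mean_of(l) - target_mean, -10.0, 10.0)

p = np.exp(-lam * faces)
p /= p.sum()
print(np.round(p, 3))           # approx. 0.054, 0.079, 0.114, 0.165, 0.240, 0.347
print(-np.dot(p, np.log(p)))    # entropy of the constrained distribution

The result reproduces the familiar solution of the problem, with probability shifted toward the high faces just enough to raise the mean from 3.5 to 4.5.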

Higher order probabilities and intervals

  • H. Kyburg
  • Mathematics
    International Journal of Approximate Reasoning
  • 1988

Information Theory and Statistical Mechanics

Treatment of the predictive aspect of statistical mechanics as a form of statistical inference is extended to the density-matrix formalism and applied to a discussion of the relation between …
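
As a reminder of what the density-matrix form of the prescription amounts to (standard material summarized here, not quoted from the paper): maximizing the von Neumann entropy S = -Tr(ρ ln ρ) subject to a fixed expectation Tr(ρH) = ⟨H⟩ yields

  \rho = \frac{e^{-\lambda H}}{\mathrm{Tr}\, e^{-\lambda H}},

which becomes the canonical ensemble once the multiplier is identified with the inverse temperature, \lambda = 1/kT.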

On the relation between plausibility logic and the maximum-entropy principle: a numerical study

TLDR
A numerical comparison of the plausibility distributions given by the maximum-entropy principle and by plausibility logic for a set of fifteen simple problems involving throwing dice.

Probability, Information and Entropy

The claim that information theory provides a foundation for statistical thermodynamics which is independent of the use of ensembles can be sustained only if probabilities can be determined …