Information Theory and Statistical Mechanics

@article{Jaynes1957InformationTA,
  title={Information Theory and Statistical Mechanics},
  author={Edwin T. Jaynes},
  journal={Physical Review},
  year={1957},
  volume={106},
  pages={620-630}
}
  • E. Jaynes
  • Published 15 October 1957
  • Physics
  • Physical Review
Treatment of the predictive aspect of statistical mechanics as a form of statistical inference is extended to the density-matrix formalism and applied to a discussion of the relation between irreversibility and information loss. A principle of "statistical complementarity" is pointed out, according to which the empirically verifiable probabilities of statistical mechanics necessarily correspond to incomplete predictions. A preliminary discussion is given of the second law of thermodynamics and… 
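For orientation, the maximum-entropy inference that this programme rests on can be stated in the density-matrix formalism roughly as follows (standard textbook notation, not quoted from the paper: \rho is the density matrix, \hat{H} the Hamiltonian, \beta the Lagrange multiplier conjugate to the mean energy):

  \max_{\rho} \; S[\rho] = -\mathrm{Tr}(\rho \ln \rho)
  \quad \text{subject to} \quad \mathrm{Tr}\,\rho = 1, \;\; \mathrm{Tr}(\rho \hat{H}) = \langle E \rangle,

with the maximizer

  \rho = \frac{e^{-\beta \hat{H}}}{Z(\beta)}, \qquad Z(\beta) = \mathrm{Tr}\, e^{-\beta \hat{H}}, \qquad -\frac{\partial \ln Z}{\partial \beta} = \langle E \rangle .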


Irreversibility, Probability and Entropy
There is a single consistent resolution of the reversible-microdynamics/irreversible-macrodynamics problem, stemming from a better understanding of the role of probability in physics: the Bayesian…
Thermodynamics and Information Theory
Statistical Mechanics and the Maximum Entropy Method
This course reviews the foundations and methods of statistical mechanics in their relation to the maximum entropy principle and shows that two methods currently used in this respect, based on the principle of indifference and on the principle of maximum statistical entropy, respectively, are equivalent.
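As a concrete illustration of the maximum statistical entropy principle discussed above, here is a minimal numerical sketch (illustrative only: the energy levels, the target mean energy, and the variable names are made up for this example, not taken from any of the cited papers). It finds the Lagrange multiplier beta for which the Boltzmann distribution over a few discrete levels reproduces a prescribed mean energy; that distribution is exactly the one maximizing the Shannon entropy under the constraint.

import numpy as np
from scipy.optimize import brentq

# Illustrative (made-up) energy levels and target mean energy. The MaxEnt
# distribution under a mean-energy constraint is the Boltzmann distribution
# p_i ∝ exp(-beta * E_i), with beta the Lagrange multiplier.
energies = np.array([0.0, 1.0, 2.0, 3.0])
target_mean_energy = 1.2

def mean_energy(beta):
    # Mean energy of the Boltzmann distribution at a given beta.
    weights = np.exp(-beta * energies)
    p = weights / weights.sum()
    return p @ energies

# Solve for the beta that reproduces the prescribed mean energy.
beta = brentq(lambda b: mean_energy(b) - target_mean_energy, -50.0, 50.0)

p = np.exp(-beta * energies)
p /= p.sum()
shannon_entropy = -np.sum(p * np.log(p))
print(f"beta = {beta:.4f}, p = {p}, entropy = {shannon_entropy:.4f}")

The same construction with a density matrix in place of a probability vector recovers the canonical ensemble sketched after the abstract above.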
The Three Phases of Statistical Mechanics
The foundations of statistical mechanics are reviewed, based on the principle of maximum entropy, and this principle is shown to underlie the fundamental mechanisms of both equilibrium and…
An identity of Chernoff bounds with an interpretation in statistical physics and applications in information theory
  • N. Merhav
  • Physics
  • 2008 IEEE International Symposium on Information Theory
  • 2008
Several information-theoretic application examples are described, where the analysis of the large-deviations rate function of the probability of a certain rare event naturally arises, resulting in several relationships between information theory and statistical physics.
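For context, a standard large-deviations identity (not quoted from Merhav's paper) makes this connection explicit: the Chernoff bound exponent is the Legendre transform of the log-moment generating function, which plays the formal role of a log-partition function (a negative free energy), with the tilting parameter s acting as an inverse temperature:

  \Pr\{S_n \ge n a\} \le e^{-n I(a)}, \qquad I(a) = \sup_{s \ge 0}\,\bigl[\, s a - \ln \mathbb{E}\, e^{s X} \,\bigr],

where S_n is a sum of n i.i.d. copies of X.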
Entropic Formulation of Statistical Mechanics
We present an alternative formulation of Equilibrium Statistical Mechanics which follows the method based on the maximum statistical entropy principle in Information Theory, combined with the use of…
Maximum information entropy principle and the interpretation of probabilities in statistical mechanics – a short review
It is shown that the relative frequencies of the ensemble of systems prepared under identical conditions actually correspond to the MaxEnt probabilities in the limit of a large number of systems in the ensemble, implying that the probabilities in statistical mechanics can be interpreted, independently of the frequency interpretation, on the basis of the maximum information entropy principle.
Foundations of statistical mechanics
Developments in the foundations of statistical mechanics during the past ten years or so are reviewed. The author discusses how statistical concepts enter the treatment of deterministic mechanical systems, with…
Foundations of Statistical Mechanics: in and out of Equilibrium
The first part of the paper is devoted to the foundations, that is, the mathematical and physical justification, of equilibrium statistical mechanics. It is a pedagogical attempt, mostly based on…