Maximum information entropy principle and the interpretation of probabilities in statistical mechanics − a short review

  • Domagoj Kuic
  • Published 16 May 2016
  • Mathematics, Physics
  • The European Physical Journal B
Abstract
In this paper an alternative approach to statistical mechanics based on the maximum information entropy principle (MaxEnt) is examined, specifically its close relation with the Gibbs method of ensembles. It is shown that the MaxEnt formalism is the logical extension of the Gibbs formalism of equilibrium statistical mechanics, and is entirely independent of the frequentist interpretation, which regards probabilities only as factual (i.e. experimentally verifiable) properties of the real world…
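The core of the MaxEnt principle discussed in the abstract can be illustrated numerically: among all probability distributions over a set of energy levels with a fixed mean energy, the one maximizing the Shannon entropy is the Gibbs/Boltzmann distribution. The following is a minimal sketch with a hypothetical four-level toy system (the energy values and target mean energy are illustrative assumptions, not from the paper):

```python
import numpy as np

# Hypothetical toy system: four discrete energy levels and a target
# mean energy constraint <E> = 1.0.
E = np.array([0.0, 1.0, 2.0, 3.0])
target_mean = 1.0

def gibbs(beta):
    """Boltzmann distribution p_i proportional to exp(-beta * E_i)."""
    w = np.exp(-beta * E)
    return w / w.sum()

# The mean energy <E>(beta) decreases monotonically in beta, so the
# Lagrange multiplier beta enforcing the constraint can be found by bisection.
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if gibbs(mid) @ E > target_mean:
        lo = mid  # mean too high: need larger beta
    else:
        hi = mid
beta = 0.5 * (lo + hi)
p = gibbs(beta)

# Shannon entropy of the constrained maximizer.
entropy = -(p * np.log(p)).sum()
print(np.round(p, 4), p @ E, entropy)
```

Any other distribution satisfying the same mean-energy constraint has strictly lower entropy, which is the sense in which the Gibbs ensemble is the least-biased inference from the given data.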
1 Citation
Paradigms of Cognition
An abstract, quantitative theory connecting elements of information (key ingredients in the cognitive process) is developed, and seemingly unrelated results are thereby unified, providing a general framework for the treatment of a multitude of global optimization problems across a range of disciplines such as geometry, statistics, and statistical physics.


References
Predictive Statistical Mechanics and Macroscopic Time Evolution: Hydrodynamics and Entropy Production
In the previous papers (Kuić et al. in Found Phys 42:319–339, 2012; Kuić in arXiv:1506.02622, 2015), it was demonstrated that applying the principle of maximum information entropy by maximizing the…
Information Theory and Statistical Mechanics
Treatment of the predictive aspect of statistical mechanics as a form of statistical inference is extended to the density-matrix formalism and applied to a discussion of the relation between…
Predictive statistical mechanics and macroscopic time evolution. A model for closed Hamiltonian systems
Predictive statistical mechanics is a form of inference from available data, without additional assumptions, for predicting reproducible phenomena. By applying it to systems with Hamiltonian…
Gibbs vs Boltzmann Entropies
The status of the Gibbs and Boltzmann expressions for entropy has been a matter of some confusion in the literature. We show that: (1) the Gibbs H function yields the correct entropy as defined in…
Probability theory: the logic of science
This is a remarkable book by a remarkable scientist. E. T. Jaynes was a physicist, principally theoretical, who found himself driven to spend much of his life advocating, defending and developing a…
Entropy production fluctuation theorem and the nonequilibrium work relation for free energy differences.
  • G. Crooks
  • Physics, Medicine
    Physical review. E, Statistical physics, plasmas, fluids, and related interdisciplinary topics
  • 1999
A generalized version of the fluctuation theorem is derived for stochastic, microscopically reversible dynamics and this generalized theorem provides a succinct proof of the nonequilibrium work relation.
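The generalized fluctuation theorem referred to here is commonly written as a relation between the forward and reverse work distributions; averaging it recovers the Jarzynski nonequilibrium work relation, which is the "succinct proof" mentioned. In its standard form (with β the inverse temperature, W the work, and ΔF the free-energy difference):

```latex
\frac{P_F(+W)}{P_R(-W)} = e^{\beta (W - \Delta F)},
\qquad
\left\langle e^{-\beta W} \right\rangle = e^{-\beta \Delta F}
```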
Time Evolution in Macroscopic Systems. II. The Entropy
The concept of entropy in nonequilibrium macroscopic systems is investigated in the light of an extended equation of motion for the density matrix obtained in a previous study. It is found that a…
Fluctuation theorem for arbitrary open quantum systems.
The validity of the Crooks theorem and of the Jarzynski equality is extended to open quantum systems, because the thermodynamic equilibrium free energy of an open quantum system in contact with a thermal environment is the difference between the free energy of the total system and that of the bare environment.
Macroscopic Time Evolution and MaxEnt Inference for Closed Systems with Hamiltonian Dynamics
The MaxEnt inference algorithm and information theory are relevant to the time evolution of macroscopic systems, considered as a problem of incomplete information. Two different MaxEnt approaches are…
An introduction to chaos in nonequilibrium statistical mechanics
Preface 1. Non-equilibrium statistical mechanics 2. The Boltzmann equation 3. Liouville's equation 4. Poincaré recurrence theorem 5. Boltzmann's ergodic hypothesis 6. Gibbs' picture-mixing systems 7. …