Variability as a better characterization of Shannon entropy
  • Gabriele Carcassi, Christine A. Aidala, Julian B. Barbour
  • European Journal of Physics
The Shannon entropy, one of the cornerstones of information theory, is widely used in physics, particularly in statistical mechanics. Yet its characterization and connection to physics remain vague, leaving ample room for misconceptions and misunderstanding. We will show that the Shannon entropy can be fully understood as measuring the variability of the elements within a given distribution: it characterizes how much variation can be found within a collection of objects. We will see that it is…
Superdeterministic hidden-variables models II: arbitrary conspiracy
We prove that superdeterministic models of quantum mechanics are arbitrarily conspiratorial in a mathematically well-defined sense, by further development of the ideas presented in a previous article.
Understanding physics: ‘What?’, ‘Why?’, and ‘How?’
  • Mario Hubert
  • European Journal for Philosophy of Science
  • 2021
I want to combine two hitherto largely independent research projects, scientific understanding and mechanistic explanations. Understanding is not only achieved by answering why-questions, that is, by…
Gibbs vs Boltzmann Entropies
The status of the Gibbs and Boltzmann expressions for entropy has been a matter of some confusion in the literature. We show that: (1) the Gibbs H function yields the correct entropy as defined in…
Relative entropy, Haar measures and relativistic canonical velocity distributions
The thermodynamic maximum principle for the Boltzmann–Gibbs–Shannon (BGS) entropy is reconsidered by combining elements from group and measure theory. Our analysis starts by noting that the BGS…
The Gibbs Paradox
We point out that an early work of J. Willard Gibbs (1875) contains a correct analysis of the “Gibbs Paradox” about entropy of mixing, free of any elements of mystery and directly connected to…
Hamiltonian mechanics is conservation of information entropy
Time evolution of entropy, in various scenarios
While undergraduate texts describe the change in entropy ΔS from time t = 0 to t = ∞, they do not speak about exactly how ΔS(t) ≡ S(t) − S(0), the change in entropy up to time t,…
Entropy as Disorder: History of a Misconception
Before reading this essay, go to your kitchen and find a bottle of Italian salad dressing. Get one that’s been sitting still for a while at a fixed temperature—that is, one in thermal equilibrium.
Entropy and the second law: A pedagogical alternative
Thermodynamics in introductory physics typically begins with the ideas of temperature and heating and with the first law of thermodynamics. Then the more difficult parts come: entropy and the second law…
The different paths to entropy
To understand how the complex concept of entropy emerged, a trip into the past is proposed, reviewing the works of Clausius, Boltzmann, Gibbs and Planck and recalling the three definitions of entropy that Gibbs gives.
A formal derivation of the Gibbs entropy for classical systems following the Schrödinger quantum mechanical approach
In the traditional statistical mechanics textbooks, the entropy concept is first introduced for the microcanonical ensemble and then extended to the canonical and grand-canonical cases. However, in…