Corpus ID: 237260092

Duality Symmetry, Two Entropy Functions, and an Eigenvalue Problem in Gibbs' Theory

  Jeffrey Commons, Ying-Jen Yang, and Hong Qian
We generalize the convex duality symmetry in Gibbs' statistical ensemble formulation, between Massieu's free entropy Φ_{V,N}(β) and the Gibbs entropy φ_{V,N}(u) as a function of mean internal energy u. The duality tells us that Gibbs thermodynamic entropy is to the law of large numbers (LLN) for arithmetic sample means what Shannon's information entropy is to the LLN for empirical counting frequencies. Following the same logic, we identify u as the conjugate variable to counting frequency, a…
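The duality between the Massieu free entropy Φ(β) and the Gibbs entropy φ(u) is a Legendre-Fenchel conjugacy, φ(u) = inf_β [βu + Φ(β)]. A minimal numerical sketch (not from the paper; the two-level system with energies {0, 1} and the function names are illustrative) recovers the binary entropy from Φ(β) = ln(1 + e^{-β}):

```python
import numpy as np

# Two-level system with energies {0, 1}: Massieu free entropy
# Phi(beta) = ln Z(beta) = ln(1 + e^{-beta}).
def massieu(beta):
    return np.log1p(np.exp(-beta))

# Gibbs entropy as the Legendre-Fenchel conjugate,
# s(u) = inf_beta [beta*u + Phi(beta)], computed on a beta grid.
def gibbs_entropy(u, betas=np.linspace(-50.0, 50.0, 20001)):
    return np.min(betas * u + massieu(betas))

# For this system the conjugate is the binary entropy
# s(u) = -u ln u - (1-u) ln(1-u); check at u = 0.3.
u = 0.3
analytic = -u * np.log(u) - (1 - u) * np.log(1 - u)
numeric = gibbs_entropy(u)
assert abs(numeric - analytic) < 1e-4
```

Because Φ is convex (Φ'' = Var(E) ≥ 0), the infimum is attained where Φ'(β) = -u, i.e. where the ensemble mean energy equals u, which is the usual thermodynamic relation.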
Thermodynamic Behavior of Statistical Event Counting in Time: Independent and Correlated Measurements
We introduce an entropy analysis of time series, repeated measurements of statistical observables, based on an Eulerian homogeneous degree-one entropy function Φ(t, n) of time t and number of events …
Statistical Thermodynamics and Data Infinitum: Conjugate Variables as Forces, and their Statistical Variations
Following ideas of Szilard, Mandelbrot and Hill, we show that a statistical thermodynamic structure can emerge purely from the infinitely large data limit under a probabilistic framework independent …
On the Posterior Distribution of a Random Process Conditioned on Empirical Frequencies of a Finite Path: the i.i.d. and finite Markov chain cases
We obtain the posterior distribution of a random process conditioned on observing the empirical frequencies of a finite sample path. We find under a rather broad assumption on the “dependence …
Legendre-Fenchel transforms capture layering transitions in porous media
We have investigated the state of a nanoconfined fluid in a slit pore in the canonical and isobaric ensembles. The systems were simulated with molecular dynamics simulations. The fluid has...
Emergence and Breaking of Duality Symmetry in Thermodynamic Behavior: Repeated Measurements and Macroscopic Limit
Thermodynamic laws are limiting behavior of the statistics of repeated measurements of an arbitrary system with a priori probability distribution. A duality symmetry arises between …
Minimum entropy production principle from a dynamical fluctuation law
The minimum entropy production principle provides an approximative variational characterization of close-to-equilibrium stationary states, both for macroscopic systems and for stochastic models.
Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy
Jaynes's principle of maximum entropy and Kullback's principle of minimum cross-entropy (minimum directed divergence) are shown to be uniquely correct methods for inductive inference when new …
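Under Jaynes's principle, the entropy maximizer subject to a mean constraint takes the Gibbs exponential form p_i ∝ e^{-λ x_i}. A small numerical sketch (the die example, values, and function names are illustrative, not from the paper) solves for λ by bisection:

```python
import numpy as np

# MaxEnt on the faces {1,...,6} of a die with a prescribed mean m:
# the entropy maximizer is p_i proportional to exp(-lam * i).
x = np.arange(1, 7)

def mean_of(lam):
    w = np.exp(-lam * x)
    p = w / w.sum()
    return (p * x).sum()

def maxent_die(m, lo=-50.0, hi=50.0):
    # mean_of is decreasing in lam, so plain bisection converges
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_of(mid) > m:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = np.exp(-lam * x)
    return w / w.sum()

# A mean of 4.5 (above the uniform mean 3.5) forces lam < 0,
# tilting weight toward the high faces.
p = maxent_die(4.5)
assert abs((p * x).sum() - 4.5) < 1e-9
```

The same tilting construction underlies the minimum cross-entropy update: replacing the uniform prior weights with a prior q_i and minimizing KL(p‖q) under the constraint yields p_i ∝ q_i e^{-λ x_i}.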
Statistical Theory of Equations of State and Phase Transitions. II. Lattice Gas and Ising Model
The problems of an Ising model in a magnetic field and a lattice gas are proved mathematically equivalent. From this equivalence an example of a two-dimensional lattice gas is given for which the …
Mesoscopic kinetic basis of macroscopic chemical thermodynamics: A mathematical theory.
It is proved that in the macroscopic limit by merely allowing the molecular numbers to be infinite, the generalized mesoscopic free energy F^{(meso)} converges to φ^{ss}, the large deviation rate function for the stationary distributions.
Intrinsic and Extrinsic Thermodynamics for Stochastic Population Processes with Multi-Level Large-Deviation Structure
The work is meant to encourage development of inherent thermodynamic descriptions for rule-based systems and the living state, which are not conceived as reductive explanations to heat flows.
Probability theory: the logic of science
Foreword Preface Part I. Principles and Elementary Applications: 1. Plausible reasoning 2. The quantitative rules 3. Elementary sampling theory 4. Elementary hypothesis testing 5. Queer uses for …
The Maximum Caliber Variational Principle for Nonequilibria.
Maximum caliber, a maximum-entropy-like principle that can infer distributions of flows over pathways given dynamical constraints, is reviewed; it is providing new insights into few-particle complex systems, including gene circuits, protein conformational reaction coordinates, network traffic, bird flocking, cell motility, and neuronal firing.
A new theorem of information theory
Consider a random experiment whose possible outcomes are z_1, z_2, …, z_n. Let the prior probabilities be p_1^0, …, p_n^0, and let the posterior probabilities be p_1, …, p_n. It is shown that, subject to …
Statistical Mechanics of Fluid Mixtures
Expressions for the chemical potentials of the components of gas mixtures and liquid solutions are obtained in terms of relatively simple integrals in the configuration spaces of molecular pairs. The …