Shannon Entropy Rate of Hidden Markov Processes

@article{Jurgens2021ShannonER,
  title={Shannon Entropy Rate of Hidden Markov Processes},
  author={Alexandra M. Jurgens and James P. Crutchfield},
  journal={arXiv preprint arXiv:2008.12886},
  year={2021}
}
Hidden Markov chains are widely applied statistical models of stochastic processes, from fundamental physics and chemistry to finance, health, and artificial intelligence. The hidden Markov processes they generate are notoriously complicated, however, even if the chain is finite state: no finite expression for their Shannon entropy rate exists, as the set of their predictive features is generically infinite. As such, to date one cannot make general statements about how random they are nor how… 
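Since, as the abstract notes, no finite closed-form expression for the entropy rate exists, it is commonly estimated numerically. A minimal sketch (toy parameters of my own choosing, not the paper's method): simulate a small hidden Markov chain and estimate the entropy rate as the difference of successive empirical block entropies, h ≈ H(L) − H(L−1).

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy 2-state hidden Markov chain emitting binary symbols.
T = np.array([[0.9, 0.1],   # state-to-state transition probabilities
              [0.4, 0.6]])
E = np.array([[0.8, 0.2],   # P(symbol | state) for symbols 0 and 1
              [0.3, 0.7]])

def simulate(n, T, E, rng):
    """Generate n output symbols from the hidden Markov chain."""
    s = 0
    out = np.empty(n, dtype=int)
    for i in range(n):
        out[i] = rng.choice(2, p=E[s])  # emit from the current state
        s = rng.choice(2, p=T[s])       # then take a state transition
    return out

def block_entropy(x, L):
    """Shannon entropy (bits) of the empirical length-L block distribution."""
    blocks = [tuple(x[i:i + L]) for i in range(len(x) - L + 1)]
    _, counts = np.unique(blocks, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

x = simulate(100_000, T, E, rng)
for L in (4, 6, 8):
    h = block_entropy(x, L) - block_entropy(x, L - 1)
    print(f"L={L}  h ≈ {h:.4f} bits/symbol")
```

The successive estimates approach the true entropy rate from above as L grows; in practice the usable block length is limited by the sample size, which is precisely why exact methods such as the one in this paper are of interest.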

Divergent Predictive States: The Statistical Complexity Dimension of Stationary, Ergodic Hidden Markov Processes

TLDR
This work addresses the complementary challenge of determining how structured hidden Markov processes are by calculating their statistical complexity dimension—the information dimension of the minimal set of predictive features.

Ambiguity rate of hidden Markov processes

TLDR
The ambiguity rate is demonstrated to be the (until now missing) correction to the Lyapunov dimension of an iterated function system's (IFS's) attracting invariant set, which allows calculating their statistical complexity dimension—the information dimension of the minimal set of predictive features.

Functional thermodynamics of Maxwellian ratchets: Constructing and deconstructing patterns, randomizing and derandomizing behaviors

TLDR
Recent results from dynamical-systems and ergodic theory are adapted to efficiently and accurately calculate the entropy rates, and the rates of statistical-complexity divergence, of general hidden Markov processes; these determine the thermodynamic operating regimes of finite-state Maxwellian demons with arbitrary numbers of states and transitions.

Breakdown of random matrix universality in Markov models.

TLDR
It is shown that the data can be quantitatively understood in terms of the random model, and that brain activity lies close to the phase transition when engaged in unconstrained, task-free cognition, supporting the brain-criticality hypothesis in this context.

Discovering Causal Structure with Reproducing-Kernel Hilbert Space ε-Machines

TLDR
A widely applicable method is presented that infers causal structure directly from observations of a system's behaviors, whether over discrete or continuous events or time, and that robustly estimates causal structure in the presence of varying external and measurement noise levels and for very high-dimensional data.

Optimality and Complexity in Measured Quantum-State Stochastic Processes

Temporal sequences of quantum states are essential to quantum computation protocols, as used in quantum key distribution, and to quantum computing implementations, as witnessed by substantial efforts

Microseismic event detection in noisy environments with instantaneous spectral Shannon entropy.

TLDR
This work proposes a robust methodology based on the instantaneous-spectral Shannon entropy for capturing microseismic events in noisy environments without the requirement of data preprocessing.

Engines for predictive work extraction from memoryful quantum stochastic processes

TLDR
A phase transition, without classical precedent, is discovered in the efficacy of knowledge for work extraction from quantum processes, opening up the prospect of machines that harness environmental free energy in an essentially quantum, essentially time-varying form.

Estimating Entropy Production from Waiting Time Distributions.

TLDR
A novel method is introduced for bounding the entropy production of physical and living systems that uses only the waiting-time statistics of hidden Markov processes and can be applied directly to experimental data.

Improved bounds on entropy production in living systems

TLDR
By reformulating the problem within an optimization framework, the approach is able to infer improved bounds on the rate of entropy production from partial measurements of biological systems, and yields provably optimal estimates given certain measurable transition statistics.

References

SHOWING 1-10 OF 57 REFERENCES

Hidden Markov Process: A New Representation, Entropy Rate and Estimation Entropy

TLDR
A new representation of hidden Markov processes using iterated function systems provides a unified framework for analyzing the two limiting entropies of these processes, resulting in integral expressions for the limits.

Regularities unseen, randomness observed: levels of entropy convergence.

TLDR
Several phenomenological approaches to applying information-theoretic measures of randomness and memory to stochastic and deterministic processes are synthesized, using successive derivatives of the Shannon entropy growth curve to examine the relationship between a process's entropy-convergence behavior and its underlying computational structure.

Analyticity of Entropy Rate of Hidden Markov Chains

We prove that under mild positivity assumptions the entropy rate of a hidden Markov chain varies analytically as a function of the underlying Markov chain parameters. A general principle to determine

Hidden Markov processes

TLDR
An overview of statistical and information-theoretic aspects of hidden Markov processes (HMPs) is presented, and consistency and asymptotic normality of the maximum-likelihood parameter estimator are proved under mild conditions.

On the entropy rate of a hidden Markov model

TLDR
The central idea of this article is to replace the summation over all words of length n by a summation over a complete set of prefixes (a prefix set, for brevity) so as to balance the contributions of all words in the bounds.

Predictive Rate-Distortion for Infinite-Order Markov Processes

TLDR
This work casts predictive rate-distortion objective functions in terms of the forward- and reverse-time causal states of computational mechanics, and shows that the resulting algorithms yield substantial improvements.

Functional thermodynamics of Maxwellian ratchets: Constructing and deconstructing patterns, randomizing and derandomizing behaviors

TLDR
Recent results from dynamical-systems and ergodic theory are adapted to efficiently and accurately calculate the entropy rates, and the rates of statistical-complexity divergence, of general hidden Markov processes; these determine the thermodynamic operating regimes of finite-state Maxwellian demons with arbitrary numbers of states and transitions.

Spectral Simplicity of Apparent Complexity, Part II: Exact Complexities and Complexity Spectra

TLDR
The meromorphic functional calculus developed in Part I overcomes the nondiagonalizability of linear operators that often arises in the temporal evolution of complex systems and is generic to the metadynamics of predicting their behavior; it brings analytical tractability to investigating information processing in discrete-event stochastic processes, symbolic dynamics, and chaotic dynamical systems.

New bounds on the entropy rate of hidden Markov processes

TLDR
A new approach to bounding the entropy rate of {Z_t} is presented, which approximates the distribution of this random variable by constructing and studying a Markov process.
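For context, the classical conditional-entropy bounds that such approaches refine can be computed exactly by enumerating output words. A hedged sketch (the standard Cover–Thomas sandwich bounds with toy parameters, not the new bounds of this reference): H(Z_n | Z_1..Z_{n-1}, S_1) ≤ h ≤ H(Z_n | Z_1..Z_{n-1}).

```python
import itertools
import numpy as np

T = np.array([[0.9, 0.1], [0.4, 0.6]])   # toy state-transition matrix
E = np.array([[0.8, 0.2], [0.3, 0.7]])   # toy emission matrix P(z | s)

# Stationary distribution of T (left eigenvector for eigenvalue 1).
evals, evecs = np.linalg.eig(T.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

def word_prob(word, init):
    """P(word | S_1 ~ init) via the forward recursion."""
    alpha = init.copy()
    for z in word:
        alpha = (alpha * E[:, z]) @ T    # emit in current state, then step
    return alpha.sum()

def cond_entropy(n, init):
    """H(Z_n | Z_1..Z_{n-1}) in bits, given the initial state distribution."""
    Hn = Hn1 = 0.0
    for w in itertools.product(range(2), repeat=n):
        p = word_prob(w, init)
        if p > 0:
            Hn -= p * np.log2(p)
    for w in itertools.product(range(2), repeat=n - 1):
        p = word_prob(w, init)
        if p > 0:
            Hn1 -= p * np.log2(p)
    return Hn - Hn1

n = 8
upper = cond_entropy(n, pi)                                        # no state knowledge
lower = sum(pi[s] * cond_entropy(n, np.eye(2)[s]) for s in range(2))  # S_1 revealed
print(f"{lower:.4f} <= h <= {upper:.4f}")
```

Both bounds tighten monotonically as n grows, but the enumeration cost is exponential in n, which motivates the word-distribution approximations discussed in this reference.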

A Mathematical Theory of Communication

TLDR
It is proved that a positive data rate can be achieved with arbitrarily small error probability, and that there is an upper bound on the achievable rate: no encoding scheme can attain a rate above this bound while keeping the error probability small enough.
...