# Shannon Entropy Rate of Hidden Markov Processes

```bibtex
@article{Jurgens2021ShannonER,
  title   = {Shannon Entropy Rate of Hidden Markov Processes},
  author  = {Alexandra M. Jurgens and James P. Crutchfield},
  journal = {ArXiv},
  year    = {2021},
  volume  = {abs/2008.12886}
}
```

Hidden Markov chains are widely applied statistical models of stochastic processes, from fundamental physics and chemistry to finance, health, and artificial intelligence. The hidden Markov processes they generate are notoriously complicated, however, even when the chain is finite-state: no finite expression for their Shannon entropy rate exists, as the set of their predictive features is generically infinite. As such, to date one cannot make general statements about how random they are nor how…
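Although no closed-form expression for the entropy rate exists, it can be estimated numerically by tracking the observer's belief (mixed state) over the hidden states and averaging the per-symbol surprisal. The following is a minimal sketch of that standard idea, not the paper's own algorithm; the two-state, two-symbol chain and its parameters are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Labeled transition matrices for a made-up 2-state hidden Markov chain:
# T[x][i, j] = probability of moving from hidden state i to j while
# emitting symbol x. Rows of T[0] + T[1] sum to 1.
T = {
    0: np.array([[0.4, 0.1],
                 [0.2, 0.0]]),
    1: np.array([[0.0, 0.5],
                 [0.3, 0.5]]),
}

def entropy_rate_estimate(T, n_steps=50_000):
    """Average surprisal -log2 P(x_t | past) along one sampled path."""
    n = next(iter(T.values())).shape[0]
    belief = np.full(n, 1.0 / n)        # mixed state over hidden states
    total = 0.0
    for _ in range(n_steps):
        # Probability of each next symbol given the current belief.
        p = np.array([belief @ T[x] @ np.ones(n) for x in T])
        x = rng.choice(len(p), p=p)     # sample the next symbol
        total += -np.log2(p[x])
        # Bayes update of the belief after observing symbol x.
        belief = (belief @ T[x]) / p[x]
    return total / n_steps

h = entropy_rate_estimate(T)            # bits per symbol, between 0 and 1
```

Each belief update is one application of the process's iterated function system, which is why the set of predictive features (the reachable beliefs) is generically infinite.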

## 10 Citations

### Divergent Predictive States: The Statistical Complexity Dimension of Stationary, Ergodic Hidden Markov Processes

- Computer Science, Chaos
- 2021

This work addresses the complementary challenge of determining how structured hidden Markov processes are by calculating their statistical complexity dimension—the information dimension of the minimal set of predictive features.

### Ambiguity rate of hidden Markov processes

- Computer Science, Physical Review E
- 2021

The ambiguity rate is demonstrated to be the (until now missing) correction to the Lyapunov dimension of an iterated function system's attracting invariant set, which allows calculating the statistical complexity dimension of hidden Markov processes: the information dimension of the minimal set of predictive features.

### Functional thermodynamics of Maxwellian ratchets: Constructing and deconstructing patterns, randomizing and derandomizing behaviors

- Computer Science, Physical Review Research
- 2020

Recent results from dynamical-systems and ergodic theory are adapted to efficiently and accurately calculate the entropy rate and the rate of statistical-complexity divergence of general hidden Markov processes; these in turn determine the thermodynamic operating regimes of finite-state Maxwellian demons with arbitrary numbers of states and transitions.

### Breakdown of random matrix universality in Markov models

- Computer Science, Physical Review E
- 2021

It is shown that the data can be quantitatively understood in terms of the random model, and that brain activity lies close to the phase transition when engaged in unconstrained, task-free cognition, supporting the brain-criticality hypothesis in this context.

### Discovering Causal Structure with Reproducing-Kernel Hilbert Space ε-Machines

- Computer Science, Chaos
- 2022

A widely applicable method is presented that infers causal structure directly from observations of a system's behavior, whether over discrete or continuous events or time, and that robustly estimates causal structure in the presence of varying external and measurement noise levels and for very high-dimensional data.

### Optimality and Complexity in Measured Quantum-State Stochastic Processes

- Physics
- 2022

Temporal sequences of quantum states are essential to quantum computation protocols, as used in quantum key distribution, and to quantum computing implementations, as witnessed by substantial efforts…

### Microseismic event detection in noisy environments with instantaneous spectral Shannon entropy

- Geology, Computer Science, Physical Review E
- 2022

This work proposes a robust methodology based on the instantaneous-spectral Shannon entropy for capturing microseismic events in noisy environments without the requirement of data preprocessing.

### Engines for predictive work extraction from memoryful quantum stochastic processes

- Computer Science
- 2022

A phase transition in the efficacy of knowledge for work extraction from quantum processes, without classical precedent, is discovered; this opens up the prospect of machines that harness environmental free energy in an essentially quantum, essentially time-varying form.

### Estimating Entropy Production from Waiting Time Distributions

- Computer Science, Physical Review Letters
- 2021

A novel method is introduced for bounding the entropy production of physical and living systems; it uses only the waiting-time statistics of hidden Markov processes and can be applied directly to experimental data.

### Improved bounds on entropy production in living systems

- Computer Science, Proceedings of the National Academy of Sciences
- 2021

By reformulating the problem within an optimization framework, the approach is able to infer improved bounds on the rate of entropy production from partial measurements of biological systems, and yields provably optimal estimates given certain measurable transition statistics.

## References

Showing 1–10 of 57 references

### Hidden Markov Process: A New Representation, Entropy Rate and Estimation Entropy

- Mathematics, ArXiv
- 2006

A new representation of hidden Markov processes using iterated function systems provides a unified framework for analyzing the two limiting entropies of this process, yielding integral expressions for the limits.

### Regularities unseen, randomness observed: levels of entropy convergence

- Computer Science, Chaos
- 2003

Several phenomenological approaches to applying information theoretic measures of randomness and memory to stochastic and deterministic processes are synthesized by using successive derivatives of the Shannon entropy growth curve to look at the relationships between a process's entropy convergence behavior and its underlying computational structure.
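The entropy-growth curve mentioned above is straightforward to estimate empirically: compute the block entropy H(L) of length-L words from a sample sequence and watch the gain h(L) = H(L) − H(L−1) settle toward the entropy rate. Below is a small illustrative sketch, not the paper's method, using a golden-mean process (no two consecutive 1s), whose entropy rate at emission probability 1/2 is 2/3 bit per symbol; all parameters are chosen for illustration.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(1)

def golden_mean_sample(n, p=0.5):
    """Emit 1 with probability p after a 0; never emit two 1s in a row."""
    out, prev = [], 0
    for _ in range(n):
        x = 0 if prev == 1 else int(rng.random() < p)
        out.append(x)
        prev = x
    return out

def block_entropy(seq, L):
    """Empirical Shannon entropy (bits) of length-L words in seq."""
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    total = sum(counts.values())
    probs = np.array([c / total for c in counts.values()])
    return float(-(probs * np.log2(probs)).sum())

seq = golden_mean_sample(100_000)
H = [0.0] + [block_entropy(seq, L) for L in range(1, 8)]
gains = [H[L] - H[L - 1] for L in range(1, 8)]  # h(L) estimates
```

For this process `gains` decreases from H(1) ≈ 0.918 toward the true rate of 2/3; how quickly h(L) converges is exactly the structural signature the paper's successive-derivative analysis studies.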

### Analyticity of Entropy Rate of Hidden Markov Chains

- Mathematics, IEEE Transactions on Information Theory
- 2006

We prove that under mild positivity assumptions the entropy rate of a hidden Markov chain varies analytically as a function of the underlying Markov chain parameters. A general principle to determine…

### Hidden Markov processes

- Mathematics, IEEE Transactions on Information Theory
- 2002

An overview of statistical and information-theoretic aspects of hidden Markov processes (HMPs) is presented, including proofs of consistency and asymptotic normality of the maximum-likelihood parameter estimator under mild conditions.

### On the entropy rate of a hidden Markov model

- Computer Science, International Symposium on Information Theory (ISIT 2004), Proceedings
- 2004

The central idea of this article is to replace the summation over all words of length n by a summation over a complete set of prefixes (a "prefix set" for brevity), so as to balance the contributions of all words in the bounds.

### Predictive Rate-Distortion for Infinite-Order Markov Processes

- Computer Science
- 2016

This work casts predictive rate-distortion objective functions in terms of the forward- and reverse-time causal states of computational mechanics, and shows that the resulting algorithms yield substantial improvements.

### Functional thermodynamics of Maxwellian ratchets: Constructing and deconstructing patterns, randomizing and derandomizing behaviors

- Computer Science, Physical Review Research
- 2020

Recent results from dynamical-systems and ergodic theory are adapted to efficiently and accurately calculate the entropy rate and the rate of statistical-complexity divergence of general hidden Markov processes; these in turn determine the thermodynamic operating regimes of finite-state Maxwellian demons with arbitrary numbers of states and transitions.

### Spectral Simplicity of Apparent Complexity, Part II: Exact Complexities and Complexity Spectra

- Computer Science, Chaos
- 2018

The meromorphic functional calculus developed in Part I overcomes the nondiagonalizability of linear operators, which arises often in the temporal evolution of complex systems and is generic to the metadynamics of predicting their behavior; it brings analytical tractability to investigating information processing in discrete-event stochastic processes, symbolic dynamics, and chaotic dynamical systems.

### New bounds on the entropy rate of hidden Markov processes

- Computer Science, Information Theory Workshop
- 2004

A new approach to bounding the entropy rate of {Z_t} is presented, approximating the distribution of this random variable through the construction and study of an auxiliary Markov process.

### A Mathematical Theory of Communication

- Computer Science
- 2006

It is proved that one can communicate at some positive data rate with arbitrarily small error probability, and that there is an upper bound on the data rate (the channel capacity) above which no encoding scheme achieves sufficiently small error probability.
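The two-sided statement above is the noisy-channel coding theorem; its standard textbook illustration is the binary symmetric channel with crossover probability p, whose capacity is C = 1 − H(p). A minimal sketch (general to Shannon's theory, not specific to this reference):

```python
import numpy as np

def binary_entropy(p):
    """Shannon entropy H(p) in bits of a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover prob. p.

    Reliable communication is possible at any rate below this value
    and impossible above it, per the noisy-channel coding theorem.
    """
    return 1.0 - binary_entropy(p)

# A noiseless channel carries 1 bit per use; a fully random one, 0.
assert bsc_capacity(0.0) == 1.0
assert abs(bsc_capacity(0.5)) < 1e-12
```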