Local information transfer as a spatiotemporal filter for complex systems.

@article{Lizier2008LocalIT,
  title={Local information transfer as a spatiotemporal filter for complex systems},
  author={Joseph T. Lizier and Mikhail Prokopenko and Albert Y. Zomaya},
  journal={Physical Review E, Statistical, Nonlinear, and Soft Matter Physics},
  year={2008},
  volume={77},
  number={2},
  pages={026110}
}
We present a measure of local information transfer, derived from an existing averaged information-theoretical measure, namely, transfer entropy. Local transfer entropy is used to produce profiles of the information transfer into each spatiotemporal point in a complex system. These spatiotemporal profiles are useful not only as an analytical tool, but also allow explicit investigation of different parameter settings and forms of the transfer entropy metric itself. As an example, local transfer… 
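The abstract above describes local transfer entropy as a pointwise decomposition of Schreiber's transfer entropy, evaluated at each spatiotemporal point. As a rough illustration of the idea (not the authors' code), a minimal plug-in estimator for two discrete time series could look like the sketch below; the function name, the base-2 logarithm, and the simple counting estimator are assumptions made for this sketch.

```python
import numpy as np
from collections import Counter

def local_transfer_entropy(source, target, k=1):
    """Local transfer entropy (in bits) from `source` into `target` at each
    time step, for two equal-length, discrete-valued time series.

    At step n the local value is
        t(n) = log2[ p(x_n | x_{n-k..n-1}, y_{n-1}) / p(x_n | x_{n-k..n-1}) ],
    with plug-in probabilities estimated by counting over the series itself.
    """
    source, target = np.asarray(source), np.asarray(target)
    n = len(target)

    # Count joint and marginal configurations over all usable time steps.
    c_px, c_p = Counter(), Counter()    # (past, next) and (past,)
    c_pyx, c_py = Counter(), Counter()  # (past, source, next) and (past, source)
    samples = []
    for t in range(k, n):
        past, y, nxt = tuple(target[t - k:t]), source[t - 1], target[t]
        c_px[(past, nxt)] += 1
        c_p[past] += 1
        c_pyx[(past, y, nxt)] += 1
        c_py[(past, y)] += 1
        samples.append((t, past, y, nxt))

    # Local TE is the log-ratio of the two conditional probabilities per step.
    lte = np.zeros(n)
    for t, past, y, nxt in samples:
        p_with_source = c_pyx[(past, y, nxt)] / c_py[(past, y)]
        p_without = c_px[(past, nxt)] / c_p[past]
        lte[t] = np.log2(p_with_source / p_without)
    return lte
```

Averaging the local values over the usable time steps recovers the ordinary plug-in estimate of the average transfer entropy; applying the same calculation to every cell of a cellular automaton, with a neighbouring cell as the source, yields the kind of spatiotemporal profile discussed in the abstract.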

Spatiotemporal dynamics driven by the maximization of local information transfer
TLDR
It is found that, under a certain condition of limited memory, even if each cell is driven to maximize the LTE, the system as a whole cannot reach its theoretical maximum value, owing to an intrinsic property by which the system dynamically bounds its own limit.
Symbolic local information transfer
TLDR
This paper proposes measures called symbolic local transfer entropies and applies them to two test models, the coupled map lattice system and the Bak-Sneppen model, to show their relevance to spatiotemporal systems that have continuous states and to demonstrate that these measures can provide novel insight into such models.
Measuring the Dynamics of Information Processing on a Local Scale in Time and Space
TLDR
This chapter reviews the mathematics of how to measure local entropy and mutual information values at specific observations of time-series processes, and describes how these measures can reveal much more intricate details about the dynamics of complex systems than their more well-known "average" measures do (the local forms are sketched in the block after this citation list).
Analyzing Information Distribution in Complex Systems
TLDR
A recent estimator of partial information decomposition is applied to characterize the dynamics of two different complex systems; it is found that while redundant information attains a maximum at the critical point, synergistic information peaks in the disordered phase.
Adaptive Local Information Transfer in Random Boolean Networks
TLDR
This work proposes an adaptive random Boolean network model in which each unit rewires its incoming arcs from other units to balance stability of its information processing based on the measurement of the local information transfer pattern.
Information storage and transfer in the synchronization process in locally-connected networks
TLDR
Information-theoretical techniques are used to view the synchronization process as a distributed computation, and to measure the information storage and transfer at each node in the system at each time step, producing novel insights, including that the computation of the synchronized state appears to be completed much more quickly than application-specific measures would indicate.
Coherent information structure in complex computation
TLDR
It is conjectured that coherent information structure is a defining feature of complex computation, particularly in biological systems or artificially evolved computation that solves human-understandable tasks.
Measuring information storage and transfer in swarms
TLDR
Two localized information-theoretic measures, adapted to the task of tracing the information dynamics in a kinematic context, can be applied to non-trivial models and can tell us about the dynamics within these models where the authors cannot rely on visual intuitions.
Differentiating information transfer and causal effect
TLDR
It is shown that causal information flow is a primary tool to describe the causal structure of a system, while information transfer can then be used to describe the emergent computation on that causal structure.
Specific transfer entropy and other state-dependent transfer entropies for continuous-state input-output systems.
TLDR
Specific transfer entropy and the proposed estimation procedures, demonstrated here on two model systems, are found to provide more, and more easily interpretable, information about an input-output system than currently existing methods.
...
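Regarding the "Measuring the Dynamics of Information Processing on a Local Scale in Time and Space" entry above, the local (pointwise) quantities referred to there are the standard log-forms whose expectation values give the familiar averaged measures; a brief sketch in base-2 notation:

```latex
% Local (pointwise) entropy of a single observation x_n:
h(x_n) = -\log_2 p(x_n)

% Local (pointwise) mutual information of a joint observation (x_n, y_n):
i(x_n ; y_n) = \log_2 \frac{p(x_n, y_n)}{p(x_n)\, p(y_n)}

% Averaging over observations recovers the usual averaged quantities:
% H(X) = \langle h(x_n) \rangle  and  I(X;Y) = \langle i(x_n ; y_n) \rangle.
```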

References

SHOWING 1-10 OF 84 REFERENCES
Automatic filters for the detection of coherent structure in spatiotemporal systems.
TLDR
Two approaches to automatically filtering the changing configurations of spatial dynamical systems and extracting coherent structures are presented and compared: a modification of the local Lyapunov exponent approach suited to cellular automata and other discrete spatial systems, and a more traditional approach based on an order parameter and free energy.
Quantifying self-organization with optimal predictors.
TLDR
For spatially extended dynamical systems, this Letter proposes a new criterion, namely an internally generated increase in the statistical complexity, the amount of information required for optimal prediction of the system's dynamics.
Regularities unseen, randomness observed: levels of entropy convergence.
TLDR
Several phenomenological approaches to applying information-theoretic measures of randomness and memory to stochastic and deterministic processes are synthesized by using successive derivatives of the Shannon entropy growth curve to look at the relationships between a process's entropy convergence behavior and its underlying computational structure.
The attractor-basin portrait of a cellular automaton
TLDR
The attractor-basin portrait of nonlinear elementary CA rule 18, whose global dynamics are largely determined by a single regular attracting domain, is described, and it is confirmed that in rule 18 isolated dislocation trajectories, as well as a dislocation gas, agree extremely well with the classical model of annihilating diffusive particles.
Measuring information transfer
  • Schreiber, Physical Review Letters, 2000
TLDR
An information-theoretic measure is derived that quantifies the statistical coherence between systems evolving in time and is able to distinguish effectively driving and responding elements and to detect asymmetry in the interaction of subsystems (the standard form of this measure is reproduced after this reference list).
Information transfer between solitary waves in the saturable Schrödinger equation
In this paper we study the transfer of information between colliding solitary waves. By this we mean the following: The state of a solitary wave is a set of parameters, such as amplitude, width, …
Mapping Information Flow in Sensorimotor Networks
TLDR
The results suggest a fundamental link between physical embeddedness and information, highlighting the effects of embodied interactions on internal (neural) information processing, and illuminating the role of various system components in the generation of behavior.
...
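Regarding the Schreiber reference above ("Measuring information transfer"): the measure it derives is the transfer entropy, which in its standard form (here with a length-k target history, a single source observation, and base-2 logarithms) reads:

```latex
T_{Y \to X} \;=\; \sum_{x_{n+1},\, x_n^{(k)},\, y_n}
    p\bigl(x_{n+1}, x_n^{(k)}, y_n\bigr)\,
    \log_2 \frac{p\bigl(x_{n+1} \mid x_n^{(k)}, y_n\bigr)}
                {p\bigl(x_{n+1} \mid x_n^{(k)}\bigr)}
```

The local transfer entropy of the main paper is precisely the log-ratio inside this sum, evaluated at each specific observation, so that the time average of the local values recovers T_{Y \to X}.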