Local measures of information storage in complex distributed computation
@article{Lizier2012LocalMO,
  title   = {Local measures of information storage in complex distributed computation},
  author  = {Joseph T. Lizier and Mikhail Prokopenko and Albert Y. Zomaya},
  journal = {Inf. Sci.},
  year    = {2012},
  volume  = {208},
  pages   = {39-54}
}
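The paper's central quantity, local active information storage, can be sketched as a plug-in estimate over a discrete time series. This is an illustrative reconstruction, not the authors' implementation: the function name is made up, and history length k = 1 is a simplifying assumption (the paper treats general k).

```python
# Hedged sketch: local active information storage with history length k = 1,
# a(t) = log2 [ p(x_{t+1} | x_t) / p(x_{t+1}) ], using plug-in probability
# estimates from a single discrete series.  Averaging the local values
# recovers the usual (average) active information storage.
from collections import Counter
from math import log2

def local_active_info_storage(xs):
    pairs = Counter(zip(xs[:-1], xs[1:]))  # joint counts of (x_t, x_{t+1})
    hist = Counter(xs[:-1])                # counts of histories x_t
    nxt = Counter(xs[1:])                  # counts of next values x_{t+1}
    n = len(xs) - 1
    out = []
    for x0, x1 in zip(xs[:-1], xs[1:]):
        p_cond = pairs[(x0, x1)] / hist[x0]  # p(x_{t+1} | x_t)
        p_marg = nxt[x1] / n                 # p(x_{t+1})
        out.append(log2(p_cond / p_marg))
    return out
```

For a perfectly periodic series every transition is fully predicted by its past, so each local value is positive and equals the local entropy of the next state.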
142 Citations
Local active information storage as a tool to understand distributed neural information processing
- Computer Science, Front. Neuroinform.
- 2014
It is suggested that LAIS will be a useful quantity to test theories of cortical function, such as predictive coding, when measured on a local scale in time and space in voltage-sensitive dye imaging data from area 18 of the cat.
Measuring the Dynamics of Information Processing on a Local Scale in Time and Space
- Computer Science
- 2014
This chapter reviews the mathematics of how to measure local entropy and mutual information values at specific observations of time-series processes, and describes how these measures can reveal much more intricate details about the dynamics of complex systems than their more well-known “average” measures do.
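The local (pointwise) entropy and mutual information values this chapter reviews can be sketched with plug-in estimates over paired discrete observations. The function name and the use of simple empirical frequencies are illustrative assumptions, not the chapter's own code:

```python
# Hedged sketch: local entropy h(x_t) = -log2 p(x_t) and local mutual
# information i(x_t; y_t) = log2 [ p(x_t, y_t) / (p(x_t) p(y_t)) ] at each
# specific observation of two discrete series, via empirical frequencies.
from collections import Counter
from math import log2

def local_information(xs, ys):
    n = len(xs)
    px = Counter(xs)
    py = Counter(ys)
    pxy = Counter(zip(xs, ys))
    # Averaging these local values recovers the familiar Shannon entropy
    # and mutual information; individual values can exceed the average,
    # and local MI can even be negative (a "misinformative" observation).
    h = [-log2(px[x] / n) for x in xs]
    i = [log2((pxy[(x, y)] / n) / ((px[x] / n) * (py[y] / n)))
         for x, y in zip(xs, ys)]
    return h, i
```

For two identical uniform binary series, every local MI value is 1 bit, matching the average mutual information.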
Towards a synergy-based approach to measuring information modification
- Computer Science, 2013 IEEE Symposium on Artificial Life (ALife)
- 2013
This work outlines how a recently-introduced axiomatic framework for measuring information redundancy and synergy, called partial information decomposition, can be applied to a perspective of distributed computation in order to quantify component operations on information.
Information modification and particle collisions in distributed computation.
- Computer Science, Chaos
- 2010
The separable information is introduced, a measure which locally identifies information modification events where separate inspection of the sources to a computation is misleading about its outcome.
Coherent information structure in complex computation
- Computer Science, Theory in Biosciences
- 2011
It is conjectured that coherent information structure is a defining feature of complex computation, particularly in biological systems or artificially evolved computation that solves human-understandable tasks.
Measuring information storage and transfer in swarms
- Computer Science, ECAL
- 2011
Two localized information-theoretic measures, adapted to the task of tracing information dynamics in a kinematic context, can be applied to non-trivial models and can reveal the dynamics within these models where the authors cannot rely on visual intuitions.
Characterising information-theoretic storage and transfer in continuous time processes
- Computer Science, Physical Review E
- 2018
A complete account of information processing in this setting, incorporating information storage, is given, and it is found that a convergent rate of predictive capacity does not exist, arising from divergent rates of active information storage.
Information storage and transfer in the synchronization process in locally-connected networks
- Computer Science, 2011 IEEE Symposium on Artificial Life (ALIFE)
- 2011
Information-theoretic techniques are used to view the synchronization process as a distributed computation, and to measure the information storage and transfer at each node in the system at each time step, producing novel insights, including that the computation of the synchronized state appears to be completed much more quickly than application-specific measures would indicate.
Symbolic local information transfer
- Computer Science, arXiv
- 2014
This paper proposes measures called symbolic local transfer entropies, and applies them to two test models, the coupled map lattice system and the Bak-Sneppen model, to show their relevance to spatiotemporal systems that have continuous states, and demonstrates that these measures can provide novel insight to the model.
Analyzing Information Distribution in Complex Systems
- Computer Science, Entropy
- 2017
A recent estimator of partial information decomposition is applied to characterize the dynamics of two different complex systems, finding that while redundant information attains a maximum at the critical point, synergistic information peaks in the disordered phase.
References
Showing 10 of 59 references
The local information dynamics of distributed computation in complex systems
- Computer Science
- 2012
A complete information-theoretic framework to quantify these operations on information, and in particular their dynamics in space and time, is applied to cellular automata, and delivers important insights into the fundamental nature of distributed computation and the dynamics of complex systems.
Information modification and particle collisions in distributed computation.
- Computer Science, Chaos
- 2010
The separable information is introduced, a measure which locally identifies information modification events where separate inspection of the sources to a computation is misleading about its outcome.
Coherent information structure in complex computation
- Computer Science, Theory in Biosciences
- 2011
It is conjectured that coherent information structure is a defining feature of complex computation, particularly in biological systems or artificially evolved computation that solves human-understandable tasks.
Information storage and transfer in the synchronization process in locally-connected networks
- Computer Science, 2011 IEEE Symposium on Artificial Life (ALIFE)
- 2011
Information-theoretic techniques are used to view the synchronization process as a distributed computation, and to measure the information storage and transfer at each node in the system at each time step, producing novel insights, including that the computation of the synchronized state appears to be completed much more quickly than application-specific measures would indicate.
Local information transfer as a spatiotemporal filter for complex systems.
- Physics, Physical Review E, Statistical, Nonlinear, and Soft Matter Physics
- 2008
A measure of local information transfer, derived from an existing averaged information-theoretical measure, namely, transfer entropy, is presented, providing the first quantitative evidence for the long-held conjecture that the emergent traveling coherent structures known as particles are the dominant information transfer agents in cellular automata.
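The local transfer entropy described here can be sketched for discrete series with a plug-in estimator. History length k = 1, the function name, and the use of simple empirical frequencies are assumptions for illustration, not the paper's implementation:

```python
# Hedged sketch: local transfer entropy from source y to destination x,
#   t(y -> x, t) = log2 [ p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t) ],
# with history length k = 1 and plug-in conditional probabilities.
from collections import Counter
from math import log2

def local_transfer_entropy(src, dst):
    # Count (x_{t+1}, x_t, y_t) triples and the marginal pairs they condition on.
    trip = Counter(zip(dst[1:], dst[:-1], src[:-1]))
    pair_xy = Counter(zip(dst[:-1], src[:-1]))
    pair_xx = Counter(zip(dst[1:], dst[:-1]))
    hist_x = Counter(dst[:-1])
    out = []
    for x1, x0, y0 in zip(dst[1:], dst[:-1], src[:-1]):
        p_cond_full = trip[(x1, x0, y0)] / pair_xy[(x0, y0)]  # p(x_{t+1} | x_t, y_t)
        p_cond_hist = pair_xx[(x1, x0)] / hist_x[x0]          # p(x_{t+1} | x_t)
        out.append(log2(p_cond_full / p_cond_hist))
    return out
```

When the destination is fully predictable from its own past, the source adds nothing and every local value is zero, consistent with transfer entropy measuring information beyond what is stored in the destination's history.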
Partial information decomposition as a spatiotemporal filter.
- Computer Science, Chaos
- 2011
This work utilizes the new technique of partial information decomposition to show that previous information-theoretic measures can confound distinct sources of information, and proposes a new set of filters that more cleanly separate out the background domains, particles, and collisions that are typically associated with information storage, transfer, and modification in cellular automata.
Information Dynamics in Small-World Boolean Networks
- Computer Science, Artificial Life
- 2011
An ensemble investigation of the computational capabilities of small-world networks as compared to ordered and random topologies finds that the ordered phase of the dynamics and topologies with low randomness are dominated by information storage, while the chaotic phase is dominated by information transfer.
Information storage, loop motifs, and clustered structure in complex networks.
- Computer Science, Physical Review E, Statistical, Nonlinear, and Soft Matter Physics
- 2012
Investigation of the role of two- and three-node motifs in contributing to local information storage shows that directed feedback and feedforward loop motifs are the dominant contributors to information storage capability, with their weighted motif counts locally positively correlated to storage capability.
Detecting Non-trivial Computation in Complex Dynamics
- Computer Science, ECAL
- 2007
This work quantifies the local information dynamics at each spatiotemporal point in a complex system in terms of each element of computation: information storage, transfer and modification to provide the first quantitative evidence that collisions between particles therein are the dominant information modification events.
Regularities unseen, randomness observed: levels of entropy convergence.
- Computer Science, Chaos
- 2003
Several phenomenological approaches to applying information theoretic measures of randomness and memory to stochastic and deterministic processes are synthesized by using successive derivatives of the Shannon entropy growth curve to look at the relationships between a process's entropy convergence behavior and its underlying computational structure.