Towards a synergy-based approach to measuring information modification

@article{Lizier2013TowardsAS,
  title={Towards a synergy-based approach to measuring information modification},
  author={Joseph T. Lizier and Benjamin Flecker and Paul L. Williams},
  journal={2013 IEEE Symposium on Artificial Life (ALife)},
  year={2013},
  pages={43-51}
}
Distributed computation in artificial life and complex systems is often described in terms of component operations on information: information storage, transfer, and modification. Information modification, however, remains poorly characterised: the popularly understood examples of glider and particle collisions in cellular automata have to date been identified quantitatively only via a heuristic (the separable information) rather than a proper information-theoretic measure. We outline how a recently…
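
To make the contrast concrete, consider the XOR gate, the canonical candidate for pure information modification: neither input alone tells you anything about the output, yet both together determine it completely. The sketch below is illustrative only (not the authors' code; all function names are ours) and computes the Williams-Beer partial information decomposition for this case, recovering one full bit of synergy and zero redundancy or unique information.

    # Williams-Beer PID for two binary sources and one target, on XOR.
    from collections import defaultdict
    from math import log2

    # Joint distribution p(x1, x2, y) for Y = X1 XOR X2, inputs uniform.
    joint = {(x1, x2, x1 ^ x2): 0.25 for x1 in (0, 1) for x2 in (0, 1)}

    def marginal(dist, keep):
        """Marginalise a joint {tuple: p} onto the given index positions."""
        out = defaultdict(float)
        for outcome, p in dist.items():
            out[tuple(outcome[i] for i in keep)] += p
        return out

    def mutual_info(dist, xs, ys):
        """I(X;Y), where xs and ys index variables of the joint tuples."""
        pxy = marginal(dist, xs + ys)
        px, py = marginal(dist, xs), marginal(dist, ys)
        return sum(p * log2(p / (px[k[:len(xs)]] * py[k[len(xs):]]))
                   for k, p in pxy.items() if p > 0)

    def specific_info(dist, src, y):
        """Williams-Beer specific information I(Y=y; X_src)."""
        py = marginal(dist, (2,))[(y,)]
        pxy = marginal(dist, (src, 2))
        px = marginal(dist, (src,))
        return sum((p / py) * log2(p / (px[(x,)] * py))
                   for (x, yy), p in pxy.items() if yy == y and p > 0)

    # Redundancy I_min: expected minimum specific information over sources.
    py_marg = marginal(joint, (2,))
    redundancy = sum(p * min(specific_info(joint, s, y) for s in (0, 1))
                     for (y,), p in py_marg.items())
    i1 = mutual_info(joint, (0,), (2,))
    i2 = mutual_info(joint, (1,), (2,))
    i12 = mutual_info(joint, (0, 1), (2,))
    unique1, unique2 = i1 - redundancy, i2 - redundancy
    synergy = i12 - unique1 - unique2 - redundancy
    print(f"redundancy={redundancy:.2f}  unique1={unique1:.2f}  "
          f"unique2={unique2:.2f}  synergy={synergy:.2f}")
    # XOR: redundancy=0.00, unique1=0.00, unique2=0.00, synergy=1.00
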

Citations

Quantifying Information Modification in Developing Neural Networks via Partial Information Decomposition
TLDR
It is found that information modification rose with maturation but ultimately collapsed when redundant information among neurons took over, indicating that this developing neural system initially acquired intricate processing capabilities but eventually displayed information processing that was highly similar across neurons, possibly due to a lack of external inputs.
Measuring the Dynamics of Information Processing on a Local Scale in Time and Space
TLDR
This chapter reviews the mathematics of how to measure local entropy and mutual information values at specific observations of time-series processes, and describes how these measures can reveal much more intricate details about the dynamics of complex systems than the better-known "average" measures do.
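
As a concrete illustration of such local values, the following sketch (ours, not the chapter's code) uses plug-in count estimates to compute the local mutual information i(x; y) = log2 p(x,y) / (p(x) p(y)) at each paired observation of two discrete series. These pointwise values average to the usual I(X;Y), but can be negative at individual, misinformative observations.

    from collections import Counter
    from math import log2

    def local_mi(xs, ys):
        """Local (pointwise) MI at each paired observation, plug-in estimates."""
        n = len(xs)
        pxy = Counter(zip(xs, ys))
        px, py = Counter(xs), Counter(ys)
        return [log2((pxy[(x, y)] / n) / ((px[x] / n) * (py[y] / n)))
                for x, y in zip(xs, ys)]

    xs = [0, 1, 1, 0, 1, 0, 0, 1]
    ys = [0, 1, 1, 0, 0, 0, 1, 1]   # mostly copies xs, with two mismatches
    vals = local_mi(xs, ys)
    print([round(v, 2) for v in vals])            # negative at the mismatches
    print("average =", round(sum(vals) / len(vals), 2), "bits")
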
JIDT: An Information-Theoretic Toolkit for Studying the Dynamics of Complex Systems
J. Lizier, Front. Robot. AI, 2014
TLDR
The Java Information Dynamics Toolkit (JIDT) is introduced, a Google Code project providing a standalone, open-source (GNU GPL v3 licensed) implementation for empirical estimation of information-theoretic measures from time-series data.
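
A minimal usage sketch, modelled on the toolkit's own Python demos via JPype; the jar path is an assumption, and the class and method names should be checked against the JIDT release you install:

    from jpype import JArray, JInt, JPackage, getDefaultJVMPath, startJVM

    # Point the classpath at your copy of infodynamics.jar (path is an assumption).
    startJVM(getDefaultJVMPath(), "-ea", "-Djava.class.path=infodynamics.jar")

    # Discrete transfer entropy: binary alphabet (base 2), destination history k=1.
    teCalcClass = JPackage("infodynamics.measures.discrete").TransferEntropyCalculatorDiscrete
    teCalc = teCalcClass(2, 1)
    teCalc.initialise()
    source = [0, 1, 1, 0, 1, 0, 1, 1, 0, 0]
    dest = [0] + source[:-1]   # destination copies the source one step later
    teCalc.addObservations(JArray(JInt, 1)(source), JArray(JInt, 1)(dest))
    print("TE(source -> dest) = %.3f bits" % teCalc.computeAverageLocalOfObservations())
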
Local active information storage as a tool to understand distributed neural information processing
TLDR
It is suggested that local active information storage (LAIS), measured on a local scale in time and space in voltage-sensitive dye imaging data from area 18 of the cat, will be a useful quantity for testing theories of cortical function such as predictive coding.
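
For reference, the local active information storage at time n is a(n) = log2 p(x_n | x_{n-k}, ..., x_{n-1}) / p(x_n). A minimal plug-in sketch (ours, not the paper's analysis pipeline):

    from collections import Counter
    from math import log2

    def local_ais(series, k=2):
        """Local active information storage per time step, history length k."""
        pasts = [tuple(series[i - k:i]) for i in range(k, len(series))]
        nexts = series[k:]
        n = len(nexts)
        p_joint = Counter(zip(pasts, nexts))
        p_past, p_next = Counter(pasts), Counter(nexts)
        # a(n) = log2 [ p(x_n | past) / p(x_n) ] for each step n >= k
        return [log2((p_joint[(p, x)] / p_past[p]) / (p_next[x] / n))
                for p, x in zip(pasts, nexts)]

    series = [0, 1, 0, 1, 0, 1, 1, 0, 1, 0, 1, 0]   # mostly period-2
    print([round(a, 2) for a in local_ais(series, k=2)])
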
Bits from Biology for Computational Intelligence
TLDR
This article shows how to analyze neural systems using information theory to obtain constraints that help identify the algorithms run by such systems and the information they represent.
Quantifying Synergistic Information Using Intermediate Stochastic Variables
TLDR
A measure of synergistic entropy and synergistic information is derived from first principles and used to demonstrate that synergy is associated with resilience to noise; the authors argue this may be a marked step forward in the study of multivariate information theory and its numerous applications.
Information Theoretical Approaches
Information processing features can detect behavioral regimes of dynamical systems
TLDR
An 'information processing' framework based on Shannon mutual information quantities between the initial and future states is applied to the 256 elementary cellular automata (ECA), finding that only a few features are needed for full predictability and that the 'information integration' feature is always most predictive.
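
A rough sketch of the underlying computation (rule 110, a single cell, and the plug-in estimator are illustrative choices here, not the paper's setup): estimate the mutual information between a cell's initial state and its state T steps later, across many random runs of an elementary cellular automaton.

    import random
    from collections import Counter
    from math import log2

    def eca_step(state, rule):
        """One synchronous update of an ECA with periodic boundaries."""
        n = len(state)
        return [(rule >> (state[(i - 1) % n] << 2 | state[i] << 1
                          | state[(i + 1) % n])) & 1 for i in range(n)]

    def mi_initial_future(rule=110, width=32, steps=8, runs=2000, cell=0):
        """Plug-in MI between cell's state at t=0 and at t=steps."""
        pairs = []
        for _ in range(runs):
            state = [random.randint(0, 1) for _ in range(width)]
            x0 = state[cell]
            for _ in range(steps):
                state = eca_step(state, rule)
            pairs.append((x0, state[cell]))
        n = len(pairs)
        pxy = Counter(pairs)
        px = Counter(p[0] for p in pairs)
        py = Counter(p[1] for p in pairs)
        return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
                   for (x, y), c in pxy.items())

    print("I(x_0; x_T) = %.3f bits" % mi_initial_future())
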
Synergy, redundancy, and multivariate information measures: an experimentalist’s perspective
TLDR
The usefulness of the information measures is illustrated by analyzing neural spiking data from a dissociated culture through the early stages of its development; the aim is that this work will aid other researchers as they seek the best multivariate information measure for their specific research goals and system.
…

References

Showing 1-10 of 50 references
Local measures of information storage in complex distributed computation
Information modification and particle collisions in distributed computation
TLDR
The separable information is introduced, a measure which locally identifies information modification events where separate inspection of the sources to a computation is misleading about its outcome.
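
For orientation, the separable information at a destination X sums the local active information storage with the local (apparent) transfer entropies from each of X's causal information contributors (notation approximate):

    s_X(n+1) = a_X(n+1) + \sum_{Y \in V_X \setminus X} t_{Y \to X}(n+1)

Information modification events are identified locally wherever s_X(n+1) is negative, i.e. where inspecting the sources separately misinforms about the outcome.
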
The local information dynamics of distributed computation in complex systems
TLDR
A complete information-theoretic framework to quantify the operations of information storage, transfer, and modification, and in particular their dynamics in space and time, is applied to cellular automata, delivering important insights into the fundamental nature of distributed computation and the dynamics of complex systems.
Partial information decomposition as a spatiotemporal filter
TLDR
This work utilizes the new technique of partial information decomposition to show that previous information-theoretic measures can confound distinct sources of information, and proposes a new set of filters that more cleanly separate out the background domains, particles, and collisions that are typically associated with information storage, transfer, and modification in cellular automata.
Local information transfer as a spatiotemporal filter for complex systems
TLDR
A measure of local information transfer, derived from an existing averaged information-theoretical measure, namely, transfer entropy, is presented, providing the first quantitative evidence for the long-held conjecture that the emergent traveling coherent structures known as particles are the dominant information transfer agents in cellular automata.
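
A minimal plug-in sketch of the local form (ours, not the paper's code): t(n) = log2 p(x_n | x-past, y_{n-1}) / p(x_n | x-past), with destination history length k.

    from collections import Counter
    from math import log2

    def local_te(source, dest, k=1):
        """Local transfer entropy source -> dest at each step, plug-in estimates."""
        triples = [(tuple(dest[i - k:i]), source[i - 1], dest[i])
                   for i in range(k, len(dest))]
        c_full = Counter(triples)
        c_cond = Counter((p, y) for p, y, _ in triples)
        c_pair = Counter((p, x) for p, _, x in triples)
        c_past = Counter(p for p, _, _ in triples)
        # t(n) = log2 [ p(x_n | past, y) / p(x_n | past) ]
        return [log2((c_full[(p, y, x)] / c_cond[(p, y)])
                     / (c_pair[(p, x)] / c_past[p]))
                for p, y, x in triples]

    src = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0]
    dst = [0] + src[:-1]   # dest is src delayed by one step
    print([round(t, 2) for t in local_te(src, dst, k=1)])
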
Detecting Non-trivial Computation in Complex Dynamics
TLDR
This work quantifies the local information dynamics at each spatiotemporal point in a complex system in terms of each element of computation: information storage, transfer, and modification, providing the first quantitative evidence that collisions between particles therein are the dominant information modification events.
A Bivariate Measure of Redundant Information
TLDR
A new formalism for redundant information is introduced and proved to satisfy all of the necessary properties outlined in earlier work, as well as an additional criterion proposed here as necessary to capture redundancy.
Towards a New Information Processing Measure for Neural Computation
TLDR
A general (i.e., implementation-independent) information processing measure is proposed that takes into account fundamental concepts in brain theory and neural computation; it is based on measuring the transformations required to go from the original alphabet in which sensory messages are represented to the objective alphabet, which depends on the implicit task(s) imposed by the environment-system relation.
Information Dynamics of Evolved Agents
TLDR
A novel information-theoretic approach for analyzing the dynamics of information flow in embodied systems is formulated and applied to analyze a previously evolved model of relational categorization.
Differentiating information transfer and causal effect
TLDR
It is shown that causal information flow is a primary tool to describe the causal structure of a system, while information transfer can then be used to describe the emergent computation on that causal structure.
…