Measuring the Dynamics of Information Processing on a Local Scale in Time and Space

@inproceedings{Lizier2014MeasuringTD,
  title={Measuring the Dynamics of Information Processing on a Local Scale in Time and Space},
  author={Joseph T. Lizier},
  year={2014}
}
Studies of how information is processed in natural systems, in particular in nervous systems, are rapidly gaining attention. Less well known, however, is that the local dynamics of such information processing in space and time can be measured. In this chapter, we review the mathematics of how to measure local entropy and mutual information values at specific observations of time-series processes. We then review how these techniques are used to construct measures of local information storage and…
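To make the abstract's subject concrete, here is a minimal sketch (not the chapter's own code) of local, pointwise information values for discrete time series, using simple plug-in frequency estimates; the variable names and the noisy-copy example are illustrative assumptions only.

```python
# Minimal sketch, assuming plug-in (frequency) estimates for discrete
# time series; not the chapter's own code.
import numpy as np
from collections import Counter

def local_entropy(x):
    """Local entropy h(x_n) = -log2 p(x_n), one value per observation."""
    n = len(x)
    counts = Counter(x)
    return np.array([-np.log2(counts[v] / n) for v in x])

def local_mutual_information(x, y):
    """Local MI i(x_n; y_n) = log2[ p(x_n, y_n) / (p(x_n) p(y_n)) ]."""
    n = len(x)
    px, py = Counter(x), Counter(y)
    pxy = Counter(zip(x, y))
    return np.array([
        np.log2((pxy[(a, b)] / n) / ((px[a] / n) * (py[b] / n)))
        for a, b in zip(x, y)
    ])

# Illustrative example: y is a noisy copy of x.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 1000)
y = np.where(rng.random(1000) < 0.9, x, 1 - x)
local = local_mutual_information(x, y)
print(local[:5])      # positive at informative, negative at misinformative observations
print(local.mean())   # the mean recovers the usual (average) mutual information
```

Each returned array holds one value per observation; averaging the local values recovers the usual average entropy or mutual information.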

Citations

JIDT: An Information-Theoretic Toolkit for Studying the Dynamics of Complex Systems
  • J. Lizier
  • Computer Science
    Front. Robot. AI
  • 2014
TLDR
The Java Information Dynamics Toolkit (JIDT) is introduced, a Google Code project that provides a standalone, open-source (GNU GPL v3-licensed) implementation for empirical estimation of information-theoretic measures from time-series data.
Transfer Entropy and Transient Limits of Computation
TLDR
This work generalises Landauer's limit to bi-directional information dynamics for non-equilibrium processes, revealing that the limit applies to prediction, in addition to retrodiction (information erasure), and establishes transfer entropy as a central measure connecting information theory, thermodynamics and theory of computation.
Early lock-in of structured and specialised information flows during neural development
TLDR
This work characterises the flow of information during the crucial periods of population bursts and finds that, during these bursts, nodes tend to undertake specialised computational roles as transmitters, mediators, or receivers of information, with these roles aligning with their average spike ordering.
Efficient Transfer Entropy Analysis of Non-Stationary Neural Time Series
TLDR
This work combines the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series and tests the performance and robustness of the implementation on data from numerical simulations of stochastic processes.
Bits from Biology for Computational Intelligence
TLDR
This article shows how to analyze neural systems using information theory to obtain constraints that help identify the algorithms run by such systems and the information they represent.
Early lock-in of structured and specialised information flows during neural development
TLDR
This work represents the first study to compare information flows in the intrinsic dynamics across development time, and makes use of a recently proposed continuous-time transfer entropy estimator for spike trains, which is able to capture important effects occurring on both small and large timescales simultaneously.
Information Flow in a Model of Policy Diffusion: An Analytical Study
TLDR
This paper presents an analytical study of the information flow in a network model of policy diffusion, thereby establishing closed-form expressions for the transfer entropy between any pair of nodes, and offers compelling evidence for the potential of transfer entropy to assist in the process of network reconstruction.
Informative and misinformative interactions in a school of fish
TLDR
This analysis reveals peaks in information flows during collective U-turns and identifies two different flows: an informative flow from fish that have already turned to fish that are turning, and a misinformative flow from fish that have not yet turned, which is related to relative position and alignment between fish.
Transitions in brain-network level information processing dynamics are driven by alterations in neural gain
TLDR
The results suggest that the modulation of neural gain via the ascending arousal system may fundamentally alter the information processing mode of the brain, which in turn has important implications for understanding the biophysical basis of cognition.
Transitions in information processing dynamics at the whole-brain network level are driven by alterations in neural gain
TLDR
This study demonstrates that the gain-mediated phase transition is characterized by a shift in the underlying dynamics of neural information processing, and suggests that the ascending arousal system constrains low-dimensional modes of information processing within the brain.
...

References

SHOWING 1-10 OF 111 REFERENCES
The local information dynamics of distributed computation in complex systems
TLDR
A complete information-theoretic framework for quantifying the component operations on information (storage, transfer, and modification), and in particular their dynamics in space and time, is applied to cellular automata, delivering important insights into the fundamental nature of distributed computation and the dynamics of complex systems.
Local measures of information storage in complex distributed computation
JIDT: An Information-Theoretic Toolkit for Studying the Dynamics of Complex Systems
  • J. Lizier
  • Computer Science
    Front. Robot. AI
  • 2014
TLDR
The Java Information Dynamics Toolkit (JIDT) is introduced, a Google Code project that provides a standalone, open-source (GNU GPL v3-licensed) implementation for empirical estimation of information-theoretic measures from time-series data.
Local information transfer as a spatiotemporal filter for complex systems.
TLDR
A measure of local information transfer, derived from an existing averaged information-theoretical measure, namely, transfer entropy, is presented, providing the first quantitative evidence for the long-held conjecture that the emergent traveling coherent structures known as particles are the dominant information transfer agents in cellular automata.
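A minimal plug-in sketch of the local transfer entropy this entry describes, for discrete series with destination history length k; this is an illustrative assumption, not JIDT's or the paper's implementation.

```python
# Minimal sketch of local transfer entropy with plug-in probabilities;
# an illustrative assumption, not JIDT's or the paper's implementation.
#   t_{Y->X}(n+1) = log2[ p(x_{n+1} | x_n^(k), y_n) / p(x_{n+1} | x_n^(k)) ]
import numpy as np
from collections import Counter

def local_transfer_entropy(source, dest, k=1):
    n = len(dest)
    past = [tuple(dest[t - k:t]) for t in range(k, n)]  # destination history x_n^(k)
    src = list(source[k - 1:n - 1])                     # source value y_n (lag 1)
    nxt = list(dest[k:])                                # next destination value x_{n+1}
    trip = Counter(zip(past, src, nxt))
    pair = Counter(zip(past, nxt))
    cond = Counter(zip(past, src))
    marg = Counter(past)
    local = []
    for p, s, x in zip(past, src, nxt):
        num = trip[(p, s, x)] / cond[(p, s)]  # p(next | past, source)
        den = pair[(p, x)] / marg[p]          # p(next | past)
        local.append(np.log2(num / den))
    return np.array(local)  # one value per time step; the mean approximates TE
```

Applied to every cell of a cellular automaton and plotted as a spacetime diagram, such local values act as the filter the entry describes, highlighting gliders (particles) against the domain background.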
Towards a synergy-based approach to measuring information modification
TLDR
This work outlines how a recently-introduced axiomatic framework for measuring information redundancy and synergy, called partial information decomposition, can be applied to a perspective of distributed computation in order to quantify component operations on information.
Information storage and transfer in the synchronization process in locally-connected networks
TLDR
Information-theoretical techniques are used to view the synchronization process as a distributed computation, and to measure the information storage and transfer at each node in the system at each time step, producing novel insights, including that the computation of the synchronized state appears to be completed much more quickly than application-specific measures would indicate.
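For the storage side of this node-by-node, step-by-step analysis, a minimal plug-in sketch of local active information storage (an illustrative assumption, not the paper's code) follows:

```python
# Minimal sketch of local active information storage with plug-in
# probabilities; illustrative only, not the paper's code.
#   a_X(n+1) = log2[ p(x_n^(k), x_{n+1}) / ( p(x_n^(k)) p(x_{n+1}) ) ]
import numpy as np
from collections import Counter

def local_active_info_storage(x, k=2):
    past = [tuple(x[t - k:t]) for t in range(k, len(x))]  # history x_n^(k)
    nxt = list(x[k:])                                     # next value x_{n+1}
    m = len(past)
    joint = Counter(zip(past, nxt))
    p_past, p_next = Counter(past), Counter(nxt)
    return np.array([
        np.log2((joint[(p, v)] / m) / ((p_past[p] / m) * (p_next[v] / m)))
        for p, v in zip(past, nxt)
    ])
```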
Information modification and particle collisions in distributed computation.
TLDR
The separable information is introduced, a measure which locally identifies information modification events where separate inspection of the sources to a computation is misleading about its outcome.
Automatic filters for the detection of coherent structure in spatiotemporal systems.
TLDR
Two approaches to automatically filtering the changing configurations of spatial dynamical systems and extracting coherent structures are presented and compared: a modification of the local Lyapunov exponent approach suited to cellular automata and other discrete spatial systems, and a more traditional approach based on an order parameter and free energy.
Assessing coupling dynamics from an ensemble of time series
TLDR
This work gears a data-efficient estimator of probability densities to make use of the full structure of trial-based measures, obtaining time-resolved estimates for a family of entropy combinations (including mutual information, transfer entropy, and their conditional counterparts) that are more accurate than the simple average of individual estimates over trials.
Coherent information structure in complex computation
TLDR
It is conjectured that coherent information structure is a defining feature of complex computation, particularly in biological systems or artificially evolved computation that solves human-understandable tasks.
...