JIDT: An Information-Theoretic Toolkit for Studying the Dynamics of Complex Systems
@article{Lizier2014JIDTAI,
  title   = {JIDT: An Information-Theoretic Toolkit for Studying the Dynamics of Complex Systems},
  author  = {Joseph T. Lizier},
  journal = {Frontiers in Robotics and AI},
  year    = {2014},
  volume  = {1},
  pages   = {11}
}
Complex systems are increasingly being viewed as distributed information processing systems, particularly in the domains of computational neuroscience, bioinformatics and Artificial Life. This trend has resulted in a strong uptake in the use of (Shannon) information-theoretic measures to analyse the dynamics of complex systems in these fields. We introduce the Java Information Dynamics Toolkit (JIDT): a Google Code project which provides a standalone, open-source (GNU GPL v3 licensed) code…
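As a quick illustration of how the toolkit is used in practice, the sketch below estimates transfer entropy between two coupled binary time series. The class and method names follow the discrete-calculator examples shipped with the JIDT distribution's infodynamics package, but this is a minimal sketch under that assumption; exact signatures should be checked against the distribution's javadocs.

    import infodynamics.measures.discrete.TransferEntropyCalculatorDiscrete;

    public class TeDemo {
        public static void main(String[] args) throws Exception {
            // Two coupled binary series: dest copies source with a one-step lag,
            // so TE(source -> dest) should approach 1 bit for long series.
            int[] source = {0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0};
            int[] dest   = {0, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1};

            // Binary alphabet (base 2), destination history length k = 1.
            TransferEntropyCalculatorDiscrete teCalc =
                    new TransferEntropyCalculatorDiscrete(2, 1);
            teCalc.initialise();
            teCalc.addObservations(source, dest);

            // Average transfer entropy (in bits) over the supplied observations.
            System.out.printf("TE(source -> dest) = %.3f bits%n",
                    teCalc.computeAverageLocalOfObservations());
        }
    }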
261 Citations
Inform: A toolkit for information-theoretic analysis of complex systems
- Computer Science, 2017 IEEE Symposium Series on Computational Intelligence (SSCI)
- 2017
Evidence is presented suggesting that Inform's computational performance is at least comparable to that of the Java Information Dynamics Toolkit (JIDT), which is taken to be the gold standard in the field.
Measuring the Dynamics of Information Processing on a Local Scale in Time and Space
- Computer Science
- 2014
This chapter reviews the mathematics of how to measure local entropy and mutual information values at specific observations of time-series processes, and describes how these measures can reveal much more intricate details about the dynamics of complex systems than their more well-known “average” measures do.
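Concretely, the local (pointwise) values reviewed there are the log-ratio terms whose average gives the familiar summary measures; for mutual information, the local value at a single observation (x, y) is i(x; y) = log2[ p(x, y) / (p(x) p(y)) ], which is negative exactly when the joint outcome is misinformative. A minimal plug-in sketch in Java (class and method names are illustrative, not taken from the chapter):

    public final class LocalMi {
        // Plug-in estimate of the local mutual information values
        // i(x;y) = log2( p(x,y) / (p(x) p(y)) ) for paired discrete series
        // whose symbols lie in {0, ..., base-1}.
        public static double[] localMutualInfo(int[] x, int[] y, int base) {
            int n = x.length;
            double[][] pxy = new double[base][base];
            double[] px = new double[base];
            double[] py = new double[base];
            for (int t = 0; t < n; t++) {
                pxy[x[t]][y[t]] += 1.0 / n;
                px[x[t]] += 1.0 / n;
                py[y[t]] += 1.0 / n;
            }
            double[] local = new double[n];
            for (int t = 0; t < n; t++) {
                // Negative values flag misinformative observations.
                local[t] = Math.log(pxy[x[t]][y[t]] / (px[x[t]] * py[y[t]]))
                        / Math.log(2.0);
            }
            return local; // averaging these recovers the usual (average) MI
        }
    }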
Information-theoretic analysis of the directional influence between cellular processes
- Biology, PLoS ONE
- 2017
This work re-analyzes the experimental data of Kiviet et al. (2014), in which fluctuations in the gene expression of metabolic enzymes and in growth rate were measured in single E. coli cells; it confirms the previously detected modes of coupling between growth and gene expression while placing more stringent conditions on the structure of the noise sources.
Inform: Efficient Information-Theoretic Analysis of Collective Behaviors
- Computer Science, Frontiers in Robotics and AI
- 2018
The architecture of the Inform framework is described, its computational efficiency is studied and it is used to analyze three different case studies of collective behavior: biochemical information storage in regenerating planaria, nest-site selection in the ant Temnothorax rugatulus, and collective decision making in multi-agent simulations.
IDTxl: The Information Dynamics Toolkit xl: a Python package for the efficient analysis of multivariate information dynamics in networks
- Computer Science, Journal of Open Source Software
- 2019
The Information Dynamics Toolkit xl (IDTxl) is a comprehensive software package for efficient inference of networks and their node dynamics from multivariate time series data using information…
Information-theoretic model selection for optimal prediction of stochastic dynamical systems from data.
- Computer Science, Physical Review E
- 2018
An information-theoretic criterion is proposed for selecting the embedding dimension of a predictively optimal data-driven model of a stochastic dynamical system; a nonparametric estimator of the negative log-predictive likelihood is developed and compared against a recently proposed criterion based on active information storage.
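For orientation, the active information storage used in that comparison is the mutual information between a process's length-k past and its next state; in the notation of the information-dynamics literature (not necessarily the paper's own):

    A_X(k) = I\left(X_n^{(k)};\, X_{n+1}\right), \qquad X_n^{(k)} = (X_{n-k+1}, \ldots, X_n)

Selecting the embedding dimension then amounts to choosing the history length k.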
Inferring Coupling of Distributed Dynamical Systems via Transfer Entropy
- Computer Science, arXiv
- 2016
This work represents the model as a directed acyclic graph characterising the unidirectional coupling between subsystems, supporting the previously held conjecture that transfer entropy can be used to infer effective connectivity in complex networks.
Measuring Information-Transfer Delays
- Computer Science, PLoS ONE
- 2013
A robust estimator of neuronal interaction delays, rooted in an information-theoretic framework, is proposed; it allows a model-free exploration of interactions, and the extended transfer entropy is shown to detect the presence of multiple delays as well as feedback loops.
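Schematically, the estimator scans a delay-parametrised transfer entropy over candidate source-target delays u and takes the maximiser as the interaction delay; the notation below is adapted from the information-dynamics literature rather than copied from the paper:

    TE(X \to Y, u) = I\left(Y_t;\, X_{t-u}^{(l)} \,\middle|\, Y_{t-1}^{(k)}\right), \qquad \hat{u} = \arg\max_u\, TE(X \to Y, u)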
Bits from Biology for Computational Intelligence
- Computer Science, arXiv
- 2014
This article shows how to analyze neural systems using information theory to obtain constraints that help identify the algorithms run by such systems and the information they represent.
Decomposing past and future: Integrated information decomposition based on shared probability mass exclusions
- Computer Science
- 2022
This work proposes a novel information-theoretic measure of temporal dependency (Iτsx) based on informative and misinformative local probability mass exclusions and applies the decomposition to spontaneous spiking activity recorded from dissociated neural cultures of rat cerebral cortex to show how different modes of information processing are distributed over the system.
References
Showing 1-10 of 252 references
Measuring the Dynamics of Information Processing on a Local Scale in Time and Space
- Computer Science
- 2014
This chapter reviews the mathematics of how to measure local entropy and mutual information values at specific observations of time-series processes, and describes how these measures can reveal much more intricate details about the dynamics of complex systems than their more well-known “average” measures do.
The local information dynamics of distributed computation in complex systems
- Computer Science
- 2012
A complete information-theoretic framework to quantify the operations of distributed computation (information storage, transfer, and modification), and in particular their dynamics in space and time, is applied to cellular automata, delivering important insights into the fundamental nature of distributed computation and the dynamics of complex systems.
On active information storage in input-driven systems
- Computer Science, arXiv
- 2013
Using the proposed input-corrected information storage, the aim is to better quantify system behaviour. This will be important for heavily input-driven systems such as artificial neural networks, where it helps to abstract from specific benchmarks, and for brain networks, where intervention is difficult and individual components cannot be tested in isolation or with arbitrary input data.
Conditional Entropy-Based Evaluation of Information Dynamics in Physiological Systems
- Computer Science
- 2014
The framework is illustrated on numerical examples showing its capability to deal with the curse of dimensionality in the multivariate computation of conditional entropy (CondEn), and to reliably estimate SE, CE, and TE under the challenging conditions of biomedical time series analysis, which feature noise and small sample sizes.
Towards a synergy-based approach to measuring information modification
- Computer Science, 2013 IEEE Symposium on Artificial Life (ALife)
- 2013
This work outlines how a recently-introduced axiomatic framework for measuring information redundancy and synergy, called partial information decomposition, can be applied to a perspective of distributed computation in order to quantify component operations on information.
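In the bivariate case, the decomposition splits the joint mutual information two sources S_1, S_2 carry about a target T into redundant, unique, and synergistic atoms, with information modification associated with the synergistic term:

    I(S_1, S_2;\, T) = R(S_1, S_2;\, T) + U(S_1;\, T) + U(S_2;\, T) + \mathrm{Syn}(S_1, S_2;\, T)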
Measuring Information-Transfer Delays
- Computer Science, PLoS ONE
- 2013
A robust estimator of neuronal interaction delays, rooted in an information-theoretic framework, is proposed; it allows a model-free exploration of interactions, and the extended transfer entropy is shown to detect the presence of multiple delays as well as feedback loops.
TRENTOOL: A Matlab open source toolbox to analyse information flow in time series data with transfer entropy
- Computer Science, BMC Neuroscience
- 2011
This work presents the open-source MATLAB toolbox TRENTOOL, an implementation of transfer entropy and mutual information analysis that aims to support the user in the application of this information theoretic measure.
Information Dynamics in Small-World Boolean Networks
- Computer Science, Artificial Life
- 2011
An ensemble investigation of the computational capabilities of small-world networks, as compared with ordered and random topologies, finds that the ordered phase of the dynamics and topologies with low randomness are dominated by information storage, while the chaotic phase is dominated by information transfer.
Assessing coupling dynamics from an ensemble of time series
- Computer Science, Entropy
- 2015
This work gears a data-efficient estimator of probability densities to exploit the full structure of trial-based measures, obtaining time-resolved estimates for a family of entropy combinations (including mutual information, transfer entropy, and their conditional counterparts) that are more accurate than the simple average of individual estimates over trials.
Symbolic local information transfer
- Computer Science, arXiv
- 2014
This paper proposes measures called symbolic local transfer entropies and applies them to two test models, the coupled map lattice system and the Bak-Sneppen model, showing their relevance to spatiotemporal systems with continuous states and demonstrating that the measures can provide novel insights into these models.
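The symbolic variant rests on replacing each length-m window of a continuous series with its ordinal (rank) pattern before computing local transfer entropy on the resulting discrete alphabet. A minimal symbolisation sketch in Java (class and method names are illustrative, and the paper's own encoding details may differ):

    import java.util.Arrays;

    public final class OrdinalSymbols {
        // Map each length-m window of x (assumes x.length >= m) to an integer
        // encoding its ordinal (rank) pattern, giving a discrete series over
        // an alphabet of size m!.
        public static int[] symbolize(double[] x, int m) {
            int[] symbols = new int[x.length - m + 1];
            for (int t = 0; t < symbols.length; t++) {
                // Sort the window's indices by value to obtain the rank pattern.
                Integer[] idx = new Integer[m];
                for (int i = 0; i < m; i++) idx[i] = i;
                final int start = t;
                Arrays.sort(idx, (a, b) -> Double.compare(x[start + a], x[start + b]));
                // Encode the permutation as an integer via its Lehmer code.
                int code = 0;
                for (int i = 0; i < m; i++) {
                    int smaller = 0;
                    for (int j = i + 1; j < m; j++) if (idx[j] < idx[i]) smaller++;
                    code = code * (m - i) + smaller;
                }
                symbols[t] = code; // in [0, m! - 1]
            }
            return symbols;
        }
    }

The resulting integer series can then be fed to any discrete local transfer entropy estimator.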