Assessing coupling dynamics from an ensemble of time series

@article{GmezHerrero2015AssessingCD,
  title={Assessing coupling dynamics from an ensemble of time series},
  author={Germ{\'a}n G{\'o}mez-Herrero and Wei Wu and Kalle Rutanen and Miguel C. Soriano and Gordon Pipa and Raul Vicente},
  journal={Entropy},
  year={2015},
  volume={17},
  pages={1958-1970}
}
Finding interdependency relations between (possibly multivariate) time series provides valuable knowledge about the processes that generate the signals. Information theory sets a natural framework for non-parametric measures of several classes of statistical dependencies. However, a reliable estimation of information-theoretic functionals is hampered when the dependency to be assessed is brief or evolves in time. Here, we show that these limitations can be overcome when we have access to an…
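The core idea lends itself to a short sketch: instead of estimating a time-averaged dependency from one long recording, estimate a time-resolved dependency by pooling samples across an ensemble of repetitions at each time point, so that stationarity is only required across the ensemble. The code below is illustrative only; the function name `time_resolved_lagged_mi` is mine, and scikit-learn's k-NN mutual-information estimator stands in for the transfer-entropy estimator used in the paper.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression

def time_resolved_lagged_mi(x, y, lag=1, n_neighbors=4):
    """Estimate MI(X_{t-lag}; Y_t) at each time point by pooling across trials.
    x, y: arrays of shape (n_trials, n_time)."""
    n_trials, n_time = x.shape
    mi = np.full(n_time, np.nan)
    for t in range(lag, n_time):
        past_x = x[:, t - lag].reshape(-1, 1)   # ensemble of source samples at t-lag
        present_y = y[:, t]                     # ensemble of target samples at t
        mi[t] = mutual_info_regression(past_x, present_y,
                                       n_neighbors=n_neighbors)[0]
    return mi

# Coupling that switches on halfway through the recording:
rng = np.random.default_rng(0)
n_trials, n_time = 200, 100
x = rng.standard_normal((n_trials, n_time))
y = rng.standard_normal((n_trials, n_time))
y[:, 51:] += 0.8 * x[:, 50:-1]                  # X drives Y with lag 1 from t = 51 onwards
print(time_resolved_lagged_mi(x, y).round(2))
```

With enough trials, the estimate at each time point tracks the instantaneous coupling, so a dependency that appears halfway through the recording shows up as a step in the MI curve rather than being washed out by time averaging.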

Citations

A Recipe for the Estimation of Information Flow in a Dynamical System
TLDR
A new methodology to estimate Transfer Entropy (TE) is proposed, and a set of methods is applied as an accuracy cross-check to provide a reliable mathematical tool for any given data set.
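For orientation, the transfer entropy estimated here and in several entries below is Schreiber's measure; with $y_t^{(k)}$ and $x_t^{(l)}$ denoting length-$k$ and length-$l$ embeddings of the target and source pasts,

\[
TE_{X \to Y} \;=\; \sum p\big(y_{t+1}, y_t^{(k)}, x_t^{(l)}\big)\,
\log \frac{p\big(y_{t+1} \mid y_t^{(k)}, x_t^{(l)}\big)}{p\big(y_{t+1} \mid y_t^{(k)}\big)}.
\]

(This is the standard definition, quoted for context rather than taken from the cited work.)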
Efficient Transfer Entropy Analysis of Non-Stationary Neural Time Series
TLDR
This work combines the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series and tests the performance and robustness of the implementation on data from numerical simulations of stochastic processes.
Information-theoretic model selection for optimal prediction of stochastic dynamical systems from data.
TLDR
An information-theoretic criterion is proposed for selecting the embedding dimension of a predictively optimal data-driven model of a stochastic dynamical system; a nonparametric estimator of the negative log-predictive likelihood is developed and compared to a recently proposed criterion based on active information storage.
Parametric and Non-parametric Criteria for Causal Inference from Time-Series
TLDR
This work compares the different criteria for causal inference from time series and introduces new criteria that complete a unified picture of how the different approaches are related.
Efficient Estimation of Information Transfer
TLDR
This chapter describes step by step the efficient estimation of transfer entropy for a typical electrophysiology data set, and how the multi-trial structure of such data sets can be used to partially alleviate the problem of non-stationarity.
Measuring the Dynamics of Information Processing on a Local Scale in Time and Space
TLDR
This chapter reviews the mathematics of how to measure local entropy and mutual information values at specific observations of time-series processes, and describes how these measures can reveal much more intricate details about the dynamics of complex systems than their better-known “average” measures do.
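For context, the local quantities referred to here follow the standard pointwise definitions: the local mutual information of a single realization $(x, y)$ is

\[
i(x; y) \;=\; \log \frac{p(x, y)}{p(x)\,p(y)}, \qquad I(X;Y) \;=\; \mathbb{E}\big[\, i(x; y) \,\big],
\]

so the familiar average measure is the expectation of the local values; local entropy and local transfer entropy are defined analogously.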
Large-scale directed network inference with multivariate transfer entropy and hierarchical statistical testing
TLDR
The algorithm presented, as implemented in the IDTxl open-source software, addresses these challenges by employing hierarchical statistical tests to control the family-wise error rate and to allow for efficient parallelization, and was validated on synthetic datasets involving random networks of increasing size.
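One building block of such family-wise error control is a maximum-statistic permutation test. The sketch below is a generic illustration with my own naming, not a reproduction of the IDTxl pipeline:

```python
import numpy as np

def max_statistic_pvalues(observed, surrogates):
    """Family-wise error controlling p-values from a maximum-statistic permutation test.
    observed: shape (n_links,), one test statistic (e.g. estimated TE) per candidate link;
    surrogates: shape (n_permutations, n_links), the same statistics on permuted data."""
    max_null = surrogates.max(axis=1)                       # strongest link in each permutation
    exceed = (max_null[:, None] >= observed[None, :]).sum(axis=0)
    return (exceed + 1) / (len(max_null) + 1)

# Example: 50 candidate links, only link 0 carries a real effect.
rng = np.random.default_rng(0)
observed = rng.normal(0.0, 1.0, 50)
observed[0] = 5.0
surrogates = rng.normal(0.0, 1.0, (1000, 50))
print(max_statistic_pvalues(observed, surrogates)[:3].round(3))
```

Comparing every link against the distribution of the per-permutation maximum, rather than against its own null, is what keeps the probability of any false-positive link below the chosen level.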
JIDT: An Information-Theoretic Toolkit for Studying the Dynamics of Complex Systems
  • J. Lizier, Front. Robot. AI, 2014
TLDR
The Java Information Dynamics Toolkit (JIDT) is introduced, a Google Code project which provides a standalone, open-source (GNU GPL v3 licensed) implementation for empirical estimation of information-theoretic measures from time-series data.
Inferring time-varying brain connectivity graph based on a new method for link estimation
TLDR
An efficient framework is proposed to derive a graphical model for the statistical analysis of multivariate processes from observed time series, in a data-driven pipeline for exploring interregional brain interactions.
...

References

Showing 1-10 of 62 references
Efficient Transfer Entropy Analysis of Non-Stationary Neural Time Series
TLDR
This work combines the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series and tests the performance and robustness of the implementation on data from numerical simulations of stochastic processes.
Efficient Estimation of Information Transfer
TLDR
This chapter describes step by step the efficient estimation of transfer entropy for a typical electrophysiology data set, and how the multi-trial structure of such data sets can be used to partially alleviate the problem of non-stationarity.
Information transfer in continuous processes
Partial mutual information for coupling analysis of multivariate time series.
We propose a method to discover couplings in multivariate time series, based on partial mutual information, an information-theoretic generalization of partial correlation. It represents the part of…
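For context, the partial mutual information used there coincides with the standard conditional mutual information,

\[
I(X; Y \mid Z) \;=\; H(X, Z) + H(Y, Z) - H(Z) - H(X, Y, Z),
\]

which is zero precisely when $X$ and $Y$ are conditionally independent given $Z$.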
TRENTOOL: A Matlab open source toolbox to analyse information flow in time series data with transfer entropy
TLDR
This work presents the open-source MATLAB toolbox TRENTOOL, an implementation of transfer entropy and mutual information analysis that aims to support the user in the application of this information-theoretic measure.
Nonlinear multivariate analysis of neurophysiological signals
Measuring Information-Transfer Delays
TLDR
A robust estimator for neuronal interaction delays rooted in an information-theoretic framework is proposed, which allows a model-free exploration of interactions and shows the ability of the extended transfer entropy to detect the presence of multiple delays, as well as feedback loops.
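Roughly, the estimator scans the apparent transfer entropy as a function of an assumed source-target delay $u$ and reads off the interaction delay at its maximum; in the notation used earlier,

\[
TE_{X \to Y}(u) \;=\; I\big(Y_{t+1} ;\, X_{t+1-u}^{(l)} \,\big|\, Y_t^{(k)}\big),
\qquad \hat{u} \;=\; \arg\max_u \; TE_{X \to Y}(u).
\]

(My paraphrase of the approach; see the cited work for the exact estimator and its robustness analysis.)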
Investigating causal relations by econometric models and cross-spectral methods
There occurs on some occasions a difficulty in deciding the direction of causality between two related variables and also whether or not feedback is occurring. Testable definitions of causality and…
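Granger's criterion is parametric: in its simplest linear form, $X$ is said to cause $Y$ if adding the past of $X$ to an autoregressive model of $Y$ reduces the residual variance, e.g.

\[
F_{X \to Y} \;=\; \ln \frac{\operatorname{var}\big(\varepsilon_t^{\,Y \mid Y^-}\big)}{\operatorname{var}\big(\varepsilon_t^{\,Y \mid Y^-,\,X^-}\big)},
\]

where the residuals come from the restricted and full regressions. For jointly Gaussian processes this quantity equals twice the transfer entropy defined earlier, which is one way the parametric and non-parametric criteria in this list connect.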
Phase Space Reconstruction and Nonlinear Predictions for Stationary and Nonstationary Markovian Processes
TLDR
It is proposed that under certain conditions a scalar time series can be modeled as a finite memory Markov process in the observable, and the same concept can be extended to nonstationary stochastic processes.
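The reconstruction in question is the usual delay embedding of a scalar series $x_t$ into a state vector

\[
\mathbf{x}_t^{(d,\tau)} \;=\; \big( x_t,\, x_{t-\tau},\, \ldots,\, x_{t-(d-1)\tau} \big),
\]

with dimension $d$ and delay $\tau$ chosen so that the embedded process is (approximately) Markov in the reconstructed state.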
Estimating mutual information.
TLDR
Two classes of improved estimators for the mutual information M(X,Y), from samples of random points distributed according to some joint probability density μ(x,y), based on entropy estimates from k-nearest-neighbor distances, are presented.
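A compact transcription of the first of those estimators, written from the published formula (the function name and the tolerance used to enforce strict inequalities are my own choices):

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mutual_information(x, y, k=4):
    """Kraskov-Stoegbauer-Grassberger estimate of I(X;Y) in nats (their first algorithm).
    x, y: arrays of shape (n_samples,) or (n_samples, dim)."""
    x = x.reshape(len(x), -1)
    y = y.reshape(len(y), -1)
    n = len(x)
    joint = np.hstack([x, y])
    # Max-norm distance to the k-th neighbour in the joint space (index 0 is the point itself).
    eps = cKDTree(joint).query(joint, k=k + 1, p=np.inf)[0][:, -1]
    # Count neighbours strictly closer than eps in each marginal space, excluding the point itself.
    nx = cKDTree(x).query_ball_point(x, eps - 1e-12, p=np.inf, return_length=True) - 1
    ny = cKDTree(y).query_ball_point(y, eps - 1e-12, p=np.inf, return_length=True) - 1
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))

# Sanity check against the closed form for correlated Gaussians (rho = 0.6):
rng = np.random.default_rng(1)
x = rng.standard_normal(5000)
y = 0.6 * x + 0.8 * rng.standard_normal(5000)
print(ksg_mutual_information(x, y))         # estimate
print(-0.5 * np.log(1 - 0.6 ** 2))          # analytic value, about 0.22 nats
```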
...