Probing High-Order Dependencies With Information Theory

@article{GraneroBelinchn2019ProbingHD,
  title={Probing High-Order Dependencies With Information Theory},
  author={Carlos Granero-Belinch{\'o}n and St{\'e}phane G. Roux and Patrice Abry and Nicolas B. Garnier},
  journal={IEEE Transactions on Signal Processing},
  year={2019},
  volume={67},
  pages={3796-3805}
}
Information-theoretic measures (entropies, entropy rates, mutual information) are nowadays commonly used in statistical signal processing for real-world data analysis. This paper proposes the use of auto mutual information (mutual information between subsets of the same signal) and entropy rate as powerful tools to assess refined dependencies of any order in signal temporal dynamics. Notably, it is shown how two-point auto mutual information and entropy rate unveil information conveyed by…
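As a concrete illustration of the two-point auto mutual information mentioned in the abstract, the sketch below estimates I(x_t; x_{t+τ}) with a simple histogram (plug-in) estimator; the AR(1) test signal, the delay values, and the bin count are illustrative assumptions, not the estimator used in the paper.

```python
import numpy as np

def auto_mutual_information(x, tau, n_bins=16):
    """Plug-in (histogram) estimate of I(x_t ; x_{t+tau}) in nats.

    Minimal sketch of two-point auto mutual information: the mutual
    information between a signal and a delayed copy of itself.
    """
    x = np.asarray(x, dtype=float)
    a, b = x[:-tau], x[tau:]                      # x_t and x_{t+tau}
    joint, _, _ = np.histogram2d(a, b, bins=n_bins)
    p_ab = joint / joint.sum()                    # joint distribution
    p_a = p_ab.sum(axis=1, keepdims=True)         # marginal of x_t
    p_b = p_ab.sum(axis=0, keepdims=True)         # marginal of x_{t+tau}
    mask = p_ab > 0
    return float(np.sum(p_ab[mask] * np.log(p_ab[mask] / (p_a @ p_b)[mask])))

# Example: an AR(1) process has clearly non-zero auto MI at small delays.
rng = np.random.default_rng(0)
x = np.zeros(10_000)
for t in range(1, x.size):
    x[t] = 0.9 * x[t - 1] + rng.standard_normal()
print(auto_mutual_information(x, tau=1), auto_mutual_information(x, tau=100))
```

For signals whose temporal dependencies decay with the lag, this quantity decreases toward zero as the delay τ grows (up to the small positive bias of the histogram estimator).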
Citations

Information Theory for Non-Stationary Processes with Stationary Increments
The stationarity (independence of the integration time) of the ersatz entropy rate is established, and this quantity is shown not only to finely probe the self-similarity of the process but also to offer a new way to quantify its multifractality.
Multi-Meteorological-Factor-Based Graph Modeling for Photovoltaic Power Forecasting
Testing results suggest that the proposed multi-graph model outperforms other benchmark models in accuracy for day-ahead forecasting and reduces training time compared to deep-learning benchmark models.
Quantifying Non-Stationarity with Information Theory
We introduce an index based on information theory to quantify the stationarity of a stochastic process. The index compares, on the one hand, the information contained in the increment at the time scale…
Distance to Healthy Metabolic and Cardiovascular Dynamics From Fetal Heart Rate Scale-Dependent Features in Pregnant Sheep Model of Human Labor Predicts the Evolution of Acidemia and Cardiovascular Decompensation
It is demonstrated that this novel metric, applied to clinically available FHR temporal dynamics alone, accurately predicts the time of occurrence of cardiovascular decompensation, which heralds a clinically significant degradation of the fetal health reserve to tolerate the trial of labor.

References

Showing 1–10 of 62 references
Entropy measures, entropy estimators, and their performance in quantifying complex dynamics: Effects of artifacts, nonstationarity, and long-range correlations.
It is found that entropy measures can only differentiate changes of specific types in cardiac dynamics and that appropriate preprocessing is vital for correct estimation and interpretation.
Estimating mutual information.
Two classes of improved estimators of the mutual information M(X,Y) from samples of random points distributed according to some joint probability density μ(x,y) are presented, based on entropy estimates from k-nearest-neighbor distances.
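As a compact sketch of this k-nearest-neighbor approach (the first Kraskov–Stögbauer–Grassberger estimator, using max-norm distances and digamma corrections), assuming an illustrative k = 4 and a correlated-Gaussian test case that are not taken from the cited paper:

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mutual_information(x, y, k=4):
    """Sketch of the first Kraskov-Stogbauer-Grassberger MI estimator (nats).

    For each sample, eps is the max-norm distance to its k-th neighbour in
    the joint (x, y) space; nx and ny count samples strictly closer than
    eps in each marginal space.
    """
    x = np.asarray(x, float).reshape(len(x), -1)
    y = np.asarray(y, float).reshape(len(y), -1)
    n = len(x)
    joint = np.hstack([x, y])
    # k + 1 because the query point itself is returned at distance 0.
    eps = cKDTree(joint).query(joint, k=k + 1, p=np.inf)[0][:, -1]
    tree_x, tree_y = cKDTree(x), cKDTree(y)
    nx = np.array([len(tree_x.query_ball_point(x[i], eps[i] - 1e-12, p=np.inf)) - 1
                   for i in range(n)])
    ny = np.array([len(tree_y.query_ball_point(y[i], eps[i] - 1e-12, p=np.inf)) - 1
                   for i in range(n)])
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))

# Example: correlated Gaussians, where the true MI is -0.5 * log(1 - rho**2).
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
y = 0.8 * x + 0.6 * rng.standard_normal(5000)
print(ksg_mutual_information(x, y))  # true value is about 0.51 nats here
```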
Entropy estimation of very short symbolic sequences.
This work investigates the performance of entropy estimation procedures, relying either on block entropies or Lempel-Ziv complexity, when only very short symbolic sequences are available, and recommends a two-step procedure.
Using time-delayed mutual information to discover and interpret temporal correlation structure in complex populations
The proposed methods can be used in nearly any situation; they are demonstrated on time series of glucose measurements from two subpopulations of individuals in the Columbia University Medical Center electronic health record repository, revealing a picture of the composition of the population as well as physiological features.
Mutual Information in Frequency and Its Application to Measure Cross-Frequency Coupling in Epilepsy
The MI-in-frequency metric is defined and used to investigate cross-frequency coupling in the seizure onset zone from electrocorticographic recordings during seizures, and to estimate mutual information between two data streams that are dependent across time without making any parametric model assumptions.
Demystifying fixed k-nearest neighbor information estimators
It is demonstrated that the KSG estimator is consistent, an upper bound on the rate of convergence of the ℓ2 error as a function of the number of samples is identified, and it is argued that the performance benefits of the KSG estimator stem from a curious “correlation boosting” effect.
The relation between Granger causality and directed information theory: a review
The links are developed by studying how measures based on directed information theory naturally emerge from Granger causality inference frameworks posed as hypothesis testing, and by showing that the useful decomposition is blurred by instantaneous coupling.
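For context, the directed information underlying this link is usually written in Massey's form (a standard definition, not quoted from the cited review), with X^n = (X_1, ..., X_n):

```latex
I(X^N \rightarrow Y^N) \;=\; \sum_{n=1}^{N} I\!\left(X^{n} ; Y_n \,\middle|\, Y^{n-1}\right)
```

Unlike the symmetric mutual information I(X^N; Y^N), this quantity is directional, which is what allows it to be related to Granger causality.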
Estimation of time-delayed mutual information and bias for irregularly and sparsely sampled time-series.
The proposed bias estimation techniques are shown to work for Lorenz equations data and glucose time series data of three patients from the Columbia University Medical Center database.
Elements of Information Theory
The authors examine the role of entropy, inequality, and randomness in the design and construction of codes in a rapidly changing environment.
Noise, chaos, and (ε,τ)-entropy per unit time
The degree of dynamical randomness of different time processes is characterized in terms of the (ε, τ)-entropy per unit time. The (ε, τ)-entropy is the amount of information generated per…
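One common way of writing the quantity in the title, assuming the standard coarse-grained block-entropy formulation rather than the paper's exact notation, is

```latex
h(\varepsilon, \tau) \;=\; \lim_{n \to \infty} \frac{1}{n\,\tau}\, H\!\left(\varepsilon;\, X_{\tau}, X_{2\tau}, \ldots, X_{n\tau}\right)
```

where H(ε; ·) denotes the Shannon entropy of the length-n trajectory observed with amplitude resolution ε and sampling time τ.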