On the Bias of Directed Information Estimators

@article{Schamberg2019OnTB,
  title={On the Bias of Directed Information Estimators},
  author={Gabriel Schamberg and Todd P. Coleman},
  journal={2019 IEEE International Symposium on Information Theory (ISIT)},
  year={2019},
  pages={186-190}
}
  • Gabriel Schamberg, T. Coleman
  • Published 1 February 2019
  • Computer Science, Mathematics
  • 2019 IEEE International Symposium on Information Theory (ISIT)
When estimating the directed information between two jointly stationary Markov processes, it is typically assumed that the recipient of the directed information is itself Markov of the same order as the joint process. While this assumption is often made explicit in the presentation of such estimators, a characterization of when we can expect the assumption to hold is lacking. Using the concept of d-separation from Bayesian networks, we present sufficient conditions for which this assumption…
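For reference, the quantity at issue is the directed information; a standard definition over a horizon n (the notation below is ours, not quoted from the paper) is

  I(X^n \to Y^n) \;=\; \sum_{i=1}^{n} I(X^i; Y_i \mid Y^{i-1})
               \;=\; \sum_{i=1}^{n} \left[ H(Y_i \mid Y^{i-1}) - H(Y_i \mid Y^{i-1}, X^i) \right].

Estimators of the kind studied here typically replace H(Y_i \mid Y^{i-1}) with H(Y_i \mid Y_{i-k}^{i-1}), where k is the order of the joint Markov process; the paper characterizes when the marginal process Y is itself Markov of order k, so that this replacement is valid.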

Citations

Measuring Sample Path Causal Influences With Relative Entropy
TLDR
A finite-sample bound on the estimator's regret is proved, determined by the worst-case regret of the sequential predictors used in the estimator, and a notion of DI with “stale history” is introduced, which can be combined with a plug-in estimator to upper- and lower-bound the DI when marginal Markovicity does not hold.
Inferring neural information flow from spiking data
  • A. Tauste Campo
  • Computer Science, Medicine
  • Computational and Structural Biotechnology Journal
  • 2020
TLDR
The Granger causality framework, which includes many popular state-of-the-art methods, is discussed and its limitations are highlighted; directions for future research are outlined, including the development of theoretical information flow models and the use of dimensionality reduction techniques to extract relevant interactions from large-scale recording datasets.

References

Showing 1-10 of 21 references
Estimating the Directed Information and Testing for Causality
TLDR
The plug-in estimator is shown to be asymptotically Gaussian and to converge at the optimal rate O(1/√n) under appropriate conditions; this is the first estimator that has been shown to achieve this rate.
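As a concrete illustration of the plug-in idea (not the authors' implementation), the sketch below estimates the directed information rate for finite-alphabet sequences by averaging log-ratios of empirical conditionals along the sample path. It assumes the joint process is first-order Markov and, crucially, that Y alone is also first-order Markov; the function name plugin_di_rate and the order-1 choice are illustrative.

# Illustrative plug-in estimate of the directed information rate I(X -> Y) for
# finite-alphabet sequences. Assumes the joint process (X, Y) is first-order
# Markov AND that Y alone is first-order Markov (the assumption whose validity
# the paper above characterizes). Not the authors' code.
import numpy as np
from collections import Counter

def plugin_di_rate(x, y):
    """Average log2-ratio of empirical conditionals along the sample path."""
    x, y = list(x), list(y)
    n = len(y)
    c_y, c_yp = Counter(), Counter()     # counts for P(y_i | y_{i-1})
    c_xy, c_xyp = Counter(), Counter()   # counts for P(y_i | y_{i-1}, x_{i-1}, x_i)
    for i in range(1, n):
        c_y[(y[i-1], y[i])] += 1
        c_yp[y[i-1]] += 1
        c_xy[(y[i-1], x[i-1], x[i], y[i])] += 1
        c_xyp[(y[i-1], x[i-1], x[i])] += 1
    total = 0.0
    for i in range(1, n):
        p_marginal = c_y[(y[i-1], y[i])] / c_yp[y[i-1]]
        p_causal = c_xy[(y[i-1], x[i-1], x[i], y[i])] / c_xyp[(y[i-1], x[i-1], x[i])]
        total += np.log2(p_causal / p_marginal)
    return total / (n - 1)

Because the conditionals are estimated from the same path being scored, every denominator count is at least one, so no smoothing is needed in this sketch.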
Universal Estimation of Directed Information
TLDR
Four estimators of the directed information rate between a pair of jointly stationary ergodic finite-alphabet processes are proposed, based on universal probability assignments, and their almost sure and L1 convergence is established for any underlying universal probability assignment.
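A rough sketch of the same estimator shape driven by sequential probability assignments follows. The published estimators use context-tree weighting (CTW); this stand-in substitutes a Krichevsky-Trofimov (add-1/2) predictor with a fixed order-1 context and assumes symbols are integers in {0, ..., alphabet_size-1}. Class and function names are illustrative.

# Sketch of a directed-information estimate driven by sequential probability
# assignments. A real implementation would use a CTW assignment; here a
# Krichevsky-Trofimov (add-1/2) predictor with a fixed order-1 context is
# substituted to keep the example short. Not the authors' code.
import numpy as np
from collections import defaultdict

class KTPredictor:
    """Sequential add-1/2 (Krichevsky-Trofimov) predictor over a finite alphabet."""
    def __init__(self, alphabet_size):
        self.m = alphabet_size
        self.counts = defaultdict(lambda: np.zeros(self.m))

    def prob_and_update(self, context, symbol):
        c = self.counts[context]
        p = (c[symbol] + 0.5) / (c.sum() + 0.5 * self.m)
        c[symbol] += 1
        return p

def sequential_di_estimate(x, y, alphabet_size=2):
    """(1/(n-1)) * sum_i log2 Q(y_i | causal context) / Q(y_i | marginal context)."""
    q_marginal = KTPredictor(alphabet_size)   # predicts y_i from y_{i-1}
    q_causal = KTPredictor(alphabet_size)     # predicts y_i from (y_{i-1}, x_{i-1}, x_i)
    total, n = 0.0, len(y)
    for i in range(1, n):
        p_m = q_marginal.prob_and_update((y[i-1],), y[i])
        p_c = q_causal.prob_and_update((y[i-1], x[i-1], x[i]), y[i])
        total += np.log2(p_c / p_m)
    return total / (n - 1)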
Directed Information Graphs
We propose a graphical model for representing networks of stochastic processes, the minimal generative model graph. It is based on reduced factorizations of the joint distribution over time. We show…
k-NN Estimation of Directed Information
TLDR
An exhaustive numerical study shows that the discussed k-NN estimators perform well even for relatively small numbers of samples (a few thousand) and are capable of accurately detecting linear as well as nonlinear causal interactions.
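For continuous-valued data, k-NN estimators of this kind are typically built from Kraskov-Stoegbauer-Grassberger (KSG) style neighbor statistics. The sketch below shows only that building block, a KSG mutual-information estimator, from which directed-information terms such as I(X_{i-1}; Y_i | Y_{i-1}) can be assembled under a Markov assumption; the exact construction in the referenced paper differs, and the function name is illustrative.

# Sketch of the KSG k-NN mutual-information estimator, the basic device behind
# k-NN estimators of directed information. Illustrative only.
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mutual_information(x, y, k=4):
    x = np.asarray(x, float).reshape(len(x), -1)
    y = np.asarray(y, float).reshape(len(y), -1)
    n = len(x)
    joint = np.hstack([x, y])
    tree_joint, tree_x, tree_y = cKDTree(joint), cKDTree(x), cKDTree(y)
    # Max-norm distance to the k-th nearest neighbour in the joint space
    # (the first returned column is the point itself at distance zero).
    eps = tree_joint.query(joint, k=k + 1, p=np.inf)[0][:, -1]
    # Count neighbours strictly inside eps in each marginal space.
    nx = np.array([len(tree_x.query_ball_point(x[i], eps[i] - 1e-12, p=np.inf)) - 1
                   for i in range(n)])
    ny = np.array([len(tree_y.query_ball_point(y[i], eps[i] - 1e-12, p=np.inf)) - 1
                   for i in range(n)])
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))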
Causality, Feedback and Directed Information
It is shown that the "usual definition" of a discrete memoryless channel (DMC) in fact prohibits the use of feedback. The difficulty stems from the confusion of causality and statistical dependence.
Independence properties of directed Markov fields
TLDR
A criterion for conditional independence of two groups of variables given a third is presented and named the directed global Markov property; it is argued that this criterion is easy to use, sharper than that given by Kiiveri, Speed, and Carlin, and equivalent to that of Pearl.
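Operationally, the criterion says that A and B are conditionally independent given S whenever S separates A from B in the moral graph of the smallest ancestral set containing A, B, and S. A minimal sketch using networkx (function and variable names are ours):

# Sketch of the directed global Markov criterion described above.
import networkx as nx
from itertools import combinations

def directed_global_markov(dag, a, b, s):
    a, b, s = set(a), set(b), set(s)
    # 1. Smallest ancestral set containing A, B and S.
    ancestral = set()
    for v in a | b | s:
        ancestral |= nx.ancestors(dag, v) | {v}
    sub = dag.subgraph(ancestral)
    # 2. Moralize: connect the parents of every node, then drop edge directions.
    moral = nx.Graph(sub.edges())
    moral.add_nodes_from(sub.nodes())
    for v in sub.nodes():
        for p, q in combinations(sub.predecessors(v), 2):
            moral.add_edge(p, q)
    # 3. Separation check: no path from A to B once S is removed.
    moral.remove_nodes_from(s)
    reachable = set()
    for v in a - s:
        if v in moral:
            reachable |= nx.node_connected_component(moral, v)
    return reachable.isdisjoint(b)

# Common-cause example X <- Z -> Y: dependent marginally, independent given Z.
g = nx.DiGraph([("Z", "X"), ("Z", "Y")])
print(directed_global_markov(g, {"X"}, {"Y"}, set()))   # False
print(directed_global_markov(g, {"X"}, {"Y"}, {"Z"}))   # True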
Causation, prediction, and search
What assumptions and methods allow us to turn observations into causal knowledge, and how can even incomplete causal knowledge be used in planning and prediction to influence and control our…
Investigating causal relations by econometric models and cross-spectral methods
There occurs on some occasions a difficulty in deciding the direction of causality between two related variables and also whether or not feedback is occurring. Testable definitions of causality and…
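In its most common regression form, the resulting test asks whether adding lagged X significantly reduces the residual variance of an autoregressive model for Y. A minimal sketch with ordinary least squares follows; the lag order p=2, the synthetic data, and the function name granger_f_test are illustrative.

# Sketch of a bivariate Granger-causality test in regression form, using
# plain least squares and an F-test. Illustrative only.
import numpy as np
from scipy import stats

def granger_f_test(x, y, p=2):
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(y)
    target = y[p:]
    lags_y = np.column_stack([y[p - j: n - j] for j in range(1, p + 1)])
    lags_x = np.column_stack([x[p - j: n - j] for j in range(1, p + 1)])
    ones = np.ones((n - p, 1))
    X_restricted = np.hstack([ones, lags_y])        # Y's own past only
    X_full = np.hstack([ones, lags_y, lags_x])      # plus X's past

    def rss(design):
        beta, *_ = np.linalg.lstsq(design, target, rcond=None)
        resid = target - design @ beta
        return resid @ resid

    rss_r, rss_f = rss(X_restricted), rss(X_full)
    df1, df2 = p, (n - p) - X_full.shape[1]
    f_stat = ((rss_r - rss_f) / df1) / (rss_f / df2)
    return f_stat, 1.0 - stats.f.cdf(f_stat, df1, df2)

# Synthetic example in which X drives Y with a one-step delay.
rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()
print(granger_f_test(x, y))   # large F statistic, p-value near zero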
Estimating the directed information to infer causal relationships in ensemble neural spike train recordings
TLDR
This paper motivates the directed information, an information and control theoretic concept, as a modality-independent embodiment of Granger’s original notion of causality, which is shown to be able to differentiate between true direct causal influences, common inputs, and cascade effects among more than two processes.
The Bidirectional Communication Theory - A Generalization of Information Theory
  • H. Marko
  • Computer Science
  • IEEE Transactions on Communications
  • 1973
A generalization of information theory is presented with the aim of distinguishing the direction of information flow for mutually coupled statistical systems. The bidirectional communication theory…