• Corpus ID: 15319561

Characterizing Multivariate Information Flows

  • S. Hidaka
  • Published 21 December 2012
  • Mathematics, Computer Science
  • ArXiv
One of the crucial steps in scientific studies is to specify dependent relationships among factors in a system of interest. Given little knowledge of a system, can we characterize the underlying dependent relationships through observation of its temporal behaviors? In multivariate systems, there are potentially many possible dependent structures confusable with each other, and it may cause false detection of illusory dependency between unrelated factors. The present study proposes a new… 


It is shown that the "usual definition" of a discrete memoryless channel (DMC) in fact prohibits the use of feedback. The difficulty stems from the confusion of causality and statistical dependence.
Causality detected by transfer entropy leads acquisition of joint attention
In a computer simulation of human-robot interaction, it is examined which pairs of perceptions and actions are selected as causal pairs, and it is shown that the selected pairs can be used to learn a sensorimotor map for achieving joint attention.
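The causal detection above relies on transfer entropy. As a minimal sketch (not the paper's implementation; the function name and the plug-in estimator are my own choices), the following estimates T(X→Y) for two discrete sequences with history length 1:

```python
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Plug-in transfer entropy T(X -> Y) in bits, history length 1:
    T = sum p(y1, y0, x0) * log2[ p(y1 | y0, x0) / p(y1 | y0) ]."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
    n = len(triples)
    p_yyx = Counter(triples)                                  # counts of (y1, y0, x0)
    p_yx = Counter((y0, x0) for _, y0, x0 in triples)         # counts of (y0, x0)
    p_yy = Counter((y1, y0) for y1, y0, _ in triples)         # counts of (y1, y0)
    p_y = Counter(y0 for _, y0, _ in triples)                 # counts of y0
    te = 0.0
    for (y1, y0, x0), c in p_yyx.items():
        # p(y1 | y0, x0) = c / p_yx;  p(y1 | y0) = p_yy / p_y
        te += (c / n) * log2((c / p_yx[(y0, x0)]) / (p_yy[(y1, y0)] / p_y[y0]))
    return te
```

When Y simply copies X with a one-step lag, T(X→Y) approaches the entropy of X while T(Y→X) stays near zero, which is the asymmetry used to pick the causal pair.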
The Multiinformation Function as a Tool for Measuring Stochastic Dependence
It is argued that the corresponding multiinformation function is a useful tool for problems concerning stochastic (conditional) dependence and independence (at least in the discrete case).
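The multiinformation (total correlation) of k variables is the sum of the marginal entropies minus the joint entropy, C(X1,…,Xk) = Σi H(Xi) − H(X1,…,Xk), and it vanishes exactly when the variables are mutually independent. A minimal plug-in sketch for the discrete case (function names are my own, not the paper's notation):

```python
from collections import Counter
from math import log2

def entropy(samples):
    """Empirical Shannon entropy in bits of a list of hashable outcomes."""
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in Counter(samples).values())

def multiinformation(columns):
    """Total correlation C(X1..Xk) = sum_i H(Xi) - H(X1,...,Xk);
    zero iff the variables are (empirically) mutually independent."""
    joint = list(zip(*columns))
    return sum(entropy(list(col)) for col in columns) - entropy(joint)
```

For two perfectly correlated binary variables this returns 1 bit (it reduces to their mutual information); for independent ones it returns 0.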
Analyzing multimodal time series as dynamical systems
A novel approach to discovering latent structures from multimodal time series, based on the concept of a generating partition, which is the theoretically best symbolization of a time series in that it maximizes the information retained about the underlying continuous dynamical system.
Uncertainty and structure as psychological concepts
It was a misfortune of psychology that it lacked a tradition of dealing with rigorous mathematical theories when psychologists were first attracted by information theory. Applications were made with
Spatio-Temporal Symbolization of Multidimensional Time Series
  • S. Hidaka, Chen Yu
  • Computer Science
    2010 IEEE International Conference on Data Mining Workshops
  • 2010
Probabilistic symbolic sequences derived from the symbolization method can be used in various supervised and unsupervised data-mining tasks and the new algorithm outperforms its alternative approaches.
An approximation to the distribution of finite sample size mutual information estimates
The distribution of mutual information between two discrete random variables is approximated by means of a second-order Taylor series expansion, and the distributions of conditional MI between conditionally independent variables and of MI between (weakly) dependent random variables are derived.
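For intuition about the finite-sample effect analyzed here: the plug-in MI estimate between independent variables is biased upward, with the well-known first-order term (|X|−1)(|Y|−1)/(2N ln 2) bits. A sketch with my own function names (the paper's second-order expansion is not reproduced here):

```python
from collections import Counter
from math import log, log2

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in bits from paired samples."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    return sum((c / n) * log2(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

def mi_bias_first_order(n_samples, card_x, card_y):
    """First-order bias of the plug-in MI estimate for independent X, Y:
    E[I_hat] - I ~= (|X|-1)(|Y|-1) / (2 N ln 2) bits."""
    return (card_x - 1) * (card_y - 1) / (2 * n_samples * log(2))
```

Averaging the plug-in estimate over many independent binary samples of size N reproduces this bias term closely, which is why raw MI estimates cannot be compared to zero without such a correction.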
Approximating discrete probability distributions with causal dependence trees
It is demonstrated that the minimum divergence approximation is the directed tree with maximum sum of directed informations, and a low-complexity minimum weight directed spanning tree, or arborescence, algorithm is specified to find the optimal tree.
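The objective — maximize the sum of directed edge weights over spanning arborescences — can be made concrete by brute force on small graphs. This is only an illustrative sketch with my own names, not the paper's method: the paper specifies a low-complexity Chu-Liu/Edmonds-style arborescence algorithm instead.

```python
from itertools import product

def best_arborescence(weights):
    """Brute-force the maximum-weight spanning arborescence of a complete
    directed graph on n nodes (weights[i][j] = weight of edge i -> j).
    Returns (best_weight, parent) where parent[j] is j's parent, or None
    for the root. Feasible only for small n."""
    n = len(weights)
    best_weight, best_parent = float('-inf'), None
    for root in range(n):
        others = [j for j in range(n) if j != root]
        # each non-root node picks one parent; keep only acyclic choices
        for choice in product(range(n), repeat=len(others)):
            if any(p == j for j, p in zip(others, choice)):
                continue  # no self-loops
            par = {root: None}
            par.update(zip(others, choice))

            def reaches_root(j):
                # follow parents; a revisited node means a cycle off the root
                seen = set()
                while j != root and j not in seen:
                    seen.add(j)
                    j = par[j]
                return j == root

            if not all(reaches_root(j) for j in others):
                continue
            total = sum(weights[par[j]][j] for j in others)
            if total > best_weight:
                best_weight, best_parent = total, dict(par)
    return best_weight, best_parent
```

With directed-information edge weights plugged in, the returned parent map is exactly the causal dependence tree the paper's minimum-divergence result characterizes.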
Complex network measures of brain connectivity: Uses and interpretations
Construction of brain networks from connectivity data is discussed and the most commonly used network measures of structural and functional connectivity are described, which variously detect functional integration and segregation, quantify centrality of individual brain regions or pathways, and test resilience of networks to insult.
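As one concrete instance of the measures surveyed, node degree — the simplest centrality measure for a binary undirected network — can be read directly off the adjacency matrix. A minimal sketch (function name is mine):

```python
def degree_centrality(adj):
    """Degree of each node in an undirected binary network given as an
    adjacency matrix (adj[i][j] == 1 iff nodes i and j are connected)."""
    n = len(adj)
    return [sum(adj[i][j] for j in range(n) if j != i) for i in range(n)]
```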
Nonlinear Time Series Analysis
Abstract: This thesis applies neural network feature selection techniques to multivariate time series data to improve prediction of a target time series. Two approaches to feature selection are