A partial information decomposition for discrete and continuous variables
@article{SchickPoland2021API,
  title   = {A partial information decomposition for discrete and continuous variables},
  author  = {Kyle Schick-Poland and Abdullah Makkeh and Aaron J. Gutknecht and Patricia Wollstadt and Anja Sturm and Michael Wibral},
  journal = {ArXiv},
  year    = {2021},
  volume  = {abs/2106.12393}
}
Conceptually, partial information decomposition (PID) is concerned with separating the information contributions that several sources hold about a certain target by decomposing the corresponding joint mutual information into contributions such as synergistic, redundant, or unique information. Although PID is conceptually defined for any type of random variable, so far it could only be quantified for the joint mutual information of discrete systems. Recently, a quantification for PID in…
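As a minimal illustration of the bookkeeping behind a two-source PID (a toy discrete sketch, not the discrete-and-continuous measure introduced in the paper above), the snippet below computes plug-in mutual informations for an XOR target and shows why the full bit of joint information must be attributed to synergy; the helper `mutual_information` is a name introduced here for illustration only.

```python
# Toy PID bookkeeping for two binary sources and an XOR target.
# Illustrates I(T; S1, S2) = R + U1 + U2 + Syn with plug-in Shannon quantities;
# this is NOT the continuous-variable measure proposed in the paper above.
from collections import Counter
from itertools import product
from math import log2

def mutual_information(pairs):
    """I(X; Y) in bits from (x, y) samples treated as equally likely outcomes."""
    n = len(pairs)
    p_xy = Counter(pairs)
    p_x = Counter(x for x, _ in pairs)
    p_y = Counter(y for _, y in pairs)
    return sum((c / n) * log2((c / n) / ((p_x[x] / n) * (p_y[y] / n)))
               for (x, y), c in p_xy.items())

# Uniform binary sources, deterministic XOR target.
samples = [(s1, s2, s1 ^ s2) for s1, s2 in product([0, 1], repeat=2)]

i_joint = mutual_information([((s1, s2), t) for s1, s2, t in samples])  # 1.0 bit
i_s1 = mutual_information([(s1, t) for s1, _, t in samples])            # 0.0 bits
i_s2 = mutual_information([(s2, t) for _, s2, t in samples])            # 0.0 bits

# Consistency equations: R + U1 = I(T;S1) and R + U2 = I(T;S2). With nonnegative
# atoms and both single-source terms equal to zero, R = U1 = U2 = 0, so the entire
# joint bit is synergistic. The co-information I(T;S1,S2) - I(T;S1) - I(T;S2)
# equals Syn - R, which here coincides with the synergy atom.
print(f"I(T;S1,S2)={i_joint:.2f}  I(T;S1)={i_s1:.2f}  I(T;S2)={i_s2:.2f}  "
      f"synergy={i_joint - i_s1 - i_s2:.2f}")
```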
13 Citations
A Rigorous Information-Theoretic Definition of Redundancy and Relevancy in Feature Selection Based on (Partial) Information Decomposition
- Computer Science · ArXiv
- 2021
It is shown that the conditional mutual information (CMI) maximizes relevancy while minimizing redundancy, an iterative CMI-based algorithm for practical feature selection is proposed, and it is clarified why feature selection is a conceptually difficult problem when approached using information theory.
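The entry above mentions an iterative CMI-based selection procedure; the following is a generic greedy forward-selection sketch using plug-in discrete estimation, written here for illustration only (the function names and the toy data are assumptions, and the cited paper's exact algorithm may differ).

```python
# Generic greedy forward feature selection driven by conditional mutual
# information (CMI), using plug-in entropy estimates for discrete data.
import numpy as np

def entropy(*cols):
    """Joint Shannon entropy (bits) of discrete columns via plug-in estimation."""
    joint = np.stack(cols, axis=1)
    _, counts = np.unique(joint, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def cmi(x, y, z_cols):
    """I(X; Y | Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z); Z may be empty."""
    if not z_cols:
        return entropy(x) + entropy(y) - entropy(x, y)
    return (entropy(x, *z_cols) + entropy(y, *z_cols)
            - entropy(x, y, *z_cols) - entropy(*z_cols))

def greedy_cmi_selection(X, y, k):
    """Iteratively add the feature with the largest CMI with y given the current set."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(k):
        scores = {j: cmi(X[:, j], y, [X[:, s] for s in selected]) for j in remaining}
        best = max(scores, key=scores.get)
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy data: features 0 and 2 jointly determine y, feature 1 is pure noise.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(2000, 3))
y = X[:, 0] + X[:, 2]
print(greedy_cmi_selection(X, y, k=2))  # expected: features 0 and 2, in some order
```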
Partial Information Decomposition Reveals the Structure of Neural Representations
- Computer Science · ArXiv
- 2022
This work proposes representational complexity as a principled and interpretable summary statistic for analyzing the structure of neural representations, introduces subsampling and coarse-graining procedures, and proves corresponding bounds on the latter.
Conservative significance testing of tripartite statistical relations in multivariate neural data
- Computer Science · Network Neuroscience
- 2022
A conservative null hypothesis for significance testing of tripartite measures is presented, which significantly decreases the false-positive rate at a tolerable expense of an increased false-negative rate; the work also raises awareness of potential pitfalls in significance testing and in the interpretation of functional relations, offering both conceptual and practical advice.
Disentanglement Analysis with Partial Information Decomposition
- Computer Science · ICLR
- 2022
Through experiments on variational autoencoders, this work finds that models with similar disentanglement scores exhibit a variety of entanglement characteristics, each of which may require a distinct strategy to obtain a disentangled representation.
Conservative Significance Testing of Tripartite Interactions in Multivariate Neural Data
- Computer Science · bioRxiv
- 2022
A conservative null hypothesis for significance testing of tripartite metrics is presented, which significantly decreases the false-positive rate at a tolerable expense of an increased false-negative rate, and an adjusted conservative testing procedure is developed that reduces the false-positive rate of the studied estimators for impure data.
Information-theoretic analyses of neural data to minimize the effect of researchers' assumptions in predictive coding studies
- Computer Science, Biology · ArXiv
- 2022
An approach is proposed to express information-processing strategies (such as predictive coding) in terms of local information-theoretic quantities that can be estimated directly from neural data, and it is illustrated by investigating two opposing accounts of predictive-coding-like processing strategies.
Quantifying Reinforcement-Learning Agent’s Autonomy, Reliance on Memory and Internalisation of the Environment
- Computer Science · Entropy
- 2022
This work uses the partial information decomposition (PID) framework to monitor the levels of autonomy and environment internalisation of reinforcement-learning (RL) agents, and introduces an algorithm for calculating autonomy in the limit of the number of time steps approaching infinity.
Quantifying high-order interdependencies on individual patterns via the local O-information: Theory and applications to music analysis
- Materials Science · Physical Review Research
- 2022
Tomas Scagliarini, Daniele Marinazzo, Yike Guo, Sebastiano Stramaglia, and Fernando E. Rosas (Dipartimento Interateneo di Fisica, Università degli Studi di Bari Aldo Moro and INFN, Italy)…
High-order functional redundancy in ageing explained via alterations in the connectome in a whole-brain model
- Psychology, Biology · bioRxiv
- 2021
A neurobiologically-realistic whole-brain computational model using both anatomical and functional MRI data from 161 participants ranging from 10 to 80 years old shows that the age differences in high-order functional interactions can be largely explained by variations in the connectome, and proposes a simple neurodegeneration model that is representative of normal physiological aging.
References
An exploration of synergistic and redundant information sharing in static and dynamical Gaussian systems
- Computer Science · Physical Review E
- 2015
This work shows that Gaussian systems frequently exhibit net synergy, i.e., the information carried jointly by both sources is greater than the sum of information carried by each source individually, and provides independent formulas for synergy and redundancy applicable to continuous time-series data.
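The net-synergy phenomenon described above can be reproduced in a few lines; below is a sketch using a minimum-mutual-information-style decomposition of the kind discussed in that reference, assuming jointly Gaussian variables and sample-covariance estimates (the helper `gaussian_mi` and the toy system are constructions for this illustration, not the reference's own code).

```python
# Sketch: net synergy in a Gaussian system with redundancy taken as the minimum
# of the two single-source mutual informations. Quantities are in nats and are
# estimated from sample covariances.
import numpy as np

def gaussian_mi(x, y):
    """I(X; Y) for (assumed) jointly Gaussian data, via covariance determinants."""
    x = np.atleast_2d(x).reshape(len(x), -1)
    y = np.atleast_2d(y).reshape(len(y), -1)
    cov = np.cov(np.hstack([x, y]).T)
    dx = x.shape[1]
    det = np.linalg.det
    return 0.5 * np.log(det(cov[:dx, :dx]) * det(cov[dx:, dx:]) / det(cov))

rng = np.random.default_rng(1)
n = 100_000
s1, s2 = rng.standard_normal(n), rng.standard_normal(n)
t = s1 + s2 + rng.standard_normal(n)             # target mixes both sources plus noise

i1 = gaussian_mi(t, s1)                          # ~0.20 nats
i2 = gaussian_mi(t, s2)                          # ~0.20 nats
i12 = gaussian_mi(t, np.column_stack([s1, s2]))  # ~0.55 nats

redundancy = min(i1, i2)
unique1, unique2 = i1 - redundancy, i2 - redundancy
synergy = i12 - max(i1, i2)
print(f"net synergy = {i12 - i1 - i2:.3f} nats "
      f"(synergy {synergy:.3f}, redundancy {redundancy:.3f})")
```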
Introducing a differentiable measure of pointwise shared information.
- Computer Science · Physical Review E
- 2021
This work shows how the measure can be understood from the perspective of exclusions of probability mass, a principle that is foundational to the original definition of mutual information by Fano, and gives an operational interpretation of the measure based on the decisions that an agent should take if given only the shared information.
Measuring multivariate redundant information with pointwise common change in surprisal
- Computer Science · Entropy
- 2017
This work presents a new measure of redundancy which measures the common change in surprisal shared between variables at the local or pointwise level, and shows how this redundancy measure can be used within the framework of the Partial Information Decomposition (PID) to give an intuitive decomposition of the multivariate mutual information into redundant, unique and synergistic contributions.
Bits and pieces: understanding information decomposition from part-whole relationships and formal logic
- Computer Science · Proceedings of the Royal Society A
- 2021
This paper shows that the entire theory of PID can be derived, first, from considerations of part-whole relationships between information atoms and mutual information terms and, second, from a hierarchy of logical constraints describing how a given information atom can be accessed.
Nonnegative Decomposition of Multivariate Information
- Computer Science · ArXiv
- 2010
This work reconsiders from first principles the general structure of the information that a set of sources provides about a given variable and proposes a definition of partial information atoms that exhaustively decompose the Shannon information in a multivariate system in terms of the redundancy between synergies of subsets of the sources.
A Bivariate Measure of Redundant Information
- Computer Science · Physical Review E
- 2013
A new formalism for redundant information is introduced and proved to satisfy all the necessary properties outlined in earlier work, as well as an additional criterion proposed here as necessary to capture redundancy.
Estimating the Unique Information of Continuous Variables
- Computer Science · NeurIPS
- 2021
This work presents a method for estimating the unique information in continuous distributions, for the case of one versus two variables, and solves the associated optimization problem over the space of distributions with fixed bivariate marginals by combining copula decompositions and techniques developed to optimize variational autoencoders.
Quantifying unique information
- Mathematics · Entropy
- 2014
We propose new measures of shared information, unique information and synergistic information that can be used to decompose the mutual information of a pair of random variables (Y, Z) with a third…
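For orientation, the defining optimization behind that decomposition can be stated compactly; the notation below follows common usage for this measure and should be read as an approximation of the paper's own symbols, with S the target and Y, Z the two sources.

```latex
% Unique information of Y about S (relative to Z), defined via an optimization
% over the set \Delta_P of joint distributions with the same (S,Y) and (S,Z)
% marginals as the observed distribution P.
\[
  \widetilde{UI}(S;\, Y \setminus Z) \;=\; \min_{Q \in \Delta_P} I_Q(S; Y \mid Z),
  \qquad
  \Delta_P \;=\; \bigl\{\, Q : Q(S, Y) = P(S, Y),\ Q(S, Z) = P(S, Z) \,\bigr\}.
\]
```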
An operational information decomposition via synergistic disclosure
- Business · Journal of Physics A: Mathematical and Theoretical
- 2020
This work proposes a new information decomposition based on a novel operationalisation of informational synergy, which leverages recent developments in the literature of data privacy, and provides a natural coarse-graining that scales gracefully with the system's size.