Corpus ID: 9282922

# Irreducibility is Minimum Synergy Among Parts

@article{Griffith2013IrreducibilityIM,
  title={Irreducibility is Minimum Synergy Among Parts},
  author={V. Griffith and Jonathan Harel},
  journal={arXiv: Information Theory},
  year={2013}
}
• Published 2013
• Computer Science, Mathematics
• arXiv: Information Theory
For readers already familiar with Partial Information Decomposition (PID), we show that PID's definition of synergy enables quantifying at least four different notions of irreducibility. First, we show that four common notions of "parts" give rise to a spectrum of four distinct measures of irreducibility. Second, we introduce a nonnegative expression based on PID for each notion of irreducibility. Third, we delineate these four notions of irreducibility with exemplary binary circuits. This work will…
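
As a concrete illustration of the synergy underlying these irreducibility measures, the sketch below computes the simple "whole-minus-sum" lower bound on synergy for the canonical XOR circuit, where neither input alone carries any information about the output but the pair carries one full bit. This is an illustrative baseline only, not the paper's PID-based measure; the function names are our own:

```python
from itertools import product
from math import log2
from collections import defaultdict

def mutual_information(joint, x_vars, y_vars):
    """I(X;Y) in bits from a dict mapping outcome tuples to probabilities.
    x_vars/y_vars are index tuples selecting coordinates of each outcome."""
    px, py, pxy = defaultdict(float), defaultdict(float), defaultdict(float)
    for outcome, p in joint.items():
        x = tuple(outcome[i] for i in x_vars)
        y = tuple(outcome[i] for i in y_vars)
        px[x] += p; py[y] += p; pxy[(x, y)] += p
    return sum(p * log2(p / (px[x] * py[y]))
               for (x, y), p in pxy.items() if p > 0)

# XOR circuit: uniform inputs X1, X2; output Y = X1 ^ X2.
joint = {(x1, x2, x1 ^ x2): 0.25 for x1, x2 in product((0, 1), repeat=2)}

whole = mutual_information(joint, (0, 1), (2,))  # I(X1,X2 ; Y) = 1 bit
parts = sum(mutual_information(joint, (i,), (2,)) for i in (0, 1))  # 0 bits
print(whole - parts)  # 1.0 bit that only the whole, not the parts, provides
```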
#### 1 Citation

A Principled Infotheoretic $\phi$-like Measure
This work pinpoints three concerns about $\phi$ and proposes a revised measure, $\psi$, which addresses them, is rigorously grounded in Partial Information Decomposition, and is faster to compute than $\phi$.

#### References

SHOWING 1-10 OF 14 REFERENCES
A Bivariate Measure of Redundant Information
• Mathematics, Medicine
• Physical Review E: Statistical, Nonlinear, and Soft Matter Physics
• 2013
A new formalism for redundant information is introduced, and it is proved that it satisfies all the necessary properties outlined in earlier work, as well as an additional criterion proposed to be necessary to capture redundancy.
Nonnegative Decomposition of Multivariate Information
• Mathematics, Computer Science
• ArXiv
• 2010
This work reconsiders, from first principles, the general structure of the information that a set of sources provides about a given variable, and proposes a definition of partial information atoms that exhaustively decompose the Shannon information in a multivariate system in terms of the redundancy between synergies of subsets of the sources.
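
The redundancy term at the bottom of the decomposition described above can be sketched with the $I_{\min}$ measure (the expected minimum specific information over sources). A minimal two-source implementation, with an example distribution and helper names of our own choosing:

```python
from math import log2
from collections import defaultdict

def i_min(joint, sources, target):
    """I_min redundancy: expectation over target states of the minimum,
    across sources, of the specific information I(S=s; X_a)."""
    ps = defaultdict(float)
    p_src = {a: defaultdict(float) for a in sources}
    p_pair = {a: defaultdict(float) for a in sources}
    for outcome, p in joint.items():
        s = outcome[target]
        ps[s] += p
        for a in sources:
            x = outcome[a]
            p_src[a][x] += p
            p_pair[a][(s, x)] += p
    total = 0.0
    for s, p_s in ps.items():
        specifics = []
        for a in sources:
            # specific information: sum_x p(x|s) log2( p(s|x) / p(s) )
            v = 0.0
            for (s2, x), pxy in p_pair[a].items():
                if s2 != s or pxy == 0:
                    continue
                v += (pxy / p_s) * log2((pxy / p_src[a][x]) / p_s)
            specifics.append(v)
        total += p_s * min(specifics)
    return total

# Fully redundant circuit: X1 = X2 = S, so each source carries the same 1 bit.
joint = {(s, s, s): 0.5 for s in (0, 1)}
print(i_min(joint, sources=(0, 1), target=2))  # 1.0 bit of redundancy
```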
On a Connection between Information and Group Lattices
• Mathematics, Computer Science
• Entropy
• 2011
A comprehensive parallelism between information lattices and subgroup lattices is exposed, admitting an appealing group-action explanation and providing useful insights into the intrinsic structure among information elements from a group-theoretic perspective.
Synergy, Redundancy, and Independence in Population Codes
• Computer Science, Medicine
• The Journal of Neuroscience
• 2003
This work distinguishes questions about how information is encoded by a population of neurons from how that information can be decoded, and shows that these measures form an interrelated framework for evaluating contributions of signal and noise correlations to the joint information conveyed about the stimulus.
The Co-Information Lattice
In 1955, McGill published a multivariate generalisation of Shannon’s mutual information. Algorithms such as Independent Component Analysis use a different generalisation, the redundancy, or…
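
McGill's co-information can be computed by inclusion-exclusion over the entropies of all marginal subsets of the variables. A minimal sketch (helper names are our own), using the XOR gate as an example where the co-information is negative, the classic signature of synergy:

```python
from itertools import product, combinations
from math import log2
from collections import defaultdict

def entropy(joint, idx):
    """H of the marginal over coordinates idx, in bits."""
    marg = defaultdict(float)
    for outcome, p in joint.items():
        marg[tuple(outcome[i] for i in idx)] += p
    return -sum(p * log2(p) for p in marg.values() if p > 0)

def co_information(joint):
    """McGill's co-information: alternating sum of marginal entropies,
    + singletons, - pairs, + triples, ..."""
    n = len(next(iter(joint)))
    total = 0.0
    for k in range(1, n + 1):
        for idx in combinations(range(n), k):
            total += (-1) ** (k + 1) * entropy(joint, idx)
    return total

# XOR: uniform inputs X1, X2 and output Y = X1 ^ X2.
joint = {(x1, x2, x1 ^ x2): 0.25 for x1, x2 in product((0, 1), repeat=2)}
print(co_information(joint))  # -1.0: negative co-information signals synergy
```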
Quantifying synergistic mutual information
• Computer Science, Mathematics
• ArXiv
• 2012
Candidate measures for quantifying synergy are surveyed, but there remains no consensus on which measure is most valid.
Shared Information -- New Insights and Problems in Decomposing Information in Complex Systems
• Computer Science, Mathematics
• ArXiv
• 2012
How can the information that a set {X 1,…,X n } of random variables contains about another random variable S be decomposed? To what extent do different subgroups provide the same, i.e. shared, or…
Analyzing Attribute Dependencies
• Computer Science
• PKDD
• 2003
This paper formally defines the degree of interaction between attributes through the deviation of the best possible “voting” classifier from the true relation between the class and the attributes in a domain, and proposes a practical heuristic for detecting attribute interactions, called interaction gain.
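
The interaction gain described above has a simple closed form, I(A,B;C) - I(A;C) - I(B;C). A minimal sketch (helper names are our own) showing that a class defined as the XOR of two attributes yields one full bit of positive interaction, even though each attribute alone is useless:

```python
from itertools import product
from math import log2
from collections import defaultdict

def mi(joint, a, b):
    """I(A;B) in bits; a and b are coordinate-index tuples into each outcome."""
    pa, pb, pab = defaultdict(float), defaultdict(float), defaultdict(float)
    for outcome, p in joint.items():
        xa = tuple(outcome[i] for i in a)
        xb = tuple(outcome[i] for i in b)
        pa[xa] += p; pb[xb] += p; pab[(xa, xb)] += p
    return sum(p * log2(p / (pa[xa] * pb[xb]))
               for (xa, xb), p in pab.items() if p > 0)

def interaction_gain(joint):
    """Interaction gain: I(A,B;C) - I(A;C) - I(B;C)."""
    return mi(joint, (0, 1), (2,)) - mi(joint, (0,), (2,)) - mi(joint, (1,), (2,))

# Two binary attributes and a class C = A XOR B: neither attribute alone
# predicts C, so all the class information lives in their interaction.
joint = {(a, b, a ^ b): 0.25 for a, b in product((0, 1), repeat=2)}
print(interaction_gain(joint))  # 1.0 bit of positive interaction
```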
Group Redundancy Measures Reveal Redundancy Reduction in the Auditory Pathway
• Mathematics, Computer Science
• NIPS
• 2001
The results suggest that the auditory system transforms low-level representations, which contain redundancies due to the statistical structure of natural stimuli, into a representation in which cortical neurons extract rare and independent components of complex acoustic signals that are useful for auditory scene analysis.
Computational analysis of the synergy among multiple interacting genes
An information‐theoretic analysis is presented that provides a quantitative measure of the multivariate synergy and decomposes sets of genes into submodules, each of which contains synergistically interacting genes.