Information of interactions in complex systems
@article{Krippendorff2009InformationOI,
  title   = {Information of interactions in complex systems},
  author  = {Klaus Krippendorff},
  journal = {International Journal of General Systems},
  year    = {2009},
  volume  = {38},
  pages   = {669--680}
}
This paper addresses misconceptions about the multivariate interaction-information measure Q, which several authors have reinvented since McGill (1954) proposed it, giving it a variety of names and interpretations. McGill's measure was claimed to quantify the amount of information in interactions among three or more variables in complex systems. In (Krippendorff, 1980), I raised doubts about the validity of Q and its relatives. The chief problem is that Q-measures fail to recognize that complex…
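As a concrete illustration of why Q has attracted doubt, here is a minimal sketch (not from the paper) computing McGill's interaction information for three binary variables under the common convention Q = I(X;Y) − I(X;Y|Z); the sign convention varies across authors, and the helper names below are my own. The XOR triple is the standard case where Q turns negative even though the variables are pairwise independent.

```python
from itertools import product
from math import log2
from collections import Counter

def entropy(counts):
    """Shannon entropy in bits of an empirical distribution given as counts."""
    total = sum(counts.values())
    return -sum(c / total * log2(c / total) for c in counts.values() if c)

def H(samples, idx):
    """Entropy of the marginal over the variable indices in idx."""
    return entropy(Counter(tuple(s[i] for i in idx) for s in samples))

# Three binary variables with Z = X XOR Y, all four (x, y) pairs equiprobable.
samples = [(x, y, x ^ y) for x, y in product([0, 1], repeat=2)]

# McGill's interaction information under the convention Q = I(X;Y) - I(X;Y|Z).
I_xy = H(samples, (0,)) + H(samples, (1,)) - H(samples, (0, 1))
I_xy_given_z = (H(samples, (0, 2)) + H(samples, (1, 2))
                - H(samples, (0, 1, 2)) - H(samples, (2,)))
Q = I_xy - I_xy_given_z
print(Q)  # -1.0: X and Y are independent, yet jointly determine Z
```

A negative bit count has no straightforward interpretation as "amount of information", which is one entry point into the misconceptions the paper addresses.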
84 Citations
Redundancy in Systems which Entertain a Model of Themselves: Interaction Information and the Self-organization of Anticipation
- Computer Science · Entropy
- 2010
It is argued that Q provides a measure of the imprint of a second-order observing system—a model entertained by the system itself—on the underlying information processing.
Multivariate Dependence Beyond Shannon Information
- Computer Science · Entropy
- 2017
The vast majority of Shannon information measures are inadequate for determining the meaningful dependency structure within joint probability distributions, and for discovering intrinsic causal relations, particularly when information is used to express the organization and mechanisms embedded in complex systems.
The Generation and Self-organization of Meaning in the Communication of Information and Redundancy
- Computer Science
- 2015
Shannon’s linear model of communication is extended piecemeal into a complex systems model, and correlations between patterns of relations are distinguished, which span a vector space in which relations are positioned and are thus provided with meaning.
Multiscale Information Theory and the Marginal Utility of Information
- Computer Science · Entropy
- 2017
A formalism for multiscale information theory is introduced that allows information to be quantified using any function satisfying two basic axioms, and two quantitative indices that summarize system structure are discussed: an existing index, the complexity profile, and a new index, the marginal utility of information.
Towards a Calculus of Redundancy
- Computer Science · Qualitative and Quantitative Analysis of Scientific and Scholarly Communication
- 2021
Shannon’s linear model of communication is extended into a model in which communication is differentiated both vertically and horizontally (Simon, 1973), and relations at level A are first distinguished from correlations among patterns of relations and non-relations at level B.
Self-organization of meaning and the reflexive communication of information
- Computer Science · Social Science Information / Information sur les sciences sociales
- 2017
The Shannon model of communication is extended piecemeal into a complex systems model in which communication is differentiated both vertically and horizontally, and next-order codifications of meaning can be expected to generate redundancies when interacting in instantiations.
The Self-Organization of Meaning and the Reflexive Communication of Information
- Computer Science
- 2016
The Shannon model of communication is extended piecemeal into a complex systems model in which communication is differentiated both vertically and horizontally, and next-order codifications of meaning can be expected to generate redundancies when interacting in instantiations.
Mutual redundancies in interhuman communication systems: Steps toward a calculus of processing meaning
- Computer Science · J. Assoc. Inf. Sci. Technol.
- 2014
Shannon's theory is extended by defining mutual redundancy, a positional counterpart of the relational communication of information, as the surplus of meanings that can be provided to the exchanges in reflexive communications.
Toward a Calculus of Redundancy: The feedback arrow of expectations in knowledge-based systems
- Sociology · ArXiv
- 2017
A calculus of redundancy is presented as an indicator whereby these dynamics of discourse and meaning may be explored empirically, and using Shannon's equations, the generation and selection of meanings from a horizon of possibilities can be considered probabilistically.
Information, Meaning, and Intellectual Organization in Networks of Inter-Human Communication
- Computer Science · ArXiv
- 2014
The Shannon-Weaver model of linear information transmission is extended with two loops potentially generating redundancies: (i) meaning is provided locally to the information from the perspective of…
References
Showing 1–10 of 41 references
THE CO-INFORMATION LATTICE
- Computer Science
- 2003
The co-information lattice sheds light on the problem of approximating a joint density with a set of marginal densities, though as usual the authors run into the partition function.
Interaction information: linear and nonlinear interpretations
- Computer Science · Int. J. Gen. Syst.
- 2009
Following Ashby’s (1969) definitions, the author defines transmission as the difference between the sum of the entropies for variables independently and the uncertainty in the system of all variables taken together.
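The transmission defined above—the sum of the individual entropies minus the joint entropy—is what is elsewhere called total correlation. A minimal sketch of that computation (my own illustration, with hypothetical helper names, not code from the paper):

```python
from math import log2
from collections import Counter

def entropy(counts):
    """Shannon entropy in bits of an empirical distribution given as counts."""
    total = sum(counts.values())
    return -sum(c / total * log2(c / total) for c in counts.values() if c)

def total_transmission(samples):
    """Ashby-style transmission: sum of marginal entropies minus joint entropy."""
    n = len(samples[0])
    joint = entropy(Counter(samples))
    marginals = sum(entropy(Counter(s[i] for s in samples)) for i in range(n))
    return marginals - joint

# Two perfectly correlated bits plus one independent bit, all cases equiprobable.
samples = [(0, 0, 0), (0, 0, 1), (1, 1, 0), (1, 1, 1)]
print(total_transmission(samples))  # 1.0 bit: the dependence between X1 and X2
```

Unlike Q, this quantity is always non-negative, which is one reason it serves as a safer baseline measure of multivariate dependence.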
Information Theoretical Analysis of Multivariate Correlation
- Mathematics · IBM J. Res. Dev.
- 1960
The present paper gives various theorems according to which C_tot(λ) can be decomposed in terms of the partial correlations existing in subsets of λ, and of quantities derivable therefrom.
Uncertainty and structure as psychological concepts
- Psychology
- 1975
It was a misfortune of psychology that it lacked a tradition of dealing with rigorous mathematical theories when psychologists were first attracted by information theory. Applications were made with…
Ross Ashby's information theory: a bit of history, some solutions to problems, and what we face today
- Sociology · Int. J. Gen. Syst.
- 2009
This paper presents a personal history of one strand of W. Ross Ashby's many ideas—using information theory to analyse complex systems empirically—and shows how his idea of decomposing complex systems into smaller interactions reappears in one of the most complex technologies of our time: cyberspace.
Physical nature of higher-order mutual information: intrinsic correlations and frustration
- Computer Science · Physical Review E: Statistical Physics, Plasmas, Fluids, and Related Interdisciplinary Topics
- 2000
The higher-order mutual information provides an appropriate measure of the frustration effect and can be either positive or negative depending on the correlation among ensembles.
The dynamics of exchanges and references among scientific texts, and the autopoiesis of discursive knowledge
- Computer Science · J. Informetrics
- 2009
Introduction to Cybernetics.
- Computer Science
- 1966
Abstract: This book contains the collected and unified material necessary for the presentation of such branches of modern cybernetics as the theory of electronic digital computers, theory of…
Information Theory and Network Coding
- Computer Science
- 2008
This book contains a thorough discussion of the classical topics in information theory together with the first comprehensive treatment of network coding, a subject that first emerged under information…
A Mathematical Theory of Communication
- Computer Science
- 2006
It is proved that a positive data rate can be achieved with arbitrarily small error probability, and that there is an upper bound on the data rate above which no encoding scheme can keep the error probability small enough.