Information of interactions in complex systems

@article{Krippendorff2009,
  title={Information of interactions in complex systems},
  author={Klaus Krippendorff},
  journal={International Journal of General Systems},
  pages={669--680},
  year={2009}
}
  • K. Krippendorff
  • Published 22 July 2009
  • Computer Science
  • International Journal of General Systems
This paper addresses misconceptions of the multi-variate interaction-information measure Q, which several authors have reinvented since its proposal by McGill (1954), giving it a variety of names and interpretations. McGill’s measure claimed to quantify the amount of information of interactions among three or more variables in complex systems. In (Krippendorff, 1980), I raised doubts about the validity of Q and its relatives. The chief problem that Q-measures fail to recognize is that complex… 
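For concreteness, McGill's three-variable interaction information can be computed from a joint distribution as Q = I(X;Y) − I(X;Y|Z). A minimal sketch of that definition (the function names are illustrative, and sign conventions for Q vary across the literature the paper discusses):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits; zero-probability cells are ignored."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def interaction_information(pxyz):
    """McGill's Q for three variables, from a joint array pxyz[x, y, z]:
    Q = I(X;Y) - I(X;Y|Z).  Sign conventions vary in the literature."""
    H = entropy
    # I(X;Y) = H(X) + H(Y) - H(X,Y), all from marginals of the joint
    i_xy = H(pxyz.sum(axis=(1, 2))) + H(pxyz.sum(axis=(0, 2))) - H(pxyz.sum(axis=2))
    # I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(Z) - H(X,Y,Z)
    i_xy_given_z = (H(pxyz.sum(axis=1)) + H(pxyz.sum(axis=0))
                    - H(pxyz.sum(axis=(0, 1))) - H(pxyz))
    return i_xy - i_xy_given_z
```

For three independent bits Q is zero; for the XOR relation Z = X ⊕ Y it is −1 bit, exactly the kind of negative "interaction information" whose interpretation the paper calls into question.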
Redundancy in Systems which Entertain a Model of Themselves: Interaction Information and the Self-organization of Anticipation
It is argued that Q provides a measure of the imprint of a second-order observing system—a model entertained by the system itself—on the system's underlying information processing.
Multivariate Dependence Beyond Shannon Information
The vast majority of Shannon information measures are simply inadequate for determining the meaningful dependency structure within joint probability distributions and are inadequate for discovering intrinsic causal relations, particularly when employing information to express the organization and mechanisms embedded in complex systems.
The Generation and Self-organization of Meaning in the Communication of Information and Redundancy
Shannon’s linear model of communication is extended piecemeal into a complex systems model, and correlations between patterns of relations are distinguished; these span a vector space in which relations are positioned and are thus provided with meaning.
Multiscale Information Theory and the Marginal Utility of Information
A formalism for multiscale information theory is introduced, which allows information to be quantified using any function that satisfies two basic axioms; two quantitative indices that summarize system structure are discussed: an existing index, the complexity profile, and a new index, the marginal utility of information.
Towards a Calculus of Redundancy
  • L. Leydesdorff
  • Computer Science
    Qualitative and Quantitative Analysis of Scientific and Scholarly Communication
  • 2021
Shannon’s linear model of communication is extended into a model in which communication is differentiated both vertically and horizontally (Simon, 1973), and relations at level A are first distinguished from correlations among patterns of relations and non-relations at level B.
Self-organization of meaning and the reflexive communication of information
The Shannon model of communication is extended piecemeal into a complex systems model in which communication is differentiated both vertically and horizontally, and next-order codifications of meaning can be expected to generate redundancies when interacting in instantiations.
Mutual redundancies in interhuman communication systems: Steps toward a calculus of processing meaning
Shannon's theory is extended by defining mutual redundancy as a positional counterpart of the relational communication of information: the surplus of meanings that can be provided to the exchanges in reflexive communications.
Toward a Calculus of Redundancy: The feedback arrow of expectations in knowledge-based systems
A calculus of redundancy is presented as an indicator whereby these dynamics of discourse and meaning may be explored empirically, and using Shannon's equations, the generation and selection of meanings from a horizon of possibilities can be considered probabilistically.
Information, Meaning, and Intellectual Organization in Networks of Inter-Human Communication
The Shannon-Weaver model of linear information transmission is extended with two loops potentially generating redundancies: (i) meaning is provided locally to the information from the perspective of…


The co-information lattice sheds light on the problem of approximating a joint density with a set of marginal densities, though as usual the authors run into the partition function.
Interaction information: linear and nonlinear interpretations
Following Ashby’s (1969) definitions, the author defines transmission as the difference between the sum of the entropies for variables independently and the uncertainty in the system of all variables taken together.
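The transmission described here, often called total correlation, is T(X1,…,Xn) = Σi H(Xi) − H(X1,…,Xn). A minimal sketch under that definition (the function names are illustrative):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits; zero-probability cells are ignored."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def total_transmission(pjoint):
    """Ashby-style transmission (total correlation): the sum of the
    marginal entropies minus the joint entropy, in bits.  pjoint is an
    n-dimensional joint probability array, one axis per variable."""
    n = pjoint.ndim
    # marginal of variable i: sum the joint over every other axis
    marginal_sum = sum(
        entropy(pjoint.sum(axis=tuple(j for j in range(n) if j != i)))
        for i in range(n)
    )
    return marginal_sum - entropy(pjoint)
```

For two perfectly correlated bits this gives 1 bit; for independent variables it is zero, since the joint entropy then equals the sum of the marginal entropies.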
Information Theoretical Analysis of Multivariate Correlation
The present paper gives various theorems, according to which C_tot(λ) can be decomposed in terms of the partial correlations existing in subsets of λ, and of quantities derivable therefrom.
Uncertainty and structure as psychological concepts
It was a misfortune of psychology that it lacked a tradition of dealing with rigorous mathematical theories when psychologists were first attracted by information theory. Applications were made with…
Ross Ashby's information theory: a bit of history, some solutions to problems, and what we face today
This paper presents a personal history of one strand of W. Ross Ashby's many ideas: using information theory to analyse complex systems empirically, and how his idea of decomposing complex systems into smaller interactions reappears in one of the most complex technologies of the authors' time: cyberspace.
Physical nature of higher-order mutual information: intrinsic correlations and frustration
  • Matsuda
  • Computer Science
    Physical review. E, Statistical physics, plasmas, fluids, and related interdisciplinary topics
  • 2000
The higher-order mutual information provides an appropriate measure of the frustration effect and can either be positive or negative depending on the correlation among ensembles.
Introduction to Cybernetics.
Abstract: This book contains the collected and unified material necessary for the presentation of such branches of modern cybernetics as the theory of electronic digital computers, the theory of…
Information Theory and Network Coding
This book contains a thorough discussion of the classical topics in information theory together with the first comprehensive treatment of network coding, a subject that first emerged under information…
A Mathematical Theory of Communication
It is proved that a positive data rate can be achieved with arbitrarily small error probability, and that there is an upper bound on the data rate beyond which no encoding scheme attains arbitrarily small error probability.