Effective Complexity and Its Relation to Logical Depth

@article{Ay2010EffectiveCA,
  title={Effective Complexity and Its Relation to Logical Depth},
  author={Nihat Ay and Markus M{\"u}ller and Arleta Szkola},
  journal={IEEE Transactions on Information Theory},
  year={2010},
  volume={56},
  pages={4593-4607}
}
Effective complexity measures the information content of the regularities of an object. It was introduced by Gell-Mann and Lloyd to avoid some of the disadvantages of Kolmogorov complexity. In this paper, we derive a precise definition of effective complexity in terms of algorithmic information theory. We rigorously analyze its basic properties, such as the effective simplicity of incompressible binary strings and the existence of strings whose effective complexity is close to their length. Since…
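
For orientation, the two notions in the title can be written out roughly as follows. This is a hedged sketch of the commonly used formulations (prefix Kolmogorov complexity K, Shannon entropy H, universal machine U, slack parameters δ, Δ and s), not a verbatim reproduction of the paper's definitions.

% Hedged sketch, assuming the standard formulations; E ranges over computable
% probability distributions ("ensembles") on finite binary strings.
\[
  \Sigma(E) = K(E) + H(E) \qquad \text{(total information of the ensemble $E$)}
\]
\[
  \mathcal{E}_{\delta,\Delta}(x) = \min\{\, K(E) \,:\, x \text{ is $\delta$-typical for } E,\; \Sigma(E) \le K(x) + \Delta \,\}
\]
% Bennett's logical depth at significance level s: the least running time of any
% program within s bits of a shortest program for x on the universal machine U.
\[
  \mathrm{depth}_s(x) = \min\{\, \mathrm{time}_U(p) \,:\, U(p) = x,\; |p| \le K(x) + s \,\}
\]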

Effective Complexity of Stationary Process Realizations
TLDR
This work investigates the effective complexity of binary strings generated by stationary, in general non-computable, processes and shows that, under not too strong conditions, long typical process realizations are effectively simple.
Notes on facticity and effective complexity
TLDR
The length of the shortest input string that generates the target string is referred to as the target string's Kolmogorov complexity, or algorithmic information content.
Relativity of Depth and Sophistication
TLDR
The measures are relativized to auxiliary information and re-compared to one another, showing that the ability of auxiliary information to solve the halting problem introduces a distortion between the measures.
Sophistication vs Logical Depth
TLDR
It is shown that the Busy Beaver function of the sophistication of a string exceeds its logical depth with logarithmically bigger precision, and that logical depth exceeds the Busy Beaver function of the sophistication with logarithmically bigger precision.
Organized Complexity: Is Big History a Big Computation?
TLDR
It is argued in this paper that organized complexity is a valid and useful way to make sense of big history and has a rigorous formal definition in theoretical computer science that hints at a broader research program to quantify complexity in the universe.
Towards a Universal Measure of Complexity
TLDR
It is shown that the most complex state is the optimally mixed state consisting of pure states, i.e., of the most regular and the most disordered states that the space of states of a given system allows.
Fields of Application of Information Geometry
Complexity measures can be geometrically built by using the information distance (Kullback–Leibler divergence) from families with restricted statistical dependencies. The Pythagorean geometry…
A geometric approach to complexity.
TLDR
A geometric approach to complexity is developed, based on the principle that complexity requires interactions at different scales of description; it presents a theory of complexity measures for finite random fields using the geometric framework of hierarchies of exponential families.
Effective Complexity: In Which Sense is It Informative?
TLDR
This work responds to a criticism of effective complexity made by James McAllister, according to which such a notion is not an appropriate measure of information content, and argues that effective complexity is an interesting epistemological concept that may be applied to better understand crucial issues related to context dependence.
Information, complexity, and dynamic depth
Why are computers so radically different from brains in terms of phenomenology? The difference is one of complexity, but not complexity in mere numbers of elements, interactions, or operations per unit time…

References

Showing 1-10 of 15 references
Effective Complexity of Stationary Process Realizations
TLDR
This work investigates the effective complexity of binary strings generated by stationary, in general non-computable, processes and shows that, under not too strong conditions, long typical process realizations are effectively simple.
Meaningful Information
  • P. Vitányi
  • Computer Science, Mathematics
    IEEE Transactions on Information Theory
  • 2006
TLDR
The theory of the recursive-function statistic, its maximum and minimum values, the existence of absolutely nonstochastic objects (which have maximal sophistication: all the information in them is meaningful and there is no residual randomness), and the relation to the halting problem and further algorithmic properties are developed.
Sophistication Revisited
TLDR
This work formalizes a connection between sophistication and a variation of computational depth, proves the existence of strings with maximum sophistication, and shows that they are the deepest of all strings.
Effective Complexity
TLDR
The effective complexity (EC) of an entity is defined as the length of a highly compressed description of its regularities, and a formal approach is needed both to the notion of minimum description length and to the distinction between regularities and those features that are treated as random or incidental.
Information measures, effective complexity, and total information
This article defines the concept of an information measure and shows how common information measures such as entropy, Shannon information, and algorithmic information content can be combined to solve…
Logical depth and physical complexity
Some mathematical and natural objects (a random sequence, a sequence of zeros, a perfect crystal, a gas) are intuitively trivial, while others (e.g. the human body, the digits of π) contain internal…
Measures of complexity: a nonexhaustive list
The world has grown more complex recently, and the number of ways of measuring complexity has grown even faster. This multiplication of measures has been taken by some to indicate confusion in the…
Algorithmic statistics
TLDR
The algorithmic theory of statistic, sufficient statistic, and minimal sufficient statistic is developed and it is shown that a function is a probabilistic sufficient statistic iff it is with high probability (in an appropriate sense) an algorithmic sufficient statistic.
Information and Complexity in Statistical Modeling
Summary form only. Inspired by Kolmogorov's structure function for finite sets as models of data in the algorithmic theory of information, we adapt the construct to families of probability models to…
Exploring RANDOMNESS
  • G. Chaitin
  • Computer Science
    Discrete Mathematics and Theoretical Computer Science
  • 2001