• Corpus ID: 206781

Facticity as the amount of self-descriptive information in a data set

@article{Adriaans2012FacticityAT,
  title={Facticity as the amount of self-descriptive information in a data set},
  author={Pieter W. Adriaans},
  journal={ArXiv},
  year={2012},
  volume={abs/1203.2245}
}
  • P. Adriaans
  • Published 10 March 2012
  • Computer Science
  • ArXiv
Using the theory of Kolmogorov complexity, the notion of facticity φ(x) of a string is defined as the amount of self-descriptive information it contains. It is proved that (under reasonable assumptions: the existence of an empty machine and the availability of a faithful index) facticity is definite, i.e. random strings have facticity 0 and for compressible strings 0 < φ(x) < 1/2 |x| + O(1). Consequently facticity measures the tension in a data set between structural and ad-hoc… 
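Kolmogorov complexity itself is uncomputable, so any concrete illustration needs a computable stand-in. The sketch below is not the paper's formal construction: `zlib` serves as an illustrative proxy for K(x), and `structure_proxy` is a name introduced here. It uses the bytes saved by compression as a crude measure of a string's structural, self-descriptive information, mirroring the claim that random strings have facticity 0 while compressible strings carry more:

```python
import os
import zlib

def compressed_len(s: bytes) -> int:
    # zlib at maximum level: a computable upper-bound stand-in for K(x)
    return len(zlib.compress(s, 9))

def structure_proxy(s: bytes) -> int:
    # Bytes "saved" by compression: a rough stand-in for the structural
    # (self-descriptive) part of the string's information content.
    return max(0, len(s) - compressed_len(s))

random_bytes = os.urandom(4096)   # incompressible: proxy stays near 0
patterned = b"ab" * 2048          # highly regular: proxy close to len(s)

print(structure_proxy(random_bytes), structure_proxy(patterned))
```

On a random block the compressor saves essentially nothing (the proxy clamps at 0), while the patterned string compresses to a few dozen bytes, so nearly its whole length counts as structure.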


On Information Conservation and Algorithmic Complexity

It is proved that universal Turing machines do not conserve information in a strong sense, but the existence of at least one information-conserving machine U is conjectured, and an Information-Conserving Algorithmic Complexity K∗, defined on a kernel space of random strings, is introduced.

Two Problems for Sophistication

This work describes two fundamental problems with existing Kolmogorov-complexity-based proposals for sophistication: it may be impossible to objectively quantify sophistication, and many of the proposals are shown to be unsound.

Organized Complexity: is Big History a Big Computation?

It is argued in this paper that organized complexity is a valid and useful way to make sense of big history and has a rigorous formal definition in theoretical computer science that hints at a broader research program to quantify complexity in the universe.

Effective Complexity: In Which Sense is It Informative?

This work responds to a criticism of effective complexity made by James McAllister, according to which such a notion is not an appropriate measure for information content and argues that effective complexity is an interesting epistemological concept that may be applied to better understand crucial issues related to context dependence.

Notes on ensembles as a model of theory choice

This paper examines the foundations of the concept of effective complexity proposed by Murray Gell-Mann and Seth Lloyd using the methods of Bayesian inference. Given a data string x ∈ {0, 1}…

A Safe Approximation for Kolmogorov Complexity

The Kolmogorov complexity method is applied to the normalized information distance (NID) and conditions that affect the safety of the approximation are discussed.

Maximum entropy economics

A coherent statistical methodology is necessary for analyzing and understanding complex economic systems characterized by large degrees of freedom with non-trivial patterns of interaction and…

A Safe Approximation for Kolmogorov Complexity (UvA-DARE repository version)

This work constructs a computable function κ to approximate K in a probabilistic sense: the probability that the error is greater than k decays exponentially with k, and applies this method to the normalized information distance (NID).
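The NID itself is uncomputable, and the cited construction gives a probabilistic guarantee on the approximation error. A much cruder but common computable stand-in is the normalized compression distance (NCD), sketched here with `zlib` as the compressor; the function names are introduced for illustration and are not from the paper:

```python
import zlib

def clen(b: bytes) -> int:
    # Compressed length as a computable proxy for Kolmogorov complexity
    return len(zlib.compress(b, 9))

def ncd(x: bytes, y: bytes) -> float:
    # Normalized Compression Distance:
    # (C(xy) - min(C(x), C(y))) / max(C(x), C(y))
    cx, cy, cxy = clen(x), clen(y), clen(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"the quick brown fox jumps over the lazy dog " * 50
b = a.replace(b"fox", b"cat")   # near-duplicate of a
c = bytes(range(256)) * 8       # structurally unrelated data

print(ncd(a, b) < ncd(a, c))    # similar strings score a smaller distance
```

Note that zlib's 32 KB window makes this proxy unreliable for inputs much larger than that; the compressor can then no longer "see" shared structure across the concatenation.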

Between Order and Chaos: The Quest for Meaningful Information

  • P. Adriaans
  • Computer Science
    Theory of Computing Systems
  • 2009
A theoretical framework that can be seen as a first approximation to a study of meaningful information is developed and it is proved that, under adequate measurement conditions, the free energy of a system in the world is associated with the randomness deficiency of a data set with observations about this system.

Meaningful Information

  • P. Vitányi
  • Computer Science, Mathematics
    IEEE Transactions on Information Theory
  • 2006
The theory of recursive function statistics is developed: the maximum and minimum value, the existence of absolutely nonstochastic objects (which have maximal sophistication: all the information in them is meaningful and there is no residual randomness), the relation to the halting problem, and further algorithmic properties.

Kolmogorov's structure functions and model selection

The goodness-of-fit of an individual model with respect to individual data is precisely quantified, and it is shown that, within the obvious constraints, every graph is realized by the structure function of some data.

Using MDL for Grammar Induction

It is proved that, in DFA induction, already as a result of a single deterministic merge of two nodes, divergence of randomness deficiency and MDL code can occur, which shows why the applications of MDL to grammar induction so far have been disappointing.

Effective Complexity as a Measure of Information Content

This paper argues that the effective complexity of a given string is not uniquely determined, and the concept of effective complexity is unsuitable as a measure of information content.

Logical depth and physical complexity

This work defines an object’s “logical depth” as the time required by a standard universal Turing machine to generate it from an input that is algorithmically random (i.e. Martin-Löf random), and applies depth to the physical problem of “self-organization.”

An Introduction to Kolmogorov Complexity and Its Applications

The book presents a thorough treatment of the central ideas and their applications of Kolmogorov complexity with a wide range of illustrative applications, and will be ideal for advanced undergraduate students, graduate students, and researchers in computer science, mathematics, cognitive sciences, philosophy, artificial intelligence, statistics, and physics.

Effective Complexity

The effective complexity (EC) of an entity is defined as the length of a highly compressed description of its regularities, and a formal approach is needed both to the notion of minimum description length and to the distinction between regularities and those features that are treated as random or incidental.