Complexity-based induction systems: Comparisons and convergence theorems

@article{Solomonoff1978ComplexitybasedIS,
  title={Complexity-based induction systems: Comparisons and convergence theorems},
  author={Ray J. Solomonoff},
  journal={IEEE Trans. Inf. Theory},
  year={1978},
  volume={24},
  pages={422--432}
}
  • R. Solomonoff
  • Published 1 July 1978
  • Computer Science
  • IEEE Trans. Inf. Theory
In 1964 the author proposed as an explication of {\em a priori} probability the probability measure induced on output strings by a universal Turing machine with unidirectional output tape and a randomly coded unidirectional input tape. Levin has shown that if $\tilde{P}'_M(x)$ is an unnormalized form of this measure, and $P(x)$ is any computable probability measure on strings $x$, then $\tilde{P}'_M(x) \geq C\,P(x)$, where $C$ is a constant independent of $x$. The corresponding result for the normalized form… 
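The dominance property stated in the abstract can be written out as follows; identifying the constant with the description length of $P$ is the standard formulation in the literature, not something asserted in this excerpt:

```latex
% Levin's dominance property: the unnormalized universal measure
% multiplicatively dominates every computable probability measure P.
\tilde{P}'_M(x) \;\geq\; C_P \, P(x) \qquad \text{for all strings } x,
% where C_P > 0 depends on P (standardly, C_P = 2^{-K(P)} up to a
% constant factor) but not on x.
```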
Hierarchies of Generalized Kolmogorov Complexities and Nonenumerable Universal Measures Computable in the Limit
  • J. Schmidhuber
  • Mathematics, Computer Science
    Int. J. Found. Comput. Sci.
  • 2002
TLDR
A natural hierarchy of generalizations of algorithmic probability and Kolmogorov complexity is obtained, suggesting that the "true" information content of some bitstring x is the size of the shortest nonhalting program that converges to x and nothing but x on a Turing machine that can edit its previous outputs.
The Semimeasure Property of Algorithmic Probability - "Feature" or "Bug"?
  • D. Campbell
  • Computer Science
    Algorithmic Probability and Friends
  • 2011
TLDR
This paper argues that the semimeasure property contributes substantially, in its own right, to the power of an algorithmic-probability-based theory of induction, and that normalization is unnecessary.
KOLMOGOROV'S CONTRIBUTIONS TO INFORMATION THEORY AND ALGORITHMIC COMPLEXITY
TLDR
If the authors let P_U(x) = Pr{U prints x} be the probability that a given computer U prints x when given a random program, it can be shown that log(1/P_U(x)) ≈ K(x) for all x, thus establishing a vital link between the "universal" probability measure P_U and the "universal" complexity K.
Coding-theorem like behaviour and emergence of the universal distribution from resource-bounded algorithmic probability
TLDR
It is shown that up to 60% of the simplicity/complexity bias in distributions produced even by the weakest of the computational models can be accounted for by Algorithmic Probability in its approximation to the Universal Distribution.
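The simplicity bias described above can be illustrated with a deliberately tiny toy machine; this is an illustrative setup of my own, not the authors' experiments. Every 8-bit "program" is run as an elementary cellular-automaton rule on a single-1 seed, all programs get equal weight, and the resulting output distribution concentrates on the simplest strings:

```python
from collections import Counter
from itertools import product

WIDTH, STEPS = 9, 9

def run_rule(rule_bits, width=WIDTH, steps=STEPS):
    # Interpret the 8 program bits as an elementary cellular-automaton
    # rule table, evolve a single-1 seed, return the final row as a string.
    tape = [0] * width
    tape[width // 2] = 1
    for _ in range(steps):
        tape = [rule_bits[4 * l + 2 * c + r]
                for l, c, r in zip([0] + tape[:-1], tape, tape[1:] + [0])]
    return "".join(map(str, tape))

# Every 8-bit "program" gets equal weight 2^-8, so an output's probability
# under this toy measure is proportional to how many programs produce it.
counts = Counter(run_rule(bits) for bits in product((0, 1), repeat=8))
ranked = counts.most_common()
print(ranked[:3])  # highest-probability outputs tend to be simple, e.g. all zeros
```

The qualitative point mirrors the coding-theorem-like behaviour: many programs collapse onto a few structurally simple outputs, so simple strings receive disproportionately high probability under the program-counting measure.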
The Speed Prior: A New Simplicity Measure Yielding Near-Optimal Computable Predictions
TLDR
This work replaces Solomonoff's optimal but noncomputable method for inductive inference with the novel Speed Prior S, under which the cumulative a priori probability of all data whose computation through an optimal algorithm requires more than O(n) resources is 1/n.
Inductive Reasoning and Kolmogorov Complexity
TLDR
The thesis is developed that Solomonoff's method is fundamental in the sense that many other induction principles can be viewed as particular ways to obtain computable approximations to it.
Predictions and algorithmic statistics for infinite sequence
TLDR
A new way of prediction is suggested: for every finite string $x$ the authors predict the next bit according to the best (in some sense) distribution for $x$; this method avoids a negative aspect of Solomonoff's method.
Merging with a set of probability measures: A characterization
TLDR
It is argued that the characterization result can be extended to the case of infinitely repeated games and has some interesting applications with regard to the impossibility result in Nachbar (1997, 2005).
The Kolmogorov Lecture: The Universal Distribution and Machine Learning
TLDR
This lecture discusses the Universal Distribution and some of its properties: its accuracy, its incomputability, its subjectivity, and how to use this distribution to create very intelligent machines.
Convergence and Error Bounds for Universal Prediction of Nonbinary Sequences
TLDR
It is shown that the average number of prediction errors made by the universal ξ scheme rapidly converges to that made by the informed µ scheme, and convergence of ξ to µ in a conditional mean squared sense and with µ probability 1 is proven.
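The convergence statement summarized here is usually derived from a total relative-entropy bound; the following is the standard form from the universal-prediction literature (my paraphrase, with $K(\mu)$ the prefix complexity of the environment $\mu$), not a quotation from this paper:

```latex
% Total expected KL divergence between the true predictor mu and the
% universal predictor xi is finite and bounded by the complexity of mu:
\sum_{t=1}^{\infty} \mathbf{E}\!\left[
  D\bigl(\mu(\cdot \mid x_{<t}) \,\|\, \xi(\cdot \mid x_{<t})\bigr)
\right] \;\leq\; K(\mu)\ln 2 .
% Since the left side is finite, the summands tend to 0, which yields
% convergence of xi to mu in mean square and with mu-probability 1.
```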
...

References

SHOWING 1-10 OF 21 REFERENCES
A new algorithm for the recursive identification of stochastic systems using an automaton with slowly growing memory
  • B. Kurtz, P. Caines
  • Mathematics, Computer Science
    1976 IEEE Conference on Decision and Control including the 15th Symposium on Adaptive Processes
  • 1976
TLDR
It is shown that the BED converges strongly to the correct parameter and that, for a very large class of systems, convergence of parameter estimates is impossible without access to an unbounded memory.
On Determining the Irrationality of the Mean of a Random Variable
A complexity approach is used to decide whether or not the mean of a sequence of independent identically distributed random variables lies in an arbitrary specified countable subset of the real line.
Logical basis for information theory and probability theory
TLDR
A new logical basis for information theory as well as probability theory is proposed, based on computing complexity.
Computational Complexity and Probability Constructions
TLDR
Using any universal Turing machine as a basis, it is possible to construct an infinite number of increasingly accurate computable probability measures which are independent of any probability assumptions.
Some inequalities between Shannon entropy and Kolmogorov, Chaitin, and extension complexities
  • Tech. Rep., Statistics Dept., Stanford Univ.
  • 1975
Foundations of the Theory of Probability
  • A. N. Kolmogorov
  • New York: Chelsea
  • 1950
The recursive identification of stochastic systems using an automaton with slowly growing memory
  • presented at IEEE Symp. Inform. Theory, Cornell Univ., Oct. 1977
  • 1977
THE COMPLEXITY OF FINITE OBJECTS AND THE DEVELOPMENT OF THE CONCEPTS OF INFORMATION AND RANDOMNESS BY MEANS OF THE THEORY OF ALGORITHMS
TLDR
The present article is a survey of the fundamental results connected with the concept of complexity as the minimum number of binary signs containing all the information about a given object that are sufficient for its recovery (decoding).
A Theory of Program Size Formally Identical to Information Theory
TLDR
A new definition of program-size complexity is made, which has precisely the formal properties of the entropy concept of information theory.
...