# Algorithmic Probability: Theory and Applications

@inproceedings{Solomonoff2009AlgorithmicPA,
title={Algorithmic Probability: Theory and Applications},
author={Ray J. Solomonoff},
year={2009}
}
We first define Algorithmic Probability, an extremely powerful method of inductive inference. We discuss its completeness, incomputability, diversity and subjectivity, and show that its incomputability in no way inhibits its use for practical prediction. Applications to Bernoulli sequence prediction and grammar discovery are described. We conclude with a note on its employment in a very strong AI system for very general problem solving.
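The abstract's central object can be made concrete with a toy sketch. Below, a hypothetical two-mode "machine" (chosen purely for illustration; not a real universal machine) is used to approximate M(x) = Σ_{p : U(p) extends x} 2^(−|p|) by brute-force enumeration over short programs, and prediction follows the ratio M(x1) / (M(x0) + M(x1)). All names here are assumptions of this sketch.

```python
from itertools import product

def toy_machine(program: str) -> str:
    """Toy stand-in for a universal machine (an illustrative assumption).
    First bit selects a mode: 0 -> emit the rest verbatim,
    1 -> emit the rest twice (a crude bias toward repetitive output)."""
    if not program:
        return ""
    head, body = program[0], program[1:]
    return body if head == "0" else body * 2

def algorithmic_probability(x: str, max_len: int = 12) -> float:
    """Approximate M(x): sum 2^(-|p|) over all programs p up to
    max_len bits whose output starts with x."""
    total = 0.0
    for n in range(1, max_len + 1):
        for bits in product("01", repeat=n):
            p = "".join(bits)
            if toy_machine(p).startswith(x):
                total += 2.0 ** (-len(p))
    return total

def predict_next(x: str) -> float:
    """Solomonoff-style prediction: P(next bit is 1 | seen x)."""
    m0 = algorithmic_probability(x + "0")
    m1 = algorithmic_probability(x + "1")
    return m1 / (m0 + m1)
```

The enumeration only terminates because the toy machine always halts; for a genuine universal machine the sum is incomputable, which is exactly the abstract's point that approximation, not exact computation, is what practical prediction relies on.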
## 36 Citations
Diverse Consequences of Algorithmic Probability
• Eray Özkural
• Computer Science
Algorithmic Probability and Friends
• 2011
It is proposed that Solomonoff has effectively axiomatized the field of artificial intelligence, therefore establishing it as a rigorous scientific discipline.
Towards Heuristic Algorithmic Memory
Four synergistic update algorithms that use a Stochastic Context-Free Grammar as a guiding probability distribution of programs are introduced that accomplish adjusting production probabilities, re-using previous solutions, learning programming idioms and discovery of frequent subprograms.
Solomonoff Prediction and Occam’s Razor
It is sometimes suggested that Solomonoff prediction, or algorithmic information theory in a predictive setting, can deliver an argument to justify Occam's razor; this work explains why it has no such justificatory force.
Gigamachine: incremental machine learning on desktop computers
A Levin Search variant based on a stochastic context-free grammar together with new update algorithms that use the same grammar as a guiding probability distribution for incremental machine learning are introduced.
An Application of Stochastic Context Sensitive Grammar Induction to Transfer Learning
We generalize Solomonoff’s stochastic context-free grammar induction method to context-sensitive grammars, and apply it to the transfer learning problem by means of an efficient update algorithm.
Low complexity, low probability patterns and consequences for algorithmic probability applications
• Computer Science
ArXiv
• 2022
This work examines some applications of algorithmic probability and discusses some implications of low complexity, low probability patterns for several research areas including simplicity in physics and biology, a priori probability predictions, Solomonoff induction and Occam's razor, machine learning, and password guessing.
Parsimonious Inference
• Computer Science
ArXiv
• 2021
The approaches combine efficient encodings with prudent sampling strategies to construct predictive ensembles without cross-validation, thus addressing a fundamental challenge in how to efficiently obtain predictions from data.
Stochastic Grammar Based Incremental Machine Learning Using Scheme
• Computer Science
AGI 2010
• 2010
Introduction: Gigamachine is our initial implementation of an Artificial General Intelligence (AGI) system in the O’Caml language, with the goal of building Solomonoff’s “Phase 1 machine” that he proposed.
Teraflop-scale Incremental Machine Learning
A Levin Search variant based on a Stochastic Context-Free Grammar together with four synergistic update algorithms that use the same grammar as a guiding probability distribution of programs are introduced.
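Several of the citing works above share one mechanism: a stochastic context-free grammar used as a guiding probability distribution over candidate programs, with production probabilities updated as solutions are found. A minimal sketch, assuming a toy expression grammar (the grammar, its probabilities, and the reweighting rule are illustrative, not taken from the papers):

```python
import random

# Toy SCFG: nonterminal -> list of (probability, expansion) pairs.
# Expansions mix terminals and nonterminals; all values here are assumptions.
GRAMMAR = {
    "E": [(0.5, ("x",)),
          (0.3, ("(", "E", "+", "E", ")")),
          (0.2, ("(", "E", "*", "E", ")"))],
}

def sample(grammar, symbol="E", rng=None, depth=0, max_depth=8):
    """Sample a program string by expanding nonterminals according to
    their production probabilities; deepest levels are forced terminal."""
    rng = rng or random.Random()
    if symbol not in grammar:
        return symbol                          # terminal symbol
    rules = grammar[symbol]
    if depth >= max_depth:
        _, expansion = rules[0]                # rules[0] must terminate
    else:
        r, acc, expansion = rng.random(), 0.0, rules[-1][1]
        for prob, exp in rules:
            acc += prob
            if r <= acc:
                expansion = exp
                break
    return "".join(sample(grammar, s, rng, depth + 1, max_depth)
                   for s in expansion)

def reweight(rules, counts, alpha=0.5):
    """Blend old production probabilities with the empirical frequencies of
    productions used in successful programs -- a minimal stand-in for the
    'adjusting production probabilities' update described above."""
    total = sum(counts) or 1
    mixed = [(1 - alpha) * p + alpha * c / total
             for (p, _), c in zip(rules, counts)]
    z = sum(mixed)
    return [(m / z, exp) for m, (_, exp) in zip(mixed, rules)]
```

Sampling many programs and reweighting toward productions that appear in solutions shifts the search distribution, which is the essential feedback loop behind the incremental-learning variants listed here.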

## References

Showing 1–10 of 24 references
Precise N-Gram Probabilities From Stochastic Context-Free Grammars
• Computer Science
ACL
• 1994
An algorithm for computing n-gram probabilities from stochastic context-free grammars, which operates via the computation of substring expectations, which is accomplished by solving systems of linear equations derived from the grammar.
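The key idea of this reference — substring expectations obtained by solving linear equations derived from the grammar — can be seen on a one-nonterminal toy grammar (the grammar and the probability p are assumptions for illustration; the actual algorithm handles general SCFGs and full n-grams):

```python
import random

# Toy SCFG (illustrative): S -> 'a' S with prob p, S -> 'b' with prob 1-p.
# Expected terminal counts per derivation obey linear equations derived
# directly from the productions: E[#a] = p * (1 + E[#a]), E[#b] = 1.
p = 0.6
E_a = p / (1 - p)        # closed-form solution of the linear equation above
E_b = 1.0
P_a = E_a / (E_a + E_b)  # unigram probability of 'a' from the expectations

# Monte Carlo sanity check against direct sampling from the grammar.
rng = random.Random(0)

def derive():
    s = ""
    while rng.random() < p:   # apply S -> 'a' S
        s += "a"
    return s + "b"            # apply S -> 'b'

strings = [derive() for _ in range(20000)]
empirical = sum(s.count("a") for s in strings) / sum(len(s) for s in strings)
```

For larger grammars the same expectations form a genuine linear system (one equation per nonterminal), solved with standard linear algebra rather than in closed form.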
A SYSTEM FOR INCREMENTAL LEARNING BASED ON ALGORITHMIC PROBABILITY
Algorithmic Probability Theory is employed to construct a system for machine learning of great power and generality, and sequences of training problems are designed for it.
A PRELIMINARY REPORT ON A GENERAL THEORY OF INDUCTIVE INFERENCE
Some preliminary work is presented on a very general new theory of inductive inference. The extrapolation of an ordered sequence of symbols is implemented by computing the a priori probabilities of its possible continuations.
Complexity-based induction systems: Comparisons and convergence theorems
Levin has shown that if $\tilde{P}'_M(x)$ is an unnormalized form of this measure, and $P(x)$ is any computable probability measure on strings $x$, then $\tilde{P}'_M(x) \geq C\,P(x)$, where $C$ is a constant independent of $x$.
A SYSTEM FOR INCREMENTAL LEARNING BASED ON ALGORITHMIC PROBABILITY
The use of training sequences of problems for machine knowledge acquisition promises to yield Expert Systems that will be easier to train and free of the brittleness that characterizes the narrow specialization of present day systems of this sort.
Three Kinds of Probabilistic Induction: Universal Distributions and Convergence Theorems
Three kinds of probabilistic induction problems are described, and general solutions for each are given, with associated convergence theorems that show they tend to give good probability estimates.
Inducing Probabilistic Grammars by Bayesian Model Merging
• Computer Science
ICGI
• 1994
A framework for inducing probabilistic grammars from corpora of positive samples is described, which formalizes a trade-off between a close fit to the data and a default preference for simpler models (‘Occam's Razor’).
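The trade-off this entry formalizes is naturally expressed as a two-part code: bits to describe the grammar plus bits to describe the data given the grammar. The numbers below are hypothetical, chosen only to show how a smaller model can win despite a slightly worse fit:

```python
import math

def description_length(model_bits: float, log_likelihood: float) -> float:
    """Two-part code length in bits: model cost plus data cost given the
    model (negative log-likelihood converted from nats to bits).
    Smaller is better: 'Occam's Razor' as an explicit trade-off."""
    return model_bits + (-log_likelihood) / math.log(2)

# Hypothetical grammars: 'big' fits the corpus better but costs far more
# bits to state; 'small' fits slightly worse but is much cheaper overall.
big_grammar   = description_length(model_bits=500.0, log_likelihood=-40.0)
small_grammar = description_length(model_bits=60.0,  log_likelihood=-55.0)
```

Roughly speaking, model merging greedily merges grammar states whenever the merge lowers this combined cost, which is how the default preference for simpler models enters the induction.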
The Discovery of Algorithmic Probability
This paper will describe a voyage of discovery — the discovery of Algorithmic Probability, the result of “goal motivated discovery” — like the discovery of the double helix in biology, but with fewer people involved and relatively little political skullduggery.
Progress In Incremental Machine Learning
This work describes recent developments in a system for machine learning that the author has been working on for some time; it is meant to be a “Scientist’s Assistant” of great power and versatility in many areas of science and mathematics.