# Causal deconvolution by algorithmic generative models

@article{Zenil2019CausalDB, title={Causal deconvolution by algorithmic generative models}, author={Hector Zenil and Narsis Aftab Kiani and Allan A. Zea and Jesper N. Tegner}, journal={Nature Machine Intelligence}, year={2019}, volume={1}, pages={58-66} }

Complex behaviour emerges from interactions between objects produced by different generating mechanisms. Yet to decode their causal origin(s) from observations remains one of the most fundamental challenges in science. Here we introduce a universal, unsupervised and parameter-free model-oriented approach, based on the seminal concept and the first principles of algorithmic probability, to decompose an observation into its most likely algorithmic generative models. Our approach uses a…
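The idea of decomposing an observation by the likely generative model of each part can be illustrated with a toy sketch (this is an illustration only, not the authors' method): compressed length stands in as a crude, computable proxy for algorithmic complexity, and a block-wise complexity profile separates a repetitive segment from a pseudo-random one. The block size and threshold are arbitrary choices for the demo.

```python
import zlib

def block_complexity(s: bytes, block: int = 32):
    """Compressed length per block: a crude, computable proxy for
    local algorithmic complexity (zlib stands in for a real estimator)."""
    return [len(zlib.compress(s[i:i + block], 9)) for i in range(0, len(s), block)]

# A regular segment followed by a pseudo-random one: two distinct "mechanisms"
x = b"ab" * 96 + bytes([(i * 97 + 13) % 251 for i in range(192)])

profile = block_complexity(x)
threshold = sum(profile) / len(profile)
labels = [0 if v < threshold else 1 for v in profile]  # 0 = simple mechanism, 1 = complex
print(labels)
```

Blocks drawn from the repetitive mechanism compress far below the mean, so a simple threshold cleanly separates the two sources in this toy case; the paper's approach replaces the compressor with estimates grounded in algorithmic probability.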

## 58 Citations

### An Algorithmic Information Calculus for Causal Discovery and Reprogramming Systems

- Computer Science · bioRxiv
- 2018

### DisCo: Physics-Based Unsupervised Discovery of Coherent Structures in Spatiotemporal Systems

- Computer Science · 2019 IEEE/ACM Workshop on Machine Learning in High Performance Computing Environments (MLHPC)
- 2019

The efficacy of DisCo is demonstrated in capturing physically meaningful coherent structures from observational and simulated scientific data; among several firsts, it is the first application software developed entirely in Python to scale to over 1,000 machine nodes, providing good performance while ensuring domain scientists' productivity.

### Algorithmic Information Dynamics

- Computer Science · Scholarpedia
- 2020

Algorithmic Information Dynamics (AID) is an algorithmic probabilistic framework for causal discovery and causal analysis. It enables a numerical solution to inverse problems based on or motivated by…

### A deeper look into natural sciences with physics-based and data-driven measures

- Computer Science · iScience
- 2021

### Predicting phenotype transition probabilities via conditional algorithmic probability approximations

- Computer Science · bioRxiv
- 2022

A bound on phenotype transition probabilities is derived, which may facilitate predicting transition probabilities directly from examining the phenotypes themselves, without detailed knowledge of the GP map; it is tested by predicting phenotype transition probabilities in simulations of RNA and protein secondary structures.

### Towards Demystifying Shannon Entropy, Lossless Compression and Approaches to Statistical Machine Learning

- Computer Science · Proceedings
- 2020

It is proposed that a fundamental question in science, how to accelerate the adoption of proven mathematical tools, can be answered by shortening the adoption cycle and leaving behind old practices in favour of new ones.

### The Thermodynamics of Network Coding, and an Algorithmic Refinement of the Principle of Maximum Entropy †

- Computer Science · Entropy
- 2019

The analysis provides the insight that the reprogrammability asymmetry appears to originate from a non-monotonic relationship to algorithmic probability, which motivates further analysis of the origin and consequences of these asymmetries, reprogrammability, and computation.

### A Parsimonious Granger Causality Formulation for Capturing Arbitrarily Long Multivariate Associations

- Computer Science · Entropy
- 2019

A generalization of autoregressive models for GC estimation based on Wiener–Volterra decompositions with Laguerre polynomials as basis functions is presented, showing that it is able to reproduce current knowledge as well as to uncover previously unknown directed influences between cortical and limbic brain regions.

### Algorithmic Probability-Guided Machine Learning on Non-Differentiable Spaces

- Computer Science, Mathematics · Frontiers in Artificial Intelligence
- 2020

It is found that machine learning can successfully be performed on a non-smooth surface using algorithmic complexity, and that solutions can be found using an algorithmic-probability classifier, establishing a bridge between a fundamentally discrete theory of computability and a fundamentally continuous mathematical theory of optimization methods.

### Towards Unsupervised Segmentation of Extreme Weather Events

- Environmental Science, Computer Science · arXiv
- 2019

A scalable physics-based representation learning method is presented that decomposes spatiotemporal systems into their structurally relevant components, captured by latent variables known as local causal states.

## References

Showing 1–10 of 74 references

### An Algorithmic Information Calculus for Causal Discovery and Reprogramming Systems

- Computer Science · bioRxiv
- 2018

### Bayesian Structural Inference for Hidden Processes

- Computer Science · Physical Review E, Statistical, Nonlinear, and Soft Matter Physics
- 2014

BSI is applied to in-class examples of finite- and infinite-order Markov processes, as well as to an out-of-class, infinite-state hidden process, and it is shown that the former more accurately reflects uncertainty in estimated values.

### Coding-theorem like behaviour and emergence of the universal distribution from resource-bounded algorithmic probability

- Computer Science · Int. J. Parallel Emergent Distributed Syst.
- 2019

It is shown that up to 60% of the simplicity/complexity bias in distributions produced even by the weakest of the computational models can be accounted for by Algorithmic Probability in its approximation to the Universal Distribution.
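This coding-theorem-like behaviour can be illustrated with a toy enumeration (an illustration only, not the paper's setup): run every one of the 256 elementary cellular-automaton rules as a stand-in for enumerating small programs, count how often each output appears, and estimate complexity as K(x) ≈ −log₂ m(x), where m(x) is the output's frequency. Frequent outputs (e.g. the all-zero state, produced by many rules) receive low complexity estimates.

```python
import math
from collections import Counter

def eca_step(state, rule):
    """One step of an elementary cellular automaton with periodic boundaries."""
    n = len(state)
    return tuple(
        (rule >> ((state[(i - 1) % n] << 2) | (state[i] << 1) | state[(i + 1) % n])) & 1
        for i in range(n)
    )

width, steps = 11, 5
init = tuple(1 if i == width // 2 else 0 for i in range(width))

# Run every rule (a crude stand-in for enumerating small programs) and count outputs
counts = Counter()
for rule in range(256):
    s = init
    for _ in range(steps):
        s = eca_step(s, rule)
    counts[s] += 1

# Coding-theorem-like estimate: K(x) ≈ -log2 m(x), with m(x) the output frequency
for out, c in counts.most_common(3):
    print("".join(map(str, out)), c, round(-math.log2(c / 256), 2))
```

The most frequent outputs are visibly simple patterns, mirroring the simplicity bias the reference reports for resource-bounded approximations to the universal distribution.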

### Two-dimensional Kolmogorov complexity and an empirical validation of the Coding theorem method by compressibility

- Computer Science · PeerJ Comput. Sci.
- 2015

Experiments are presented, showing that the measure is stable in the face of some changes in computational formalism and that results are in agreement with the results obtained using lossless compression algorithms when both methods overlap in their range of applicability.

### Two-Dimensional Kolmogorov Complexity and Validation of the Coding Theorem Method by Compressibility

- Computer Science · arXiv
- 2012

Experiments are presented, showing that the measure is stable in the face of some changes in computational formalism and that results are in agreement with the results obtained using lossless compression algorithms when both methods overlap in their range of applicability.

### The Speed Prior: A New Simplicity Measure Yielding Near-Optimal Computable Predictions

- Mathematics, Computer Science · COLT
- 2002

This work replaces Solomonoff's optimal but noncomputable method for inductive inference with the novel Speed Prior S, under which the cumulative a priori probability of all data whose computation through an optimal algorithm requires more than O(n) resources is 1/n.

### Computational Mechanics: Pattern and Prediction, Structure and Simplicity

- Computer Science · arXiv
- 1999

It is shown that the causal-state representation, an ε-machine, is the minimal one consistent with accurate prediction, and several results are established on ε-machine optimality and uniqueness and on how ε-machines compare to alternative representations.

### Nonnegative Decomposition of Multivariate Information

- Computer Science · arXiv
- 2010

This work reconsiders from first principles the general structure of the information that a set of sources provides about a given variable and proposes a definition of partial information atoms that exhaustively decompose the Shannon information in a multivariate system in terms of the redundancy between synergies of subsets of the sources.

### Investigating causal relations by econometric models and cross-spectral methods

- Mathematics
- 1969

There occurs on some occasions a difficulty in deciding the direction of causality between two related variables and also whether or not feedback is occurring. Testable definitions of causality and…

### A Decomposition Method for Global Evaluation of Shannon Entropy and Local Estimations of Algorithmic Complexity

- Computer Science · Entropy
- 2018

It is shown that the method provides efficient estimations of algorithmic complexity but performs like Shannon entropy when it loses accuracy, and that the measure may be adapted for multi-dimensional objects beyond strings, such as arrays and tensors.
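The decomposition idea behind this reference, often called the Block Decomposition Method, can be sketched in a few lines (a sketch only: the real method sums precomputed CTM complexity values per block, here replaced by a hypothetical `toy_ctm` stand-in):

```python
import math
from collections import Counter

def bdm(string: str, block: int, local_complexity):
    """Block Decomposition Method sketch: partition the string into blocks and
    sum a local complexity estimate for each distinct block plus log2 of how
    many times that block occurs."""
    blocks = [string[i:i + block] for i in range(0, len(string), block)]
    return sum(local_complexity(b) + math.log2(n)
               for b, n in Counter(blocks).items())

# Toy stand-in for precomputed CTM values: number of distinct symbols in a block
toy_ctm = lambda b: float(len(set(b)))

print(bdm("0101" * 4, 4, toy_ctm))  # one distinct block, repeated 4 times -> 4.0
```

Repeated blocks contribute only logarithmically after their first occurrence, which is what lets the global estimate exceed a plain per-block entropy sum in discriminative power.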