Causal Inference Using the Algorithmic Markov Condition

@article{Janzing2010CausalIU,
  title={Causal Inference Using the Algorithmic Markov Condition},
  author={Dominik Janzing and Bernhard Sch{\"o}lkopf},
  journal={IEEE Transactions on Information Theory},
  year={2010},
  volume={56},
  pages={5168-5194}
}
  • D. Janzing, B. Schölkopf
  • Published 23 April 2008
  • Mathematics, Computer Science
  • IEEE Transactions on Information Theory
Inferring the causal structure that links n observables is usually based upon detecting statistical dependences and choosing simple graphs that make the joint measure Markovian. Here we argue why causal inference is also possible when the sample size is one. We develop a theory of how to generate causal graphs explaining similarities between single objects. To this end, we replace the notion of conditional stochastic independence in the causal Markov condition with the vanishing of conditional algorithmic mutual information.
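As a brief sketch of the paper's central postulate (notation recalled from the paper and stated informally here): for strings x_1, ..., x_n attached to the nodes of a causal DAG, every node must be algorithmically independent of its non-descendants nd_j, given (a shortest program for) its parents pa_j,

    I(x_j : nd_j \mid pa_j^{*}) \;\stackrel{+}{=}\; 0 \qquad \text{for all } j,

where \stackrel{+}{=} denotes equality up to an additive constant and the conditional algorithmic mutual information can be taken as I(x : y \mid z) := K(x \mid z) + K(y \mid z) - K(x, y \mid z), with K the prefix Kolmogorov complexity.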

Citations

Replacing Causal Faithfulness with Algorithmic Independence of Conditionals
This paper compares the postulate of algorithmic independence of conditionals (IC) with causal faithfulness (FF), which states that only those conditional independences hold that are implied by the causal Markov condition.
Restricted structural equation models for causal inference
Causal inference addresses the following problem: given i.i.d. data from a joint distribution, one tries to infer the underlying causal DAG (directed acyclic graph), in which each node …
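For illustration, one standard example of such a restricted model class (an assumption on my part, not necessarily the exact class treated in this work) is the bivariate additive-noise model

    Y = f(X) + N_Y, \qquad N_Y \perp\!\!\!\perp X,

which is generically identifiable: apart from special cases such as linear f with Gaussian noise, no additive-noise model with independent noise exists in the backward direction, so the causal direction can be read off the data.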
Causal Markov Condition for Submodular Information Measures
This work formulates a generalized causal Markov condition (CMC) for any kind of observations on which independence is defined via an arbitrary submodular information measure, and shows that this CMC is justified if one restricts attention to a class of causal mechanisms that are adapted to the respective information measure.
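For reference, an information measure R on sets of observations is submodular when

    R(A) + R(B) \;\geq\; R(A \cup B) + R(A \cap B) \qquad \text{for all } A, B,

and a conditional independence of A and B given C can then be declared when the induced quantity R(A \cup C) + R(B \cup C) - R(A \cup B \cup C) - R(C) vanishes (a sketch of the general construction; the paper's exact definitions may differ).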
Novel Methods of Causal Inference and their impact for standard prediction tasks
  • 2012
Causal inference is usually concerned with exploring causal relations among random variables X_1, ..., X_n after observing sufficiently many samples drawn from the joint probability distribution.
Causal Discovery Beyond Conditional Independences
This thesis addresses the problem of causal discovery, that is, recovering the underlying causal structure based on the joint probability distribution of the observed random variables, and estimates a set of Markov-equivalent graphs.
Distinguishing Cause and Effect via Second Order Exponential Models
We propose a method to infer causal structures containing both discrete and continuous variables. The idea is to select causal hypotheses for which the conditional density of every variable, given …
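As a rough sketch of what a second-order exponential model for such a conditional typically looks like (my paraphrase, not a quote from the paper): the log-density of each variable given its direct causes is a polynomial of degree at most two, e.g. in the bivariate continuous case

    \log p(y \mid x) \;=\; \sum_{i + j \le 2} a_{ij}\, x^{i} y^{j} \;-\; \log Z(x),

with Z(x) the normalizing constant.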
Identifiability of Causal Graphs using Functional Models
A main theorem is proved showing that if the data-generating process belongs to an IFMOC, the complete causal graph can be identified; this is the first identifiability result of its kind that is not limited to linear functional relationships.
Telling cause from effect by local and global regression
The linear-time Slope and Sloper algorithms are introduced; a thorough empirical evaluation shows that they outperform the state of the art by a wide margin.
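The decision rule underlying this line of MDL-based cause-effect inference (a generic sketch, not the exact Slope scoring function) compares two-part description lengths in both directions and infers

    X \to Y \quad \text{if} \quad L(X) + L(Y \mid X) \;<\; L(Y) + L(X \mid Y),

where L(Y \mid X) counts the bits needed to describe a regression model from X to Y together with its residuals.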
Inferring latent structures via information inequalities
An information-theoretic approach is proposed, based on the insight that the conditions a Bayesian network imposes on entropies take the form of simple linear inequalities, and an algorithm for deriving entropic tests for latent structures is described.
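A simple example of such a linear entropic condition is the non-negativity of conditional mutual information (equivalently, the submodularity of Shannon entropy):

    I(X ; Y \mid Z) \;=\; H(X, Z) + H(Y, Z) - H(X, Y, Z) - H(Z) \;\geq\; 0;

latent-variable hypotheses then constrain which further inequalities of this linear form the entropies of the observed variables must satisfy.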
Discovering Fully Oriented Causal Networks
The Globe algorithm, which greedily adds, removes, and orients edges so as to minimize the overall cost, is introduced; experiments show that Globe performs very well in practice, beating the state of the art by a clear margin. A generic sketch of such a greedy search follows below.
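As a hedged illustration of the kind of greedy score-based search described above (a minimal sketch assuming a user-supplied cost function, e.g. an MDL score over each node's regression on its parents; the helper names greedy_dag_search and creates_cycle are mine, and this is not the actual Globe algorithm or its score):

    # Greedy hill-climbing over DAGs: repeatedly apply the single edge
    # addition, removal, or reversal that most decreases cost(edges).
    from itertools import permutations

    def creates_cycle(edges, u, v):
        """True if adding the edge u -> v to `edges` would create a cycle."""
        stack, seen = [v], set()
        while stack:
            node = stack.pop()
            if node == u:
                return True
            if node not in seen:
                seen.add(node)
                stack.extend(w for (p, w) in edges if p == node)
        return False

    def greedy_dag_search(nodes, cost):
        """cost maps a set of directed edges to a float; lower is better."""
        edges, best = set(), cost(set())
        improved = True
        while improved:
            improved = False
            candidates = []
            for u, v in permutations(nodes, 2):
                if (u, v) in edges:
                    removed = edges - {(u, v)}
                    candidates.append(removed)                    # remove u -> v
                    if not creates_cycle(removed, v, u):
                        candidates.append(removed | {(v, u)})     # reverse to v -> u
                elif not creates_cycle(edges, u, v):
                    candidates.append(edges | {(u, v)})           # add u -> v
            for cand in candidates:
                c = cost(cand)
                if c < best:
                    best, edges, improved = c, cand, True
        return edges

For example, cost could return the summed codelength of fitting each node by a regression on its parents plus the bits needed to encode the regression models; the search then stops at a local optimum of that score.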

References

Showing 1-10 of 93 references
Inference of Graphical Causal Models: Representing the Meaningful Information of Probability Distributions
It is argued that if the shortest description of the joint distribution is given by separate descriptions of the conditional distributions for each variable given its direct causes, the decomposition given by that DAG should be considered the top-ranked causal hypothesis.
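Concretely, the ranking criterion amounts to preferring the DAG G for which the factorization into Markov kernels yields the shortest total description (a sketch in algorithmic-information notation, not a quote from the paper):

    K\big(P(X_1, \ldots, X_n)\big) \;\stackrel{+}{=}\; \sum_{j=1}^{n} K\big(P(X_j \mid PA_j^{G})\big),

i.e. describing the conditionals of each variable given its direct causes separately wastes no bits relative to describing the joint distribution directly.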
Causal Inference by Choosing Graphs with Most Plausible Markov Kernels
This paper discusses the inference rule for causal relationships between two variables in detail, applies it to a real-world temperature data set with known causality, and shows that the method provides a correct result for the example.
Causation, Prediction, and Search
What assumptions and methods allow us to turn observations into causal knowledge, and how can even incomplete causal knowledge be used in planning and prediction to influence and control our …
Causal reasoning by evaluating the complexity of conditional densities with kernel methods
A method is proposed that quantifies the complexity of a conditional probability measure by a reproducing kernel Hilbert space (RKHS) seminorm of the logarithm of its density.
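In symbols (my paraphrase of the summary above), the complexity of a conditional P(Y \mid X) with density p(y \mid x) is measured as

    C\big(P(Y \mid X)\big) \;:=\; \big\| \log p(y \mid x) \big\|_{\mathcal{H}},

with \|\cdot\|_{\mathcal{H}} a seminorm of the chosen RKHS; causal hypotheses whose conditionals of effects given causes have small total complexity are then preferred.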
A Bayesian Approach to Causal Discovery
We examine the Bayesian approach to the discovery of causal DAG models and compare it to the constraint-based approach. Both approaches rely on the Causal Markov condition, but the two differ …
On causally asymmetric versions of Occam's Razor and their relation to thermodynamics
In real-life statistical data, it seems that conditional probabilities of effects given their causes tend to be less complex and smoother than those of causes given their effects. …
Causal Models as Minimal Descriptions of Multivariate Systems
By applying the minimality principle for model selection, one should seek the model that describes the data by a code of minimal length. Learning is viewed as data compression that exploits the …
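The minimality principle referred to here is the classical two-part minimum description length (MDL) criterion: among candidate models M for data D, choose

    M^{*} \;=\; \arg\min_{M} \big[ L(M) + L(D \mid M) \big],

the model that minimizes the codelength of the model itself plus the codelength of the data encoded with the model's help.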
Algorithmic statistics
The algorithmic theory of statistics, sufficient statistics, and minimal sufficient statistics is developed, and it is shown that a function is a probabilistic sufficient statistic iff it is, with high probability (in an appropriate sense), an algorithmic sufficient statistic.
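For orientation, a standard definition from this theory (stated up to logarithmic precision): a finite set A containing a string x is an algorithmic sufficient statistic for x when the two-part description via A is as short as the shortest one-part description,

    K(A) + \log_2 |A| \;\stackrel{+}{=}\; K(x),

in which case x is also a typical element of A, i.e. K(x \mid A) \stackrel{+}{=} \log_2 |A|.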
Causal Inference Using Nonnormality
Path analysis, often applied to observational data to study causal structures, describes causal relationships between observed variables. Path analysis is confirmatory in nature and can make …
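The model class behind such nonnormality-based identification (stated as an assumption about this line of work, in the spirit of LiNGAM) is the linear structural equation model with mutually independent, non-Gaussian errors,

    x \;=\; B x + e,

for which the connection matrix B, and hence the causal ordering, is identifiable from observational data alone, unlike in the Gaussian case.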
On Universal Prediction and Bayesian Confirmation
  • Marcus Hutter
  • Computer Science, Mathematics
  • Theor. Comput. Sci.
  • 2007
It is shown that Solomonoff's model possesses many desirable properties: strong total and future bounds, weak instantaneous limits, and, in contrast to most classical continuous prior densities, no zero p(oste)rior problem.
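The model in question is Solomonoff's universal prior, which assigns to a finite sequence x the probability that a universal monotone machine U, fed uniformly random bits, produces an output starting with x:

    M(x) \;=\; \sum_{p \,:\, U(p) \text{ starts with } x} 2^{-\ell(p)},

where the sum runs over minimal such programs p of length \ell(p); sequence prediction then uses the induced conditional M(x_{t+1} \mid x_{1:t}) = M(x_{1:t} x_{t+1}) / M(x_{1:t}).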