This paper shows that causal model discovery is not an NP-hard problem, in the sense that for sparse graphs bounded by node degree k the sound and complete causal model can be obtained in worst case order N^(2(k+2)) independence tests, even when latent variables and selection bias may be present. We present a modification of the well-known FCI algorithm that…
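The complexity bound above comes from capping the size of conditioning sets by the degree bound k. A minimal, hypothetical sketch of that idea (a PC-style skeleton search with Fisher-z partial-correlation tests on Gaussian data, not the authors' modified FCI algorithm) looks like this:

```python
# Hypothetical sketch: a PC-style skeleton search whose conditioning sets are
# capped at size k, illustrating why a node-degree bound keeps the number of
# independence tests polynomial. This is NOT the paper's modified FCI.
import numpy as np
from itertools import combinations

def fisher_z_independent(data, i, j, cond):
    """Test X_i independent of X_j given X_cond via partial correlation,
    assuming (roughly) Gaussian data."""
    idx = [i, j] + list(cond)
    sub = np.corrcoef(data[:, idx], rowvar=False)
    prec = np.linalg.pinv(sub)                      # precision matrix
    r = -prec[0, 1] / np.sqrt(prec[0, 0] * prec[1, 1])
    r = np.clip(r, -0.999999, 0.999999)
    n = data.shape[0]
    z = 0.5 * np.log((1 + r) / (1 - r)) * np.sqrt(n - len(cond) - 3)
    # 1.96 is the two-sided 5% critical value of the standard normal
    return abs(z) < 1.96

def skeleton(data, k):
    """Remove edge i-j as soon as some conditioning set of size <= k renders
    the pair independent; the tests per edge are polynomial in k."""
    n_vars = data.shape[1]
    adj = {frozenset((i, j)) for i, j in combinations(range(n_vars), 2)}
    for size in range(k + 1):
        for i, j in combinations(range(n_vars), 2):
            if frozenset((i, j)) not in adj:
                continue
            others = [v for v in range(n_vars) if v not in (i, j)]
            for cond in combinations(others, size):
                if fisher_z_independent(data, i, j, cond):
                    adj.discard(frozenset((i, j)))
                    break
    return adj
```

On data sampled from a chain X0 → X1 → X2, the spurious X0–X2 edge is dropped once X1 enters the conditioning set, while the two true edges survive.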

We target the problem of accuracy and robustness in causal inference from finite data sets. Our idea is to combine the inherent robustness of Bayesian approaches to causal structure discovery, such as GES, with the theoretical strength and clarity of constraint-based methods such as IC and PC/FCI. We obtain probability estimates on the input statements in a…

We present a novel approach to constraint-based causal discovery that takes the form of straightforward logical inference, applied to a list of simple, logical statements about causal relations that are derived directly from observed (in)dependencies. It is both sound and complete, in the sense that all invariant features of the corresponding partial…
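The "logical inference over causal statements" mechanism can be illustrated with a toy forward-chaining loop. The sketch below applies only one well-known rule, transitivity of causation ((X ⇒ Y) and (Y ⇒ Z) entail (X ⇒ Z)); the paper's actual rule set is richer, so treat this purely as an illustration of the fixpoint mechanism:

```python
# Hypothetical sketch of logical inference over causal statements:
# forward-chain the transitivity rule over 'x causes y' pairs until no new
# statement can be derived (a fixpoint). Not the paper's full rule set.

def close_under_transitivity(causes):
    """causes: set of (x, y) pairs meaning 'x is a cause of y'.
    Returns the closure under transitivity via a fixpoint loop."""
    derived = set(causes)
    changed = True
    while changed:
        changed = False
        for (a, b) in list(derived):
            for (c, d) in list(derived):
                if b == c and (a, d) not in derived:
                    derived.add((a, d))   # (a => b) and (b => d) give a => d
                    changed = True
    return derived
```

For example, from the statements {X ⇒ Y, Y ⇒ Z} the loop derives X ⇒ Z and then terminates, since no rule application produces anything new.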

A long-standing open research problem is how to use information from different experiments, including background knowledge, to infer causal relations. Recent developments have shown ways to use multiple data sets, provided they originate from identical experiments. We present the MCI-algorithm as the first method that can infer provably valid causal…

We target the problem of accuracy and robustness in causal inference from finite data sets. Our aim is to combine the inherent robustness of the Bayesian approach with the theoretical strength and clarity of constraint-based methods. We use a Bayesian score to obtain probability estimates on the input statements used in a constraint-based procedure. These…
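At a very high level, attaching probability estimates to input statements lets a constraint-based procedure consume the most reliable statements first, so a single borderline test result cannot derail the output. A minimal, hypothetical sketch of that ordering step (the probabilities here are made-up inputs, not the paper's Bayesian scores):

```python
# Hypothetical sketch: filter (in)dependence statements by an attached
# probability estimate and hand them to a constraint-based procedure in
# order of reliability. The scores are illustrative placeholders.

def rank_statements(scored_statements, threshold=0.5):
    """scored_statements: list of (statement, probability) pairs.
    Keep those above threshold, most reliable first."""
    kept = [(s, p) for s, p in scored_statements if p > threshold]
    return [s for s, p in sorted(kept, key=lambda sp: sp[1], reverse=True)]
```

Processing statements in this order means a low-confidence statement can only ever be overridden by higher-confidence ones, never the reverse.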

Bayesian Constraint-based Causal Discovery (BCCD) is a state-of-the-art method for robust causal discovery in the presence of latent variables. It combines probabilistic estimation of Bayesian networks over subsets of variables with a causal logic to infer causal statements. Currently BCCD is limited to discrete or Gaussian variables. Most of the real-world…

- Tom Claassen, Tom Heskes
- 2010

We tackle the problem of how to use information from multiple (in)dependence models, representing results from different experiments, including background knowledge, in causal discovery. We introduce the framework of a causal system in an external context to derive a connection between strict conditional independencies and causal relations between…

We present two inference rules, based on so-called minimal conditional independencies, that are sufficient to find all invariant arrowheads in a single causal DAG, even when selection bias may be present. It turns out that the set of seven graphical orientation rules that are usually employed to identify these arrowheads are, in fact, just different…
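For concreteness, one of the standard graphical orientation rules the abstract refers to is the collider (v-structure) rule: if X and Y are both adjacent to Z but not to each other, and Z does not occur in the set that rendered X and Y independent, orient X → Z ← Y. A hypothetical sketch of just that rule, assuming a skeleton and separating sets are already available:

```python
# Hypothetical sketch of the classic collider orientation rule, one of the
# standard rules the abstract mentions. Assumes the skeleton and separating
# sets were found earlier (e.g. by a PC-style search).
from itertools import combinations

def orient_colliders(edges, sepsets):
    """edges: set of frozenset pairs (undirected skeleton).
    sepsets: dict mapping frozenset((x, y)) of non-adjacent pairs to the
    conditioning set that separated them.
    Returns directed (tail, head) arrowheads for every v-structure found."""
    nodes = {v for e in edges for v in e}
    arrows = set()
    for z in nodes:
        neighbours = [v for v in nodes if frozenset((v, z)) in edges]
        for x, y in combinations(neighbours, 2):
            pair = frozenset((x, y))
            if pair not in edges and z not in sepsets.get(pair, set()):
                arrows.add((x, z))   # x *-> z
                arrows.add((y, z))   # y *-> z
    return arrows
```

On the skeleton X–Z–Y with X and Y separated by the empty set, both arrowheads point into Z; if Z itself were in the separating set, no arrowhead would be placed.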

- Tom Claassen, Tom Heskes
- 2011

… are disjoint (sets of) observed nodes in a causal DAG G_C, and S represents the (possibly empty) set of selection nodes. Background: the definition of a causal relation in a causal DAG, rewritten in terms of standard logical properties. Proposition 1: causal relations in a DAG G_C are … Proof: as the edges in G_C represent causal relations, a path of…

Causal discovery provides an opportunity to infer causal relationships from purely observational data and to predict the effect of interventions. Constraint-based methods for causal discovery exploit conditional (in)dependencies to infer the direction of causal relationships. They typically work through forward chaining: given some causal statements, others…