We show that the linear–non-Gaussian causal discovery framework can be generalized to admit nonlinear functional dependencies as long as the noise on the variables remains additive.

We propose a new framework for causal inference that collects all models whose predictive accuracy remains invariant across settings and interventions.
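The invariance idea can be illustrated with a small two-environment toy example (the variable names, numbers, and SEM below are ours, not the paper's): regressing Y on its true cause gives the same coefficient in every environment, while regressing Y on a non-cause gives coefficients that shift with the environment.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_env(n, x1_scale, x2_noise):
    # toy SEM: X1 -> Y -> X2; environments shift X1's scale and the
    # noise on X2, but never the mechanism Y = 2*X1 + N
    x1 = x1_scale * rng.normal(size=n)
    y = 2.0 * x1 + rng.normal(size=n)
    x2 = y + x2_noise * rng.normal(size=n)
    return x1, x2, y

envs = [make_env(5000, 1.0, 2.0), make_env(5000, 3.0, 0.5)]

def slope(x, y):
    # least-squares slope of y on x (no intercept; data are centered)
    return float(x @ y / (x @ x))

c_cause = [slope(x1, y) for x1, x2, y in envs]   # regress Y on its cause X1
c_effect = [slope(x2, y) for x1, x2, y in envs]  # regress Y on the effect X2
```

Here `c_cause` stays near 2.0 in both environments, whereas `c_effect` differs noticeably between them, so only the causal predictor set passes the invariance check.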

We propose a Kernel-based Conditional Independence test (KCI-test), by constructing an appropriate test statistic and deriving its asymptotic distribution under the null hypothesis of conditional independence.
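The KCI statistic itself conditions on a third variable and is more involved; a minimal sketch of the underlying kernel-based testing idea is the (unconditional) HSIC-style statistic below, with a permutation null in place of the asymptotic distribution. All names and data here are our own toy choices.

```python
import numpy as np

def rbf_gram(v, gamma=1.0):
    # RBF kernel Gram matrix for a 1-D sample
    d2 = (v[:, None] - v[None, :]) ** 2
    return np.exp(-gamma * d2)

def hsic(x, y, gamma=1.0):
    # biased HSIC estimate: trace(HKH HLH) / n^2, H a centering matrix
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n
    K = H @ rbf_gram(x, gamma) @ H
    L = H @ rbf_gram(y, gamma) @ H
    return float(np.trace(K @ L)) / n**2

rng = np.random.default_rng(2)
n = 200
x = rng.normal(size=n)
y_dep = x**2 + 0.1 * rng.normal(size=n)  # dependent on x, yet uncorrelated
y_ind = rng.normal(size=n)               # independent of x

stat = hsic(x, y_dep)
null = [hsic(x, rng.permutation(y_dep)) for _ in range(100)]
p_value = np.mean([s >= stat for s in null])  # permutation p-value
```

Because y_dep is a quadratic function of x, its linear correlation with x is near zero, yet the kernel statistic detects the dependence and the permutation p-value is small.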

We study how to distinguish X causing Y from Y causing X using only observational data, i.e., a finite i.i.d. sample drawn from the joint distribution P_{X,Y}.
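One standard way to attack this bivariate problem is the additive-noise heuristic: regress each variable on the other and check in which direction the residuals look independent of the input. The sketch below uses our own toy data and a crude binned-variance check as a stand-in for a proper independence test.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
x = rng.uniform(-1, 1, n)
y = x**3 + 0.1 * rng.uniform(-1, 1, n)  # additive-noise model X -> Y

def residuals(a, b, deg=3):
    # nonparametric regression stand-in: polynomial fit of b on a
    return b - np.polyval(np.polyfit(a, b, deg), a)

def spread_ratio(resid, v, bins=5):
    # crude independence check: residual variance across quantile bins
    # of v should be roughly flat when resid is independent of v
    edges = np.quantile(v, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.searchsorted(edges, v, side="right") - 1, 0, bins - 1)
    var = np.array([resid[idx == b].var() for b in range(bins)])
    return float(var.max() / var.min())

forward = spread_ratio(residuals(x, y), x)   # fit Y from X
backward = spread_ratio(residuals(y, x), y)  # fit X from Y
```

In the causal direction the residuals are just the additive noise, so their spread is flat across bins of x; in the anticausal direction the residual spread varies strongly with y, and the larger ratio flags that direction as wrong.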

This work shows how to leverage causal inference to understand the behavior of complex learning systems interacting with their environment and predict the consequences of changes to the system.

We show that if the observational distribution follows a structural equation model with an additive noise structure, the directed acyclic graph becomes identifiable from the distribution under mild conditions.

We develop estimation for potentially high-dimensional additive structural equation models that can be efficiently addressed using sparse regression techniques.
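The sparse-regression workhorse in this setting is the lasso. As a self-contained illustration (not the paper's estimator), here is plain coordinate-descent lasso recovering a 2-sparse signal out of 50 candidate predictors; the data and penalty level are our own toy choices.

```python
import numpy as np

def lasso_cd(X, y, lam, n_sweeps=200):
    # coordinate-descent lasso: argmin 0.5*||y - Xb||^2 + lam*||b||_1
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_sweeps):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]   # partial residual
            rho = X[:, j] @ r
            # soft-threshold update for coordinate j
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return beta

rng = np.random.default_rng(3)
n, p = 200, 50
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[0], beta_true[3] = 3.0, -2.0
y = X @ beta_true + 0.1 * rng.normal(size=n)
beta_hat = lasso_cd(X, y, lam=30.0)
```

With the penalty large enough to zero out the noise coordinates, the estimate has exactly the two true nonzeros, each slightly shrunk toward zero by the l1 penalty.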

We consider structural equation models in which variables can be written as a function of their parents and noise terms, which are assumed to be jointly independent. Corresponding to each structural…