Diffusion Models for Causal Discovery via Topological Ordering

Pedro Sanchez, Xiao Liu, Alison Q. O'Neil and Sotirios A. Tsaftaris
Discovering causal relations from observational data becomes possible under additional assumptions, such as constraining the functional relations to be nonlinear with additive noise. In this case, the Hessian of the data log-likelihood can be used to find leaf nodes in a causal graph. Topological ordering approaches for causal discovery exploit this by performing graph discovery in two steps: first sequentially identifying nodes in reverse order of depth (topological ordering…
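The leaf-finding criterion can be illustrated on a toy nonlinear additive-Gaussian SCM whose log-density is known in closed form: a leaf node's diagonal Hessian entry is constant across samples, so its variance over the data is near zero. This is a minimal sketch of the criterion only (the paper instead estimates the Hessian with a diffusion model); all names here are illustrative.

```python
import numpy as np

# Toy SCM with nonlinear additive Gaussian noise: X1 ~ N(0,1),
# X2 = sin(X1) + N(0,1). Up to an additive constant,
# log p(x) = -x1^2/2 - (x2 - sin(x1))^2/2.
def log_p(x):
    x1, x2 = x[..., 0], x[..., 1]
    return -0.5 * x1**2 - 0.5 * (x2 - np.sin(x1))**2

def hessian_diag(f, x, eps=1e-4):
    """Second derivative of f along each coordinate, by central differences."""
    d = x.shape[-1]
    out = np.empty(x.shape)
    for j in range(d):
        e = np.zeros(d)
        e[j] = eps
        out[..., j] = (f(x + e) - 2 * f(x) + f(x - e)) / eps**2
    return out

rng = np.random.default_rng(0)
x1 = rng.normal(size=5000)
x2 = np.sin(x1) + rng.normal(size=5000)
samples = np.stack([x1, x2], axis=-1)

# Leaf criterion: Var over samples of the diagonal Hessian entry is
# (near) zero exactly for leaf nodes.
var_diag = hessian_diag(log_p, samples).var(axis=0)
leaf = int(np.argmin(var_diag))  # expect index 1, i.e. X2
```

Repeatedly identifying a leaf in this way, removing it, and recursing yields the reverse topological ordering described above; the second step then prunes spurious edges consistent with that ordering.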




Ordering-Based Causal Discovery with Reinforcement Learning

This work formulates the ordering search problem as a multi-step Markov decision process, implements the ordering-generating process with an encoder-decoder architecture, and uses RL to optimize the proposed model based on reward mechanisms designed for each ordering.

Masked Gradient-Based Causal Structure Learning

A masked gradient-based structure learning method is proposed, built on the binary adjacency matrix that exists for any structural equation model; it can readily incorporate any differentiable score function and model function for learning causal structures.

Causal discovery with continuous additive noise models

If the observational distribution follows a structural equation model with an additive noise structure, the directed acyclic graph becomes identifiable from the distribution under mild conditions. This constitutes an interesting alternative to traditional methods that assume faithfulness and identify only the Markov equivalence class of the graph, leaving some edges undirected.
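The identifiability asymmetry can be sketched as follows: fit a nonlinear regression in both directions and compare how strongly the residuals depend on the input. This is a hedged illustration only, using a crude dependence proxy (correlation between squared residuals and the regressor) rather than the proper independence tests (e.g. HSIC) used in practice; all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-3, 3, size=2000)
y = x**2 + 0.3 * rng.normal(size=2000)  # ground truth: X -> Y

def residual_dependence(cause, effect, deg=3):
    """Fit a polynomial regression, return a crude residual-dependence proxy."""
    coef = np.polyfit(cause, effect, deg)
    res = effect - np.polyval(coef, cause)
    return abs(np.corrcoef(res**2, cause)[0, 1])

dep_xy = residual_dependence(x, y)  # correct direction: residuals ~ independent
dep_yx = residual_dependence(y, x)  # wrong direction: residuals depend on Y
# Under the additive-noise assumption, dep_xy < dep_yx identifies X -> Y.
```

Only the causal direction admits a model with residuals independent of the input, which is what makes the DAG (not just its Markov equivalence class) identifiable.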

Learning directed acyclic graph models based on sparsest permutations

The sparsest permutation (SP) algorithm is proposed, showing that learning Bayesian networks is possible under strictly weaker assumptions than faithfulness, but this comes at a computational price, thereby indicating a statistical-computational trade-off for causal inference algorithms.

Score matching enables causal discovery of nonlinear additive noise models

This paper demonstrates how to recover causal graphs from the score of the data distribution in nonlinear additive (Gaussian) noise models and proposes a new efficient method for approximating the score's Jacobian, enabling recovery of the causal graph.

Being Bayesian About Network Structure. A Bayesian Approach to Structure Discovery in Bayesian Networks

This paper shows how to efficiently compute a sum over the exponential number of networks that are consistent with a fixed order over network variables, and uses this result as the basis for an algorithm that approximates the Bayesian posterior of a feature.

DAGs with NO TEARS: Continuous Optimization for Structure Learning

This paper formulates structure learning as a purely continuous optimization problem over real matrices, avoiding the combinatorial acyclicity constraint entirely via a novel characterization of acyclicity that is not only smooth but also exact.
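The smooth acyclicity characterization is h(W) = tr(exp(W∘W)) − d, which equals zero iff W is the weighted adjacency matrix of a DAG. A minimal sketch of the constraint function (not the full NOTEARS optimization loop) is below; the matrix exponential is approximated by a truncated power series, which is exact for the h(W) = 0 check since closed walks longer than d contribute nothing for a DAG.

```python
import numpy as np

def acyclicity(W, terms=20):
    """h(W) = tr(exp(W ∘ W)) - d, zero iff W encodes a DAG."""
    d = W.shape[0]
    A = W * W                  # elementwise square: nonnegative edge weights
    term = np.eye(d)
    expm = np.eye(d)
    fact = 1.0
    for k in range(1, terms):  # truncated power series for exp(A)
        term = term @ A
        fact *= k
        expm = expm + term / fact
    return np.trace(expm) - d

W_dag = np.array([[0., 1., 1.],
                  [0., 0., 1.],
                  [0., 0., 0.]])  # upper triangular: a DAG
W_cyc = W_dag.copy()
W_cyc[2, 0] = 1.0                 # adds cycles such as 0 -> 2 -> 0

h_dag = acyclicity(W_dag)  # ~ 0
h_cyc = acyclicity(W_cyc)  # > 0
```

Because h is differentiable in W, it can be used as an equality constraint (or penalty) in gradient-based structure learning, which is the key to replacing discrete DAG search with continuous optimization.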

Diffusion Causal Models for Counterfactual Estimation

The proposed Diff-SCM is a deep structural causal model that builds on recent advances in generative energy-based models; it produces more realistic and minimal counterfactuals than baselines on MNIST data and can also be applied to ImageNet data.

Structural Intervention Distance for Evaluating Causal Graphs

The proposed structural intervention distance (SID) is based on a graphical criterion only and quantifies the closeness between two DAGs in terms of their corresponding causal inference statements, making it well suited for evaluating graphs that are used for computing interventions.

Ordering-Based Search: A Simple and Effective Algorithm for Learning Bayesian Networks

It is shown that ordering-based search outperforms the standard baseline, and is competitive with recent algorithms that are much harder to implement.