Corpus ID: 219559309

Structure Learning for Cyclic Linear Causal Models

@inproceedings{Amendola2020StructureLF,
  title={Structure Learning for Cyclic Linear Causal Models},
  author={Carlos Am{\'e}ndola and Philipp Dettling and Mathias Drton and Federica Onori and Jun Wu},
  booktitle={UAI},
  year={2020}
}
We consider the problem of structure learning for linear causal models based on observational data. We treat models given by possibly cyclic mixed graphs, which allow for feedback loops and effects of latent confounders. Generalizing related work on bow-free acyclic graphs, we assume that the underlying graph is simple. This entails that any two observed variables can be related through at most one direct causal effect and that (confounding-induced) correlation between error terms in structural… 
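The model class described above can be made concrete with a small sketch. The following is a hypothetical illustration (not code from the paper): a linear structural equation model X = BX + ε over a simple mixed graph, where B holds the direct effects (a directed 3-cycle here, so feedback is allowed) and Ω is the error covariance, whose off-diagonal entries would encode confounding-induced correlation. Solving for X gives the implied covariance Σ = (I − B)⁻¹ Ω (I − B)⁻ᵀ. The matrices B and Ω are invented example values.

```python
import numpy as np

# Hypothetical example of a cyclic linear structural equation model:
#   X = B X + eps,  eps ~ N(0, Omega)
# B encodes direct causal effects; the cycle X1 -> X2 -> X3 -> X1 is a
# feedback loop. The "simple graph" assumption means any pair of
# variables is linked by at most one edge, so we never set both
# B[i, j] and B[j, i] (and Omega[i, j]) nonzero for the same pair.
B = np.array([
    [0.0, 0.0, 0.4],   # X3 -> X1
    [0.7, 0.0, 0.0],   # X1 -> X2
    [0.0, 0.5, 0.0],   # X2 -> X3
])
# Diagonal Omega: independent error terms (no latent confounding here).
Omega = np.diag([1.0, 1.0, 1.0])

I = np.eye(3)
# Solving X = B X + eps gives X = (I - B)^{-1} eps, so the implied
# covariance is Sigma = (I - B)^{-1} Omega (I - B)^{-T}.
inv = np.linalg.inv(I - B)
Sigma = inv @ Omega @ inv.T
print(Sigma)
```

Structure learning in this setting amounts to searching over mixed graphs (choices of which entries of B and Ω may be nonzero) for the sparsity pattern whose implied Σ best fits the sample covariance.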


A powerful test for differentially expressed gene pathways via graph-informed structural equation modeling
TLDR: Proposes T2-DAG, a Hotelling's T²-type test for detecting differentially expressed gene pathways that efficiently leverages auxiliary pathway information on gene interactions through a linear structural equation model.
Learning Bayesian Networks in the Presence of Structural Side Information
TLDR: Shows that bounded-treewidth BNs can be learned with polynomial complexity; evaluations of performance and scalability show the proposed algorithms outperform state-of-the-art structure learning algorithms.
Maximal ancestral graph structure learning via exact search
TLDR: Develops a methodology for score-based structure learning of directed maximal ancestral graphs employing a linear Gaussian BIC score, together with score pruning techniques that are essential for exact structure learning approaches.
Computing Maximum Likelihood Estimates for Gaussian Graphical Models with Macaulay2
TLDR: Presents a package for computing MLEs for the class of loopless mixed graphs in the computer algebra system Macaulay2; additional functionality lets the user explore the underlying algebraic structure of the model, such as its maximum likelihood degree and the ideal of score equations.
The spectrum of covariance matrices of randomly connected recurrent neuronal networks
TLDR: Using simple connectivity models, provides theoretical predictions for the covariance spectrum, a fundamental property of recurrent neuronal dynamics, that can be compared with experimental data.

References

Showing 1–10 of 39 references
Distributional Equivalence and Structure Learning for Bow-free Acyclic Path Diagrams
TLDR: Proves some necessary and some sufficient conditions for distributional equivalence of BAPs, which are used in an algorithmic approach to compute (nearly) equivalent model structures and allow inference of lower bounds on causal effects.
Causal Protein-Signaling Networks Derived from Multiparameter Single-Cell Data
TLDR: Reconstruction of network models from physiologically relevant primary single cells might be applied to understanding native-state tissue signaling biology, complex drug actions, and dysfunctional signaling in diseased cells.
Consistency Guarantees for Greedy Permutation-Based Causal Inference Algorithms
TLDR: Provides the first consistency guarantees, both uniform and high-dimensional, for a greedy permutation-based search over the edge-graph of a sub-polytope of the permutohedron, called the DAG associahedron.
Model selection and local geometry
R. Evans. The Annals of Statistics, 2020.
TLDR: Gives a generic algorithm for learning Bayesian network models and shows that Bayesian network models, amongst others, cannot be learned directly with a convex method similar to the graphical lasso.
Characterizing Distribution Equivalence for Cyclic and Acyclic Directed Graphs
TLDR: Presents a general, unified notion of equivalence for linear Gaussian directed graphs, based on the set of distributions the structure can generate, and proposes a score-based method for learning the structure from observational data.
Computation of maximum likelihood estimates in cyclic structural equation models
TLDR: Proposes a block-coordinate descent method that cycles through the considered variables, updating only the parameters related to a given variable in each step, and shows that the resulting block update problems can be solved in closed form even when the structural equation model contains feedback cycles.
Efficient Identification in Linear Structural Causal Models with Instrumental Cutsets
TLDR: Develops a method to efficiently find unconditioned instrumental subsets, generalizations of IVs that can be used to tame the complexity of many canonical algorithms in the literature.
Handbook of Graphical Models. Chapman & Hall/CRC Handbooks of Modern Statistical Methods, 2019.
The maximum likelihood threshold of a path diagram
TLDR
This work clarifies, in particular, that standard likelihood inference is applicable to sparse high-dimensional models even if they feature feedback loops, and proves that if the sample size is below the threshold, then the likelihood function is almost surely unbounded.