# Discovery of Causal Models that Contain Latent Variables Through Bayesian Scoring of Independence Constraints

@article{Jabbari2017DiscoveryOC, title={Discovery of Causal Models that Contain Latent Variables Through Bayesian Scoring of Independence Constraints}, author={Fattaneh Jabbari and Joseph Ramsey and Peter L. Spirtes and Gregory F. Cooper}, journal={Machine learning and knowledge discovery in databases : European Conference, ECML PKDD ... : proceedings. ECML PKDD}, year={2017}, volume={2017}, pages={ 142-157 } }

Discovering causal structure from observational data in the presence of latent variables remains an active research area. Constraint-based causal discovery algorithms are relatively efficient at discovering such causal models from data using independence tests. Typically, however, they derive and output only one such model. In contrast, Bayesian methods can generate and probabilistically score multiple models, outputting the most probable one; however, they are often computationally infeasible…

## 22 Citations

Hybrid Bayesian network discovery with latent variables by scoring multiple interventions

- Computer Science, ArXiv
- 2021

This work presents the hybrid mFGS-BS (majority rule and Fast Greedy equivalence Search with Bayesian Scoring) algorithm for structure learning from discrete data that involves an observational data set and one or more interventional data sets and produces a Partial Ancestral Graph (PAG).

Identification of Latent Variables From Graphical Model Residuals

- Computer Science, ArXiv
- 2021

A novel method is presented that aims to control for the latent space when estimating a DAG by iteratively deriving proxies for the latent space from the residuals of the inferred model, and enhances identifiability of the causal effect.

Causal learning with sufficient statistics: an information bottleneck approach

- Computer Science, ArXiv
- 2020

Using the Information Bottleneck method, a technique commonly applied for dimensionality reduction, to find underlying sufficient sets of statistics, new additional rules of causal orientation are formulated that provide causal information not obtainable from standard structure learning algorithms, which exploit only conditional independencies between observable variables.

An Instance-Specific Algorithm for Learning the Structure of Causal Bayesian Networks Containing Latent Variables

- Computer Science, SDM
- 2020

Simulations support that the proposed instance-specific method improves structure-discovery performance compared to an existing PAG-learning method called GFCI, and results provide support for instance-specific causal relationships existing in real data.

Learning Bayesian Networks That Enable Full Propagation of Evidence

- Computer Science, IEEE Access
- 2020

The results suggest that the proposed algorithm discovers satisfactorily accurate connected DAGs in cases where other algorithms produce multiple disjoint subgraphs that often underfit the true graph.

Learning Optimal Cyclic Causal Graphs from Interventional Data

- Computer Science, PGM
- 2020

This work generalizes bcause, a recent exact branch-and-bound causal discovery approach, to this setting, integrating support for the sigma-separation criterion and several interventional datasets, and empirically evaluates bcause and the refined ASP-approach.

High-recall causal discovery for autocorrelated time series with latent confounders

- Computer Science, NeurIPS
- 2020

This work presents a new method for linear and nonlinear, lagged and contemporaneous constraint-based causal discovery from observational time series in the presence of latent confounders and proves that the method is order-independent, and sound and complete in the oracle case.

Discovering causal graphs with cycles and latent confounders: An exact branch-and-bound approach

- Computer Science, Int. J. Approx. Reason.
- 2020

Instance-Specific Bayesian Network Structure Learning

- Computer Science, PGM
- 2018

This work introduces an instance-specific BN structure learning method that searches the space of Bayesian networks to build a model that is specific to an instance by guiding the search based on attributes of the given instance (e.g., patient symptoms, signs, lab results, and genotype).

A survey of Bayesian Network structure learning

- Computer Science, ArXiv
- 2021

This paper provides a comprehensive review of combinatoric algorithms proposed for learning BN structure from data, describing 61 algorithms including prototypical, well-established, and state-of-the-art approaches.

## References

Showing 1-10 of 37 references

A Bayesian Approach to Causal Discovery

- Computer Science
- 2006

The general Bayesian approach to causal discovery is described and approximation methods for missing data and hidden variables are reviewed, and differences between the Bayesian and constraint-based methods are illustrated using artificial and real examples.

Ancestor Relations in the Presence of Unobserved Variables

- Computer Science, ECML/PKDD
- 2011

The experimental results show that ancestor relations between observed variables, arcs in particular, can be learned with good power even when a majority of the involved variables are unobserved.

Learning Neighborhoods of High Confidence in Constraint-Based Causal Discovery

- Computer Science, Probabilistic Graphical Models
- 2014

PROPeR, a method for estimating posterior probabilities of pairwise relations of a network skeleton as a function of the corresponding p-values, is presented and it is demonstrated that the posterior probability estimates are reasonable and comparable with estimates obtained using more expensive Bayesian methods.

The variational Bayesian EM algorithm for incomplete data: with application to scoring graphical model structures

- Computer Science
- 2003

This method constructs and optimises a lower bound on the marginal likelihood using variational calculus, resulting in an iterative algorithm which generalises the EM algorithm by maintaining posterior distributions over both latent variables and parameters.

A Bayesian Approach to Constraint Based Causal Inference

- Computer Science, UAI
- 2012

Tests show that a basic implementation of the resulting Bayesian Constraint-based Causal Discovery (BCCD) algorithm already outperforms established procedures such as FCI and Conservative PC.

On the completeness of orientation rules for causal discovery in the presence of latent confounders and selection bias

- Computer Science, Artif. Intell.
- 2008

Learning Latent Tree Graphical Models

- Computer Science, J. Mach. Learn. Res.
- 2011

This work proposes two consistent and computationally efficient algorithms for learning minimal latent trees, that is, trees without any redundant hidden nodes, and applies these algorithms to both discrete and Gaussian random variables.

Learning high-dimensional directed acyclic graphs with latent and selection variables

- Computer Science
- 2012

This work proposes the new RFCI algorithm, which is much faster than FCI, proves consistency of FCI and RFCI in sparse high-dimensional settings, and demonstrates in simulations that the estimation performances of the two algorithms are very similar.

Identifiability of Causal Graphs using Functional Models

- Computer Science, Mathematics, UAI
- 2011

A main theorem is proved that if the data generating process belongs to an IFMOC, one can identify the complete causal graph; this is the first identifiability result of this kind that is not limited to linear functional relationships.

Learning Bayesian Networks: The Combination of Knowledge and Statistical Data

- Computer Science, Machine Learning
- 2004

A methodology is developed for assessing the informative priors needed to learn Bayesian networks from a combination of prior knowledge and statistical data, and it is shown how to compute the relative posterior probabilities of network structures given data.