Causal Graphical Models with Latent Variables: Learning and Inference

@inproceedings{Meganck2007CausalGM,
  title={Causal Graphical Models with Latent Variables: Learning and Inference},
  author={Stijn Meganck and Philippe Leray and Bernard Manderick},
  booktitle={ECSQARU},
  year={2007}
}
Several paradigms exist for modeling causal graphical models for discrete variables that can handle latent variables without explicitly modeling them quantitatively. Applying them to a problem domain consists of different steps: structure learning, parameter learning and using them for probabilistic or causal inference. We discuss two well-known formalisms, namely semi-Markovian causal models and maximal ancestral graphs and indicate their strengths and limitations. Previously an algorithm has… 
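The structure-learning step mentioned in the abstract rests on conditional independence tests between discrete variables. As a minimal illustrative sketch (not the algorithm of any paper listed here), a Pearson chi-square test of X ⟂ Y | Z can be computed directly from count tables; the function name and data layout are assumptions for this example:

```python
import itertools
from collections import Counter

def chi_square_ci_test(data, x, y, z=()):
    """Pearson chi-square test of X independent of Y given Z on discrete samples.

    data: list of dicts mapping variable name -> value (hypothetical layout).
    Returns the chi-square statistic and degrees of freedom; compare the
    statistic against a chi-square critical value to decide independence.
    """
    # Partition the samples by the assignment to the conditioning set Z.
    strata = {}
    for row in data:
        key = tuple(row[v] for v in z)
        strata.setdefault(key, []).append(row)

    stat, dof = 0.0, 0
    for rows in strata.values():
        xs = sorted({r[x] for r in rows})
        ys = sorted({r[y] for r in rows})
        n = len(rows)
        joint = Counter((r[x], r[y]) for r in rows)
        mx = Counter(r[x] for r in rows)
        my = Counter(r[y] for r in rows)
        # Sum (observed - expected)^2 / expected over the contingency table.
        for a, b in itertools.product(xs, ys):
            expected = mx[a] * my[b] / n
            if expected > 0:
                stat += (joint[(a, b)] - expected) ** 2 / expected
        dof += max(len(xs) - 1, 0) * max(len(ys) - 1, 0)
    return stat, dof
```

With a significance threshold (e.g. 3.84 for one degree of freedom at the 5% level), a large statistic rejects independence; structure-learning algorithms such as those surveyed below repeat tests of this kind to build a skeleton.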

A new approach to learn the projection of latent Causal Bayesian Networks

  • Xia Liu, Youlong Yang
  • Computer Science
    2012 International Conference on Systems and Informatics (ICSAI2012)
  • 2012
TLDR
A new two-stage approach that learns the skeleton of the projection of Causal Bayesian Networks with unobserved variables using a new algorithm, LSofP, which reduces the computational cost and improves the reliability of conditional independence tests compared with other existing algorithms.

Structure-Learning of Causal Bayesian Networks Based on Adjacent Nodes

TLDR
An algorithm called BSPC, based on adjacent nodes, is presented for learning the structure of Causal Bayesian Networks with unobserved variables from observational data; it reduces computational complexity and improves the reliability of conditional independence tests.

Causal discovery using clusters from observational data

TLDR
A new method is proposed, and it is shown, using both artificial and real data, that accounting for clusters in the data leads to more accurate learning of causal structures.

Integrating Ontological Knowledge for Iterative Causal Discovery and Visualization

TLDR
This article proposes a new method for learning CBNs from observational data and interventions by adding a new step based on the integration of ontological knowledge, which allows the interventions to perform to be chosen efficiently in order to obtain the complete CBN.

Probabilistic matching: Causal inference under measurement errors

TLDR
This study proposes a novel approach for causal inference when one or more key variables are noisy and utilizes the knowledge about the uncertainty of the real values of key variables in order to reduce the bias induced by noisy measurements.

Time Series Deconfounder: Estimating Treatment Effects over Time in the Presence of Hidden Confounders

TLDR
A method that leverages the assignment of multiple treatments over time to enable the estimation of treatment effects in the presence of multi-cause hidden confounders and a theoretical analysis for obtaining unbiased causal effects of time-varying exposures using the Time Series Deconfounder.

Machine learning approaches to statistical dependences and causality Dagstuhl Seminar

TLDR
The seminar "Machine learning approaches to statistical dependences and causality" was held at the Schloss Dagstuhl Leibniz Center for Informatics; participants presented their current research, and ongoing work and open problems were discussed.

A factorization criterion for acyclic directed mixed graphs

TLDR
This paper presents a factorization criterion for acyclic directed mixed graphs that is equivalent to the global Markov property given by (the natural extension of) d-separation.

Towards an Integral Approach for Modeling Causality


References

Showing 1-10 of 35 references

Learning Semi-Markovian Causal Models using Experiments

TLDR
This paper provides a set of rules indicating which experiments are needed to transform a CPAG into a completely oriented SMCM and how the results of these experiments should be processed, and shows how the resulting parametrisation can be used to develop methods that perform both probabilistic and causal inference efficiently.

A Tutorial on Learning with Bayesian Networks

  • D. Heckerman
  • Computer Science
    Innovations in Bayesian Networks
  • 2008
TLDR
Methods for constructing Bayesian networks from prior knowledge are discussed and methods for using data to improve these models are summarized, including techniques for learning with incomplete data.

Active Learning for Structure in Bayesian Networks

TLDR
Experimental results show that active learning can substantially reduce the number of observations required to determine the structure of a domain.

An Introduction to Variational Methods for Graphical Models

TLDR
This paper presents a tutorial introduction to the use of variational methods for inference and learning in graphical models (Bayesian networks and Markov random fields), and describes a general framework for generating variational transformations based on convex duality.

Learning Causal Bayesian Networks from Observations and Experiments: A Decision Theoretic Approach

TLDR
An algorithm is introduced that actively incorporates the results of experiments so that arcs can be directed during learning, and it is shown that this approach allows a causal Bayesian network to be learned optimally with respect to a number of decision criteria.

Causal Inference and Reasoning in Causally Insufficient Systems

TLDR
This dissertation shows that the FCI algorithm, a sound inference procedure in the literature for inferring features of the unknown causal structure from facts of probabilistic independence and dependence, is complete in the sense that any feature of the causal structure left undecided by the inference procedure is indeed underdetermined by facts of probabilistic independence and dependence.

Causal Discovery from a Mixture of Experimental and Observational Data

TLDR
The learning method was applied to predict the causal structure and to estimate the causal parameters that exist among randomly selected pairs of nodes in ALARM that are not confounded.

On the Testable Implications of Causal Models with Hidden Variables

TLDR
This paper offers a systematic way of identifying functional constraints and facilitates the task of testing causal models as well as inferring such models from data.

A Transformational Characterization of Markov Equivalence for Directed Acyclic Graphs with Latent Variables

TLDR
This paper establishes a transformational characterization of Markov equivalence for directed MAGs, which is expected to have uses similar to those the corresponding characterization has for DAGs.

Markov Equivalence Classes for Maximal Ancestral Graphs

TLDR
A join operation is defined on ancestral graphs which will associate a unique graph with a Markov equivalence class, thereby facilitating model search and providing a proof of the pairwise Markov property for joined ancestral graphs.