Corpus ID: 235669916

Learning latent causal graphs via mixture oracles

Bohdan Kivva, Goutham Rajendran, Pradeep Ravikumar, Bryon Aragam
We study the problem of reconstructing a causal graphical model from data in the presence of latent variables. The main problem of interest is recovering the causal structure over the latent variables while allowing for general, potentially nonlinear dependence between the variables. In many practical problems, the dependence between raw observations (e.g. pixels in an image) is much less relevant than the dependence between certain high-level, latent features (e.g. concepts or objects), and…


Causal discovery with continuous additive noise models
If the observational distribution follows a structural equation model with an additive noise structure, the directed acyclic graph becomes identifiable from the distribution under mild conditions. This constitutes an interesting alternative to traditional methods that assume faithfulness and identify only the Markov equivalence class of the graph, leaving some edges undirected.
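As a toy illustration of the additive-noise principle described in this snippet (not the paper's own algorithm), the following numpy-only sketch fits polynomial regressions in both candidate directions and compares how strongly each residual depends on its input, using a biased HSIC estimator as the dependence measure. All function names, the polynomial degree, and the simulated data are illustrative assumptions.

```python
import numpy as np

def rbf_gram(x, sigma):
    # Gram matrix of an RBF kernel over a 1-D sample.
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2 * sigma ** 2))

def median_bandwidth(x):
    # Median heuristic for the kernel bandwidth.
    d = np.abs(x[:, None] - x[None, :])
    m = np.median(d[d > 0])
    return m if m > 0 else 1.0

def hsic(x, y):
    # Biased HSIC estimator: a standard kernel measure of dependence.
    n = len(x)
    K = rbf_gram(x, median_bandwidth(x))
    L = rbf_gram(y, median_bandwidth(y))
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def anm_direction(x, y, degree=5):
    # Regress each way; under the true additive-noise direction the
    # residual should be (nearly) independent of the regressor.
    res_fwd = y - np.polyval(np.polyfit(x, y, degree), x)
    res_bwd = x - np.polyval(np.polyfit(y, x, degree), y)
    return 'x->y' if hsic(x, res_fwd) < hsic(y, res_bwd) else 'y->x'

# Simulated data from a nonlinear additive-noise model x -> y.
rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, 300)
y = x + 2.0 * np.sin(2 * x) + 0.3 * rng.normal(size=300)
print(anm_direction(x, y))
```

The asymmetry exploited here is exactly the one the snippet describes: the backward regression cannot absorb the nonlinearity into an additive noise term, so its residuals remain dependent on the input.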
Robust causal structure learning with some hidden variables
We introduce a new method to estimate the Markov equivalence class of a directed acyclic graph (DAG) in the presence of hidden variables, in settings where the underlying DAG among the observed…
Learning Loopy Graphical Models with Latent Variables: Efficient Methods and Guarantees
Necessary conditions for structural consistency under any algorithm are derived, and the proposed method nearly matches the lower bound on sample requirements.
Latent variable graphical model selection via convex optimization
The modeling framework can be viewed as a combination of dimensionality reduction and graphical modeling (to capture remaining statistical structure not attributable to the latent variables), and it consistently estimates both the number of hidden components and the conditional graphical model structure among the observed variables.
Learning high-dimensional directed acyclic graphs with latent and selection variables
This work proposes the new RFCI algorithm, which is much faster than FCI, proves consistency of both FCI and RFCI in sparse high-dimensional settings, and demonstrates in simulations that the estimation performance of the two algorithms is very similar.
Smooth, identifiable supermodels of discrete DAG models with latent variables
We provide a parameterization of the discrete nested Markov model, which is a supermodel that approximates DAG models (Bayesian network models) with latent variables. Such models are widely used in…
Learning the Structure of Linear Latent Variable Models
We describe anytime search procedures that (1) find disjoint subsets of recorded variables for which the members of each subset are d-separated by a single common unrecorded cause, if such exists;…
Causal Discovery with General Non-Linear Relationships using Non-Linear ICA
It is shown rigorously that, in the case of bivariate causal discovery, such non-linear ICA can be used to infer the causal direction via a series of independence tests, and an alternative measure of causal direction based on asymptotic approximations to the likelihood ratio is proposed.
Identifiability of parameters in latent structure models with many observed variables
While hidden class models of various types arise in many statistical applications, it is often difficult to establish the identifiability of their parameters. Focusing on models in which there is…
Learning Linear Bayesian Networks with Latent Variables
This work considers the problem of learning linear Bayesian networks when some of the variables are unobserved. Identifiability and efficient recovery from low-order observable moments are…