Structure estimation for discrete graphical models: Generalized covariance matrices and their inverses

@inproceedings{Loh2012StructureEF,
  title={Structure estimation for discrete graphical models: Generalized covariance matrices and their inverses},
  author={Po-Ling Loh and Martin J. Wainwright},
  booktitle={NIPS},
  year={2012}
}
We investigate the relationship between the structure of a discrete graphical model and the support of the inverse of a generalized covariance matrix. We show that for certain graph structures, the support of the inverse covariance matrix of indicator variables on the vertices of a graph reflects the conditional independence structure of the graph. Our work extends results that have previously been established only in the context of multivariate Gaussian graphical models, thereby addressing an… 
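The core phenomenon the paper generalizes can be seen in the Gaussian case, where it is classical: zeros in the inverse covariance (precision) matrix correspond exactly to missing edges, i.e. to conditional independences. The sketch below is an illustrative numerical check for a three-node chain graph, not the paper's discrete indicator-variable construction; the specific matrix values are assumptions chosen for the demo.

```python
import numpy as np

# Precision matrix for the chain graph 0 - 1 - 2:
# the (0, 2) entry is zero because nodes 0 and 2 are not adjacent.
theta = np.array([
    [ 2.0, -0.8,  0.0],   # node 0 connects only to node 1
    [-0.8,  2.0, -0.8],   # node 1 connects to both 0 and 2
    [ 0.0, -0.8,  2.0],   # node 2 connects only to node 1
])

sigma = np.linalg.inv(theta)       # covariance: generally dense
theta_back = np.linalg.inv(sigma)  # inverse covariance: sparse again

print(f"Sigma[0,2] = {sigma[0, 2]:.4f}")       # nonzero: marginal dependence
print(f"Theta[0,2] = {theta_back[0, 2]:.4f}")  # ~0: 0 and 2 independent given 1
```

The covariance itself is dense (nodes 0 and 2 are marginally dependent through node 1), but inverting it recovers the sparsity pattern of the graph; the paper's contribution is characterizing when an analogous statement holds for generalized covariance matrices of discrete models.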


Learning discrete graphical models via generalized inverse covariance matrices
TLDR
The population-level results in Loh and Wainwright 2013 have theoretically rigorous consequences for global graph selection methods and local neighborhood selection methods.
Learning non-Gaussian graphical models via Hessian scores and triangular transport
TLDR
An algorithm for learning the Markov structure of continuous and non-Gaussian distributions is proposed and it is shown that the algorithm recovers the graph structure even with a biased approximation to the density.
A U-statistic Approach to Hypothesis Testing for Structure Discovery in Undirected Graphical Models
TLDR
The proposed test enables one to answer with statistical significance whether an edge is present in the graph, and convergence results are known for a wide range of distributions, enabling the application of the test to large data samples for which computation time becomes a limiting factor.
An additive graphical model for discrete data
TLDR
A nonparametric graphical model for discrete node variables based on additive conditional independence that does not suffer from the restriction of a parametric model such as the Ising model is introduced and the new graphical model reduces to a conditional independence graphical model under certain sparsity conditions.
Bayesian Graphical Models for Multivariate Functional Data
TLDR
This work introduces a notion of conditional independence between random functions, builds a framework for Bayesian inference of undirected, decomposable graphs in the multivariate functional data context, and proposes a hyper-inverse-Wishart-process prior for the covariance kernels of the infinite coefficient sequences of the basis expansion.
High-dimensional learning of linear causal networks via inverse covariance estimation
TLDR
It is shown that when the error variances are known or estimated to close enough precision, the true DAG is the unique minimizer of the score computed using the reweighted squared ℓ2-loss.
Graphical models for extremes
  • Sebastian Engelke, Adrien Hitz
  • Computer Science, Mathematics
    Journal of the Royal Statistical Society: Series B (Statistical Methodology)
  • 2020
TLDR
A general theory of conditional independence for multivariate Pareto distributions is introduced that enables the definition of graphical models and sparsity for extremes and it is shown that, similarly to the Gaussian case, the sparsity pattern of a general extremal graphical model can be read off from suitable inverse covariance matrices.
Beyond normality: Learning sparse probabilistic graphical models in the non-Gaussian setting
TLDR
An algorithm is presented that identifies sparse dependence structure in continuous and non-Gaussian probability distributions from a corresponding set of data; it exploits the connection between the sparsity of the graph and the sparsity of transport maps, which deterministically couple one probability measure to another.
Discriminant analysis for discrete variables derived from a tree-structured graphical model
TLDR
Discriminant analysis based on tree-structured graphical models, a simple nonlinear method that includes only some of the pairwise interactions between variables, is competitive with, and sometimes superior to, methods that assume no interactions, and has the advantage over more complex decomposable models of recovering the graph structure quickly and in exact form.

References

Showing 1-10 of 45 references
High-dimensional graphs and variable selection with the Lasso
TLDR
It is shown that neighborhood selection with the Lasso is a computationally attractive alternative to standard covariance selection for sparse high-dimensional graphs; estimating each node's neighborhood reduces to variable selection for a Gaussian linear model.
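The neighborhood-selection idea summarized above can be sketched in a few lines: regress each variable on all the others with an ℓ1-penalized regression, and take the nonzero coefficients as that node's estimated neighbors. The sketch below uses a minimal hand-rolled coordinate-descent Lasso solver (not the reference's implementation), and the chain graph 0 - 1 - 2, the sample size, and the penalty level are assumptions chosen for the demo.

```python
import numpy as np

def lasso_cd(X, y, alpha, iters=200):
    """Minimize (1/2n)||y - Xw||^2 + alpha*||w||_1 by coordinate descent."""
    n, p = X.shape
    w = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(iters):
        for j in range(p):
            r = y - X @ w + X[:, j] * w[j]        # residual excluding feature j
            rho = X[:, j] @ r / n                 # correlation with residual
            w[j] = np.sign(rho) * max(abs(rho) - alpha, 0.0) / col_sq[j]
    return w

# Simulate from a Gaussian whose precision matrix encodes the chain 0 - 1 - 2.
rng = np.random.default_rng(0)
theta = np.array([[ 2.0, -0.8,  0.0],
                  [-0.8,  2.0, -0.8],
                  [ 0.0, -0.8,  2.0]])
X = rng.multivariate_normal(np.zeros(3), np.linalg.inv(theta), size=2000)

# Neighborhood selection: one Lasso regression per node.
neighbors = {}
for j in range(3):
    others = [k for k in range(3) if k != j]
    w = lasso_cd(X[:, others], X[:, j], alpha=0.05)
    neighbors[j] = [others[i] for i, wi in enumerate(w) if abs(wi) > 1e-8]

print(neighbors)  # node 0's estimated neighborhood should exclude node 2
```

Combining the per-node neighborhoods (e.g. by union or intersection over symmetric pairs) yields the estimated edge set, which is the global graph-selection step the TLDR refers to.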
On Learning Discrete Graphical Models using Group-Sparse Regularization
TLDR
Surprisingly, it is shown that under slightly more stringent conditions, the pairwise procedure still recovers the graph structure when the samples scale as n > K(m−1)^2 d^{3/2} log((m−1)^c (p−1)) for constants K, c.
Graphical Models, Exponential Families, and Variational Inference
TLDR
The variational approach provides a complementary alternative to Markov chain Monte Carlo as a general source of approximation methods for inference in large-scale statistical models.
Regularized rank-based estimation of high-dimensional nonparanormal graphical models
TLDR
It is shown that the nonparanormal graphical model can be efficiently estimated by using a rank-based estimation scheme which does not require estimating these unknown transformation functions.
High-dimensional covariance estimation by minimizing ℓ1-penalized log-determinant divergence
TLDR
The first result establishes consistency of the estimate Θ̂ in the elementwise maximum norm, which allows convergence rates in Frobenius and spectral norms to be derived, and shows good correspondence between the theoretical predictions and behavior in simulations.
Information-Theoretic Limits of Selecting Binary Graphical Models in High Dimensions
TLDR
The information-theoretic limitations of the problem of graph selection for binary Markov random fields under high-dimensional scaling, in which the graph size and the number of edges k, and/or the maximal node degree d, are allowed to increase to infinity as a function of the sample size n, are analyzed.
Model selection and estimation in the Gaussian graphical model
TLDR
The implementation of the penalized likelihood methods for estimating the concentration matrix in the Gaussian graphical model is nontrivial, but it is shown that the computation can be done effectively by taking advantage of the efficient maxdet algorithm developed in convex optimization.
Sparse inverse covariance estimation with the graphical lasso.
TLDR
Using a coordinate descent procedure for the lasso, a simple algorithm is developed that solves a 1000-node problem in at most a minute and is 30-4000 times faster than competing methods.
High-dimensional structure estimation in Ising models: Local separation criterion
TLDR
A novel criterion for tractable graph families, where this method is efficient, based on the presence of sparse local separators between node pairs in the underlying graph, is introduced.
High Dimensional Semiparametric Gaussian Copula Graphical Models
TLDR
It is proved that the nonparanormal skeptic achieves the optimal parametric rates of convergence for both graph recovery and parameter estimation, and this result suggests that the NonParanormal graphical models can be used as a safe replacement of the popular Gaussian graphical models, even when the data are truly Gaussian.