Model selection and estimation in the Gaussian graphical model

@article{Yuan2007ModelSA,
  title={Model selection and estimation in the Gaussian graphical model},
  author={Ming Yuan and Yi Lin},
  journal={Biometrika},
  year={2007},
  volume={94},
  pages={19-35}
}
We propose penalized likelihood methods for estimating the concentration matrix in the Gaussian graphical model. The methods lead to a sparse and shrinkage estimator of the concentration matrix that is positive definite, and thus conduct model selection and estimation simultaneously. The implementation of the methods is nontrivial because of the positive definite constraint on the concentration matrix, but we show that the computation can be done effectively by taking advantage of the efficient… 
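The penalized-likelihood idea is straightforward to try in practice. As a rough sketch (not the authors' own implementation), scikit-learn's GraphicalLasso fits the same ℓ1-penalized Gaussian log-likelihood objective; the data, dimensions, and penalty weight below are placeholders.

```python
# A minimal sketch of l1-penalized estimation of the concentration matrix,
# using scikit-learn's GraphicalLasso as the solver; this is a later algorithm
# for the same penalized-likelihood objective, not the implementation used in
# the paper. Data, dimensions and the penalty weight are placeholders.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))   # n = 200 observations of p = 5 variables

model = GraphicalLasso(alpha=0.1)   # alpha is the l1 penalty weight
model.fit(X)

# Estimated concentration (precision) matrix: positive definite, with zeros
# in the off-diagonal entries that the penalty has selected out of the model.
print(np.round(model.precision_, 3))
```

Larger penalty weights drive more off-diagonal entries of the estimate exactly to zero, which is how the fit performs model selection and estimation at once.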

Tuning parameter selection for penalized likelihood estimation of Gaussian graphical model
In a Gaussian graphical model, the conditional independence between two variables is characterized by the corresponding zero entry in the inverse covariance matrix. Maximum likelihood method using…
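The zero-entry characterization is easy to verify numerically. A minimal sketch, using an arbitrary 3×3 precision matrix chosen for the demo:

```python
# Numerical check of the zero-pattern characterization: X0 and X2 are
# conditionally independent given X1 exactly because Theta[0, 2] = 0.
# The matrix below is an arbitrary positive definite example for the demo.
import numpy as np

theta = np.array([[2.0, 0.5, 0.0],
                  [0.5, 2.0, 0.5],
                  [0.0, 0.5, 2.0]])   # precision (inverse covariance) matrix
sigma = np.linalg.inv(theta)         # implied covariance matrix

# Marginally, X0 and X2 are still correlated...
print(sigma[0, 2])                   # nonzero
# ...but their partial correlation given X1 vanishes with the zero entry:
print(-theta[0, 2] / np.sqrt(theta[0, 0] * theta[2, 2]))   # exactly 0.0
```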
Regularized parameter estimation of high dimensional t distribution
Generalized information criterion for model selection in penalized graphical models
This paper introduces an estimator of the relative directed distance between an estimated model and the true model, based on the Kullback–Leibler divergence and motivated by the generalized information criterion.
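For concreteness, a baseline BIC-type criterion for this setting scores an estimate by its penalized Gaussian log-likelihood; the generalized criterion in the paper above refines the degrees-of-freedom term, so the sketch below is only the standard form, with all names illustrative.

```python
# Sketch of a baseline BIC-type criterion for a penalized Gaussian graphical
# model. theta_hat is an estimated precision matrix, S the sample covariance,
# n the sample size. The degrees of freedom are approximated by the number of
# nonzero entries on and above the diagonal; the generalized criterion in the
# paper above refines this term. Function and variable names are illustrative.
import numpy as np

def gaussian_bic(theta_hat, S, n, tol=1e-8):
    _, logdet = np.linalg.slogdet(theta_hat)
    loglik = 0.5 * n * (logdet - np.trace(S @ theta_hat))
    p = theta_hat.shape[0]
    df = p + np.count_nonzero(np.abs(theta_hat[np.triu_indices(p, k=1)]) > tol)
    return -2.0 * loglik + np.log(n) * df
```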
Bayesian sparse graphical models and their mixtures
A novel type of selection prior is introduced that induces a sparse structure on the precision matrix by making most of the elements exactly zero while ensuring positive definiteness, thus conducting model selection and estimation simultaneously.
High dimensional Sparse Gaussian Graphical Mixture Model
This paper considers the problem of network reconstruction from heterogeneous data using a Gaussian graphical mixture model (GGMM) and proposes a penalized maximum likelihood technique that imposes an ℓ1 penalty on the precision matrix, yielding better identifiability and variable selection.
Bayesian Regularization for Graphical Models With Unequal Shrinkage
A Bayesian framework for estimating a high-dimensional sparse precision matrix, in which adaptive shrinkage and sparsity are induced by a mixture of Laplace priors, is considered, and the MAP (maximum a posteriori) estimator is investigated from a penalized likelihood perspective.
Selection and estimation for mixed graphical models.
The selection of edges between nodes whose conditional distributions take different parametric forms is investigated, and it is shown that efficiency can be gained if edge estimates obtained from the regressions of particular nodes are used to reconstruct the graph.
Bayes Regularized Graphical Model Estimation in High Dimensions
This work proposes a novel approach suitable for high-dimensional settings by decoupling model fitting and covariance selection, and applies the method to high-dimensional gene expression and microRNA datasets in cancer genomics.
Efficient Computation of ℓ1 Regularized Estimates in Gaussian Graphical Models
An efficient algorithm is presented that approximates the entire solution path of the ℓ1-regularized maximum likelihood estimates, which also facilitates the choice of the tuning parameter.
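As an illustration of path-based tuning in general (not the cited paper's path algorithm), one can refit over a grid of penalties and pick a value by cross-validation with scikit-learn; the data and grid below are placeholders.

```python
# Sketch of tracing the solution over a grid of l1 penalties and picking one
# by cross-validation with scikit-learn's GraphicalLassoCV; this illustrates
# path-based tuning in general, not the cited paper's path algorithm.
# Data and the penalty grid are placeholders.
import numpy as np
from sklearn.covariance import GraphicalLasso, GraphicalLassoCV

rng = np.random.default_rng(1)
X = rng.standard_normal((300, 6))
alphas = [0.01, 0.05, 0.1, 0.5]

for alpha in alphas:                # sparsity grows with the penalty
    prec = GraphicalLasso(alpha=alpha).fit(X).precision_
    print(f"alpha={alpha}: {np.count_nonzero(np.abs(prec) > 1e-8)} nonzeros")

cv = GraphicalLassoCV(alphas=alphas).fit(X)
print("alpha chosen by cross-validation:", cv.alpha_)
```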
Bayesian structural learning and estimation in Gaussian graphical models
The mode-oriented stochastic search algorithm for Gaussian graphical models is proposed, and a new Laplace approximation to the normalizing constant of the G-Wishart distribution is developed.
...

References

Regression Shrinkage and Selection via the Lasso
The lasso, a new method for estimation in linear models, is proposed; it minimizes the residual sum of squares subject to the sum of the absolute values of the coefficients being less than a constant.
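A minimal sketch of the lasso in its usual Lagrangian form, which matches the constrained form described above for a suitable penalty weight; the data and penalty value are placeholders.

```python
# Minimal lasso fit in the Lagrangian form
#   min ||y - X b||^2 / (2n) + alpha * ||b||_1,
# equivalent to the constrained form above for a matching constant.
# Data, true coefficients and the penalty value are placeholders.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
X = rng.standard_normal((100, 10))
beta = np.zeros(10)
beta[:3] = [1.5, -2.0, 1.0]               # only three active coefficients
y = X @ beta + 0.5 * rng.standard_normal(100)

fit = Lasso(alpha=0.1).fit(X, y)
print(np.round(fit.coef_, 2))             # most coefficients shrunk to exactly 0
```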
Consistent neighbourhood selection for sparse high-dimensional graphs with the Lasso
It is shown that the proposed neighbourhood selection scheme with the Lasso is consistent for sparse high-dimensional graphs, and that consistency hinges on the choice of the penalty parameter.
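Neighbourhood selection is simple to sketch: lasso-regress each variable on all the others and connect two nodes when a regression assigns them a nonzero coefficient. The helper below is a hypothetical illustration; the penalty value is a placeholder, and the cited paper's point is precisely that consistency depends on how it is chosen.

```python
# Sketch of neighbourhood selection: lasso-regress each variable on all the
# others; an edge i -- j is kept when either regression assigns a nonzero
# coefficient (the "or" rule). The penalty value is a placeholder, and the
# helper name is hypothetical.
import numpy as np
from sklearn.linear_model import Lasso

def neighbourhood_selection(X, alpha=0.1, tol=1e-8):
    n, p = X.shape
    adj = np.zeros((p, p), dtype=bool)
    for j in range(p):
        others = [k for k in range(p) if k != j]
        coef = Lasso(alpha=alpha).fit(X[:, others], X[:, j]).coef_
        adj[j, others] = np.abs(coef) > tol
    return adj | adj.T          # symmetrize with the "or" rule

rng = np.random.default_rng(3)
X = rng.standard_normal((500, 5))        # independent variables: empty graph
print(neighbourhood_selection(X))
```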
Gradient directed regularization for sparse Gaussian concentration graphs, with applications to inference of genetic networks.
A threshold gradient descent (TGD) regularization procedure for estimating the sparse precision matrix in the setting of Gaussian graphical models is introduced and demonstrated to identify biologically meaningful genetic networks from microarray gene expression data.
Sparsistency and Rates of Convergence in Large Covariance Matrix Estimation.
To guarantee sparsistency and the optimal rate of convergence, the number of nonzero elements should be small: at most $s_n' = O(p_n)$ among the $O(p_n^2)$ parameters, whether estimating a sparse covariance or correlation matrix, a sparse precision or inverse correlation matrix, or a sparse Cholesky factor.
Estimation of Large Precision Matrices Through Block Penalization
A one-step estimator is developed, and an oracle property consisting of a notion of block sign-consistency and asymptotic normality is proved, together with an operator-norm convergence result.
Model selection for Gaussian concentration graphs
A multivariate Gaussian graphical Markov model for an undirected graph G, also called a covariance selection model or concentration graph model, is defined in terms of the Markov properties, i.e. the conditional independences associated with G.
Introduction to Graphical Modelling
Graphical Models in Applied Multivariate Statistics.
This introduction to the use of graphical models in the description and modeling of multivariate systems covers conditional independence, several types of independence graphs, Gaussian models, issues in model selection, regression and decomposition.
...