# Exact Covariance Thresholding into Connected Components for Large-Scale Graphical Lasso

@article{Mazumder2012ExactCT, title={Exact Covariance Thresholding into Connected Components for Large-Scale Graphical Lasso}, author={Rahul Mazumder and Trevor J. Hastie}, journal={Journal of Machine Learning Research}, year={2012}, volume={13}, pages={781--794}}

We consider the sparse inverse covariance regularization problem, or graphical lasso, with regularization parameter λ. Suppose the sample covariance graph formed by thresholding the entries of the sample covariance matrix at λ decomposes into connected components. We show that the vertex partition induced by the connected components of the thresholded sample covariance graph (at λ) is exactly equal to that induced by the connected components of the estimated concentration graph, obtained by…
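The screening rule in the abstract is easy to sketch: threshold the off-diagonal entries of the sample covariance matrix at λ, find the connected components of the resulting graph, and solve the graphical lasso independently on each block. A minimal illustration (the helper name `threshold_components` and the toy matrix are ours, not from the paper):

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

def threshold_components(S, lam):
    """Partition variables into connected components of the thresholded
    sample covariance graph: i ~ j iff i != j and |S[i, j]| > lam.

    Per the paper, this partition coincides with the connected components
    of the graphical lasso estimate at the same lambda, so each block can
    be solved as an independent subproblem.
    """
    adj = (np.abs(S) > lam).astype(int)
    np.fill_diagonal(adj, 0)  # ignore the diagonal; only edges matter
    n_comp, labels = connected_components(csr_matrix(adj), directed=False)
    return [np.flatnonzero(labels == k) for k in range(n_comp)]

# Toy example: cutting entries <= 0.2 leaves two uncoupled 2-variable blocks.
S = np.array([[1.00, 0.80, 0.05, 0.00],
              [0.80, 1.00, 0.00, 0.10],
              [0.05, 0.00, 1.00, 0.90],
              [0.00, 0.10, 0.90, 1.00]])
blocks = threshold_components(S, lam=0.2)  # components {0, 1} and {2, 3}
```

Since solving the graphical lasso scales cubically in the dimension, splitting one large problem into many small blocks this way can reduce the cost dramatically.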

## 205 Citations

Equivalence of graphical lasso and thresholding for sparse graphs

- Mathematics, Computer Science
- 2014 52nd Annual Allerton Conference on Communication, Control, and Computing (Allerton)
- 2014

A simple condition is derived under which the computationally expensive graphical lasso behaves the same as the simple heuristic of thresholding; the condition depends only on the solution of the graphical lasso and makes no direct use of the sample correlation matrix or the regularization coefficient.

Linear-Time Algorithm for Learning Large-Scale Sparse Graphical Models

- Computer Science, Mathematics
- IEEE Access
- 2019

A novel Newton conjugate gradient algorithm is described that efficiently solves the GGL with general structures and is highly efficient in practice; a recursive closed-form solution is also given for the problem when the thresholded sample covariance matrix is chordal.

New Insights and Faster Computations for the Graphical Lasso

- Mathematics
- 2011

We consider the graphical lasso formulation for estimating a Gaussian graphical model in the high-dimensional setting. This approach entails estimating the inverse covariance matrix under a…

Block-Diagonal Covariance Selection for High-Dimensional Gaussian Graphical Models

- Mathematics, Computer Science
- arXiv
- 2015

An application to a real gene expression dataset with a limited sample size is presented: the dimension reduction allows attention to be objectively focused on interactions among smaller subsets of genes, leading to a more parsimonious and interpretable modular network.

Closed-Form Solution and Sparsity Path for Inverse Covariance Estimation Problem

- Mathematics, Computer Science
- 2018 Annual American Control Conference (ACC)
- 2018

It is proved that, under generic conditions, each change in the sparsity pattern corresponds to the addition or removal of a single edge of the graph, and it is shown that the graphical lasso, as a conic optimization problem, has a closed-form solution if an acyclic graph is sought.

Graphical Lasso and Thresholding: Equivalence and Closed-form Solutions

- Computer Science, Mathematics
- J. Mach. Learn. Res.
- 2019

It is proved that the GL method, as a conic optimization problem, has an explicit closed-form solution if the thresholded sample covariance matrix has an acyclic structure, and it is shown that the approximation error of the derived explicit formula decreases exponentially fast with the length of the minimum-length cycle of the sparsity graph.

Sparse Inverse Covariance Estimation for Chordal Structures

- Mathematics, Computer Science
- 2018 European Control Conference (ECC)
- 2018

This paper shows that the GL and thresholding equivalence conditions can be significantly simplified and are expected to hold for high-dimensional problems if the thresholded sample covariance matrix has a chordal structure, and it derives a closed-form solution of the GL for graphs with chordal structures.

Exact Hybrid Covariance Thresholding for Joint Graphical Lasso

- Computer Science, Mathematics
- ECML/PKDD
- 2015

A novel hybrid covariance thresholding algorithm is proposed that can effectively identify zero entries in the precision matrices and split a large joint graphical lasso problem into many small subproblems, which can be solved very fast.

The Graphical Lasso: New Insights and Alternatives

- Mathematics, Medicine
- Electronic Journal of Statistics
- 2012

This paper explains how GLASSO solves the dual of the graphical lasso penalized likelihood by block coordinate ascent, and proposes similar primal algorithms, P-GLASSO and DP-GLASSO, that also operate by block coordinate descent, with Θ as the optimization target.

Structural Pursuit Over Multiple Undirected Graphs

- Mathematics, Medicine
- Journal of the American Statistical Association
- 2014

An efficient method is developed based on difference convex programming, the augmented Lagrangian method, and blockwise coordinate descent; through a simple necessary and sufficient partition rule, it scales to hundreds of graphs with thousands of nodes.

## References

Showing 1-10 of 28 references.

New Insights and Faster Computations for the Graphical Lasso

- Mathematics
- 2011

We consider the graphical lasso formulation for estimating a Gaussian graphical model in the high-dimensional setting. This approach entails estimating the inverse covariance matrix under a…

High-dimensional graphs and variable selection with the Lasso

- Mathematics
- 2006

The pattern of zero entries in the inverse covariance matrix of a multivariate normal distribution corresponds to conditional independence restrictions between variables. Covariance selection aims at…
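The correspondence between zeros in the precision matrix and conditional independence is easy to verify numerically. A minimal sketch (the tridiagonal precision matrix below is an illustrative choice of ours, not taken from the paper):

```python
import numpy as np

# Precision (inverse covariance) matrix for 3 jointly Gaussian variables
# with Theta[0, 2] = 0, i.e. X0 and X2 conditionally independent given X1.
Theta = np.array([[ 2.0, -0.8,  0.0],
                  [-0.8,  2.0, -0.8],
                  [ 0.0, -0.8,  2.0]])
Sigma = np.linalg.inv(Theta)  # the corresponding covariance matrix

# The marginal covariance of X0 and X2 is NOT zero (dependence flows
# through X1)...
marginal_cov_02 = Sigma[0, 2]

# ...but their partial correlation given X1 vanishes, mirroring the zero
# entry in Theta:
partial_corr_02 = -Theta[0, 2] / np.sqrt(Theta[0, 0] * Theta[2, 2])
```

This is exactly why covariance selection penalizes the precision matrix rather than the covariance matrix: zeros in Θ, not in Σ, encode the missing edges of the Gaussian graphical model.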

Sparse inverse covariance estimation with the graphical lasso.

- Mathematics, Medicine
- Biostatistics
- 2008

Using a coordinate descent procedure for the lasso, a simple algorithm is developed that solves a 1000-node problem in at most a minute and is 30-4000 times faster than competing methods.
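This coordinate descent glasso is available off the shelf; a minimal usage sketch with scikit-learn (assuming scikit-learn >= 0.22, where the estimator is named `GraphicalLasso`; the data here are random, for illustration only):

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
# 200 samples of a 5-dimensional Gaussian; with a sufficiently large
# penalty alpha, the estimated precision matrix becomes sparse.
X = rng.standard_normal((200, 5))

model = GraphicalLasso(alpha=0.3).fit(X)
precision = model.precision_  # estimated sparse inverse covariance
```

The `alpha` parameter plays the role of λ in the thresholding result above, so in practice one can first split the variables into connected components at a given `alpha` and then fit an estimator of this kind on each block.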

Sparse Inverse Covariance Selection via Alternating Linearization Methods

- Computer Science, Mathematics
- NIPS
- 2010

This paper proposes a first-order method based on an alternating linearization technique that exploits the problem's special structure; in particular, the subproblems solved in each iteration have closed-form solutions.

High-dimensional Covariance Estimation Based On Gaussian Graphical Models

- Mathematics, Computer Science
- J. Mach. Learn. Res.
- 2011

It is shown that, under suitable conditions, this approach yields consistent estimation of the graphical structure and fast convergence rates in operator and Frobenius norm for the covariance matrix and its inverse, using the maximum likelihood estimator.

Sparse permutation invariant covariance estimation

- Mathematics
- 2008

The paper proposes a method for constructing a sparse estimator for the inverse covariance (concentration) matrix in high-dimensional settings. The estimator uses a penalized normal likelihood…

Model Selection Through Sparse Maximum Likelihood Estimation for Multivariate Gaussian or Binary Data

- Computer Science, Mathematics
- J. Mach. Learn. Res.
- 2008

This work considers the problem of estimating the parameters of a Gaussian or binary distribution in such a way that the resulting undirected graphical model is sparse, and presents two new algorithms for solving problems with at least a thousand nodes in the Gaussian case.

Sparse Inverse Covariance Estimation via an Adaptive Gradient-Based Method

- Mathematics
- 2011

We study the problem of estimating, from data, a sparse approximation to the inverse covariance matrix. Estimating a sparsity-constrained inverse covariance matrix is a key component in Gaussian…

Adaptive First-Order Methods for General Sparse Inverse Covariance Selection

- Mathematics, Computer Science
- SIAM J. Matrix Anal. Appl.
- 2010

The computational results demonstrate that the proposed algorithmic framework and methods can solve problems with at least a thousand variables and nearly half a million constraints within a reasonable amount of time, and that the ASPG method generally outperforms the ANS method and glasso.

Alternating Direction Methods for Sparse Covariance Selection *

- 2009

The mathematical model of the widely used sparse covariance selection problem (SCSP) is an NP-hard combinatorial problem, whereas it can be well approximated by a convex relaxation problem whose…