# Towards Optimal Sparse Inverse Covariance Selection through Non-Convex Optimization

```bibtex
@article{Misra2017TowardsOS,
  title   = {Towards Optimal Sparse Inverse Covariance Selection through Non-Convex Optimization},
  author  = {Sidhant Misra and Marc Vuffray and Andrey Y. Lokhov and Michael Chertkov},
  journal = {ArXiv},
  year    = {2017},
  volume  = {abs/1703.04886}
}
```

We study the problem of reconstructing the graph of a sparse Gaussian Graphical Model from independent observations, which is equivalent to finding the non-zero elements of an inverse covariance matrix. For a model of size p and maximum degree d, information-theoretic lower bounds established in prior works show that perfect graph recovery requires at least d log p/κ samples, where κ is the minimum normalized non-zero entry of the inverse covariance matrix. Existing…
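The equivalence between graph recovery and locating the non-zero entries of the precision matrix can be illustrated with a minimal numpy sketch. This is not the paper's non-convex estimator: it assumes the easy regime n ≫ p, where inverting the empirical covariance and thresholding at roughly κ/2 already recovers the support; the paper targets the much harder high-dimensional regime.

```python
# Minimal sketch (assumption: n >> p): recover the support of a sparse
# precision matrix by inverting the empirical covariance and thresholding.
import numpy as np

rng = np.random.default_rng(0)
p, n = 5, 50000

# Chain-graph precision matrix Theta: non-zeros only on the diagonal
# and the first off-diagonals (minimum non-zero entry kappa = 0.4).
Theta = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
Sigma = np.linalg.inv(Theta)

# Draw i.i.d. samples from N(0, Sigma).
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)

# Invert the empirical covariance; entries below kappa/2 are declared zero.
Theta_hat = np.linalg.inv(np.cov(X, rowvar=False))
support_hat = np.abs(Theta_hat) > 0.2

print(np.array_equal(support_hat, Theta != 0))  # True: graph recovered
```

With n this large the estimation error per entry is on the order of 1/√n, far below the κ/2 threshold, which is why the naive estimator succeeds here and why the interesting question is how small n can be made.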

## 5 Citations

### Fast structure learning with modular regularization

- Computer Science · NeurIPS
- 2019

A novel method is introduced that leverages a newly discovered connection between information-theoretic measures and structured latent factor models to derive an optimization objective which encourages modular structures where each observed variable has a single latent parent.

### Mixed-integer convex optimization: outer approximation algorithms and modeling power

- Computer Science
- 2017

The first finite-time outer approximation methods for problems in general mixed-integer conic form are developed and implemented in an open-source solver, Pajarito, obtaining performance competitive with the state of the art.

### Optimization and Inference for Physical Flows on Networks (17w5165)

- Computer Science
- 2017

Particular interest is devoted to models in which the network structure is represented by graphs whose edge flows and nodal conditions are governed by sets of physical laws, which may be steady or dynamic, and deterministic or stochastic.

### Integrated multi-scale data analytics and machine learning for the distribution grid

- Computer Science · 2017 IEEE International Conference on Smart Grid Communications (SmartGridComm)
- 2017

The field of machine learning is considered as a subset of analytical techniques; its abilities and limitations in enabling the future distribution grid are examined, and a series of case studies is presented illustrating the potential benefits of advanced, local, multi-variate machine-learning-based applications.

### Low Complexity Gaussian Latent Factor Models and a Blessing of Dimensionality

- Computer Science · ArXiv
- 2017

This work designs a new approach to learning Gaussian latent factor models that benefits from dimensionality and relies on an unconstrained information-theoretic objective whose global optima correspond to structured latent factor generative models.

## References


### Information-theoretic bounds on model selection for Gaussian Markov random fields

- Computer Science, Mathematics · 2010 IEEE International Symposium on Information Theory
- 2010

The first result establishes a set of necessary conditions on n(p, d) for any recovery method to consistently estimate the underlying graph, and the second result provides necessary conditions for any decoder to produce an estimate Θ̂ of the true inverse covariance matrix Θ satisfying ‖Θ̂ − Θ‖∞ < δ in the elementwise ℓ∞-norm.

### High-dimensional Gaussian graphical model selection: walk summability and local separation criterion

- Computer Science · J. Mach. Learn. Res.
- 2012

This work identifies a set of graphs for which an efficient estimation algorithm exists, based on thresholding of empirical conditional covariances, and establishes structural consistency (or sparsistency) for the proposed algorithm when the number of samples is n = Ω(J_min^(-2) log p), where J_min is the minimum (absolute) edge potential of the graphical model.

### High-dimensional graphs and variable selection with the Lasso

- Computer Science
- 2006

It is shown that neighborhood selection with the Lasso is a computationally attractive alternative to standard covariance selection for sparse high-dimensional graphs; neighborhood selection estimates the conditional independence restrictions separately for each node and is hence equivalent to variable selection for Gaussian linear models.
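The neighborhood-selection idea can be sketched as follows. This is a hypothetical minimal implementation, not the reference's code: each variable is regressed on all others with the Lasso (solved here by naive coordinate descent), the non-zero coefficients define that node's neighbors, and edges are combined with the OR rule; `lasso_cd` and `neighborhood_graph` are illustrative names.

```python
# Hypothetical sketch of neighborhood selection with the Lasso.
import numpy as np

def lasso_cd(A, y, lam, iters=200):
    """Naive coordinate descent for min_b 0.5/n * ||y - A b||^2 + lam * ||b||_1."""
    n, d = A.shape
    b = np.zeros(d)
    col_sq = (A ** 2).sum(axis=0) / n
    for _ in range(iters):
        for j in range(d):
            r = y - A @ b + A[:, j] * b[j]          # partial residual (b_j excluded)
            rho = A[:, j] @ r / n
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return b

def neighborhood_graph(X, lam):
    """Regress each node on the rest; non-zero Lasso coefficients give its neighbors."""
    n, p = X.shape
    adj = np.zeros((p, p), dtype=bool)
    for i in range(p):
        others = [j for j in range(p) if j != i]
        adj[i, others] = lasso_cd(X[:, others], X[:, i], lam) != 0
    return adj | adj.T                               # OR rule for combining neighborhoods

# Demo on a chain graph with p = 4 nodes.
rng = np.random.default_rng(1)
p, n = 4, 4000
Theta = np.eye(p) + 0.35 * (np.eye(p, k=1) + np.eye(p, k=-1))
X = rng.multivariate_normal(np.zeros(p), np.linalg.inv(Theta), size=n)
G = neighborhood_graph(X, lam=0.1)
true_G = (Theta != 0) & ~np.eye(p, dtype=bool)
print(np.array_equal(G, true_G))
```

The regularization level lam plays the role of the threshold separating true partial correlations from sampling noise; the reference's analysis makes this choice precise as a function of n and p.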

### A Constrained ℓ1 Minimization Approach to Sparse Precision Matrix Estimation

- Mathematics, Computer Science
- 2011

A constrained ℓ1 minimization method is proposed for estimating a sparse inverse covariance matrix based on a sample of n i.i.d. p-variate random variables; it is applied to analyze a breast cancer dataset and is found to perform favorably compared with existing methods.

### Model Selection in Gaussian Graphical Models: High-Dimensional Consistency of ℓ1-regularized MLE

- Computer Science, Mathematics · NIPS
- 2008

This work considers the problem of estimating the graph structure associated with a Gaussian Markov random field (GMRF) from i.i.d. samples and provides sufficient conditions on (n, p, d) for the ℓ1-regularized MLE estimator to recover all the edges of the graph with high probability.

### Estimating Sparse Precision Matrix: Optimal Rates of Convergence and Adaptive Estimation

- Computer Science
- 2012

The upper and lower bounds together yield the optimal rates of convergence for sparse precision matrix estimation and show that the ACLIME estimator is adaptively minimax rate optimal for a collection of parameter spaces and a range of matrix norm losses simultaneously.

### First-Order Methods for Sparse Covariance Selection

- Computer Science · SIAM J. Matrix Anal. Appl.
- 2008

This work first formulates a convex relaxation of this combinatorial problem, then details two efficient first-order algorithms with low memory requirements to solve large-scale, dense problem instances.

### Interaction Screening: Efficient and Sample-Optimal Learning of Ising Models

- Computer Science, Mathematics · NIPS
- 2016

We consider the problem of learning the underlying graph of an unknown Ising model on p spins from a collection of i.i.d. samples generated from the model. We suggest a new estimator that is…

### Optimal structure and parameter learning of Ising models

- Computer Science · Science Advances
- 2018

This study shows that the interaction screening method is an exact, tractable, and optimal technique that universally solves the inverse Ising problem.