# EiGLasso for Scalable Sparse Kronecker-Sum Inverse Covariance Estimation

```bibtex
@article{Yoon2021EiGLassoFS,
  title   = {EiGLasso for Scalable Sparse Kronecker-Sum Inverse Covariance Estimation},
  author  = {Jun Ho Yoon and Seyoung Kim},
  journal = {ArXiv},
  year    = {2021},
  volume  = {abs/2105.09872}
}
```

In many real-world datasets, complex dependencies are present both among samples and among features. The Kronecker sum, or Cartesian product, of two graphs, one modeling dependencies across features and the other across samples, has been used as the inverse covariance matrix of a matrix-variate Gaussian distribution, as an alternative to the Kronecker-product inverse covariance matrix, because of its more intuitive sparse structure. However, the existing methods for sparse Kronecker-sum inverse covariance…
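As a minimal sketch of the model structure described above (the matrices `Theta` and `Psi` below are toy examples, not from the paper): the Kronecker sum of a feature-graph precision matrix Θ (p × p) and a sample-graph precision matrix Ψ (n × n) is Θ ⊕ Ψ = Θ ⊗ I_n + I_p ⊗ Ψ, which stays sparse whenever Θ and Ψ are sparse.

```python
import numpy as np

# Toy precision matrices (assumed for illustration only):
# Theta models dependencies among p = 2 features,
# Psi models dependencies among n = 3 samples.
Theta = np.array([[2.0, -0.5],
                  [-0.5, 2.0]])
Psi = np.array([[1.5, -0.3, 0.0],
                [-0.3, 1.5, -0.3],
                [0.0, -0.3, 1.5]])

p, n = Theta.shape[0], Psi.shape[0]

# Kronecker sum: Theta ⊗ I_n + I_p ⊗ Psi, the (np x np) precision
# matrix of the matrix-variate Gaussian. An off-diagonal entry is
# nonzero only if the corresponding edge exists in exactly one of the
# two graphs -- the Cartesian graph product.
kron_sum = np.kron(Theta, np.eye(n)) + np.kron(np.eye(p), Psi)

print(kron_sum.shape)  # (6, 6)
```

Compare this with the Kronecker product Θ ⊗ Ψ, whose sparsity pattern multiplies the two graphs' edges and is harder to interpret edge-by-edge.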

## References

Showing 1–10 of 45 references.

EiGLasso: Scalable Estimation of Cartesian Product of Sparse Inverse Covariance Matrices

- Computer Science, Mathematics · UAI
- 2020

EiGLasso, a highly efficient optimization method for estimating the Kronecker-sum-structured inverse covariance matrix from matrix-variate data, is introduced, along with a simpler approach to handling the non-identifiability of parameters than the one used in previous work.

Covariance Estimation in High Dimensions Via Kronecker Product Expansions

- Computer Science, Mathematics · IEEE Transactions on Signal Processing
- 2013

The results establish that PRLS converges significantly faster than the standard sample covariance matrix (SCM) estimator, show that a class of block-Toeplitz covariance matrices is approximable with low separation rank, and give bounds on the minimal separation rank r that ensures a given level of bias.

Tensor graphical lasso (TeraLasso)

- Computer Science · Journal of the Royal Statistical Society: Series B (Statistical Methodology)
- 2019

The paper introduces a multiway tensor generalization of the bigraphical lasso which uses a two‐way sparse Kronecker sum multivariate normal model for the precision matrix to model parsimoniously…

QUIC: quadratic approximation for sparse inverse covariance estimation

- Computer Science · J. Mach. Learn. Res.
- 2014

This work proposes a novel algorithm for solving the resulting optimization problem, a regularized log-determinant program, based on Newton's method with a quadratic approximation, with modifications that leverage the structure of the sparse Gaussian MLE problem.

BIG & QUIC: Sparse Inverse Covariance Estimation for a Million Variables

- Computer Science · NIPS
- 2013

An algorithm, BIGQUIC, is developed that can solve 1-million-dimensional l1-regularized Gaussian MLE problems on a single machine with bounded memory, and can achieve super-linear or even quadratic convergence rates.

Exact Covariance Thresholding into Connected Components for Large-Scale Graphical Lasso

- Computer Science, Mathematics · J. Mach. Learn. Res.
- 2012

For a range of values of λ, this proposal splits a large graphical lasso problem into smaller tractable problems, making it possible to solve an otherwise infeasible large-scale problem.
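The thresholding idea summarized above can be sketched as follows (a hedged illustration, not the paper's reference implementation): zero out off-diagonal sample-covariance entries with magnitude at most λ, then take the connected components of the resulting graph; each component can be solved as an independent, smaller graphical lasso problem.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

def screen_blocks(S, lam):
    """Split a graphical lasso problem with penalty lam into independent
    blocks by thresholding the sample covariance S (a sketch of the
    exact covariance-thresholding screening rule)."""
    A = (np.abs(S) > lam).astype(int)   # adjacency: entries exceeding lam
    np.fill_diagonal(A, 1)              # keep every variable in some block
    n_comp, labels = connected_components(csr_matrix(A), directed=False)
    return [np.flatnonzero(labels == c) for c in range(n_comp)]

# Toy sample covariance: two groups of variables, weakly coupled.
S = np.array([[1.00, 0.80, 0.05, 0.02],
              [0.80, 1.00, 0.03, 0.01],
              [0.05, 0.03, 1.00, 0.70],
              [0.02, 0.01, 0.70, 1.00]])

blocks = screen_blocks(S, lam=0.1)
print(blocks)  # two blocks: variables {0, 1} and {2, 3}
```

Because the split is exact for the graphical lasso objective, solving each block separately recovers the same solution as the full problem at that λ.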

Sparse inverse covariance estimation with the graphical lasso.

- Computer Science · Biostatistics
- 2008

Using a coordinate descent procedure for the lasso, a simple algorithm is developed that solves a 1000-node problem in at most a minute and is 30-4000 times faster than competing methods.
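A minimal usage sketch of the graphical lasso, using scikit-learn's implementation rather than the authors' original code (the toy precision matrix and sample size below are assumptions for illustration):

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

# Toy ground truth: two independent pairs of correlated variables.
true_prec = np.array([[2.0, -0.9, 0.0, 0.0],
                      [-0.9, 2.0, 0.0, 0.0],
                      [0.0, 0.0, 2.0, -0.9],
                      [0.0, 0.0, -0.9, 2.0]])

rng = np.random.default_rng(0)
X = rng.multivariate_normal(np.zeros(4), np.linalg.inv(true_prec), size=500)

# l1-penalized maximum likelihood estimate of the precision matrix;
# alpha controls the sparsity of the estimate.
model = GraphicalLasso(alpha=0.05).fit(X)
print(np.round(model.precision_, 2))
```

Larger values of `alpha` drive more off-diagonal entries of `model.precision_` exactly to zero, i.e., remove edges from the estimated graph.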

Gemini: Graph estimation with matrix variate normal instances

- Mathematics, Computer Science
- 2014

This paper develops new methods for estimating, from matrix-variate data, the graphical structures and underlying parameters, namely the row and column covariance and inverse covariance matrices, and provides simulation evidence that the graphical structures can be recovered and the precision matrices estimated, as predicted by theory.

The Bigraphical Lasso

- Computer Science · ICML
- 2013

The bigraphical lasso, an estimator for precision matrices of matrix normals based on the Cartesian product of graphs, a prominent product in spectral graph theory, is introduced; it has appealing properties for regression, enhanced sparsity, and interpretability.

New Insights and Faster Computations for the Graphical Lasso

- Computer Science, Mathematics
- 2011

A very simple necessary and sufficient condition can be employed to determine whether the estimated inverse covariance matrix will be block diagonal, and if so, then to identify the blocks in the graphical lasso solution.