Corpus ID: 234790143

EiGLasso for Scalable Sparse Kronecker-Sum Inverse Covariance Estimation

Jun Ho Yoon and Seyoung Kim
In many real-world datasets, complex dependencies are present both among samples and among features. The Kronecker sum, or Cartesian product, of two graphs, one modeling dependencies across features and the other across samples, has been used as the inverse covariance matrix of a matrix-variate Gaussian distribution, as an alternative to the Kronecker-product inverse covariance matrix, due to its more intuitive sparse structure. However, the existing methods for sparse Kronecker-sum inverse covariance…
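The Kronecker-sum structure described in the abstract can be illustrated in a few lines of NumPy; the small matrices Theta (feature graph) and Psi (sample graph) below are made-up examples, not values from the paper:

```python
import numpy as np

def kron_sum(theta, psi):
    """Kronecker sum: Theta (+) Psi = Theta x I_q + I_p x Psi."""
    p, q = theta.shape[0], psi.shape[0]
    return np.kron(theta, np.eye(q)) + np.kron(np.eye(p), psi)

# Two small sparse precision matrices (illustrative values)
theta = np.array([[2.0, -1.0, 0.0],
                  [-1.0, 2.0, 0.0],
                  [0.0, 0.0, 2.0]])   # feature graph (p = 3)
psi = np.array([[1.5, 0.5],
                [0.5, 1.5]])          # sample graph (q = 2)

omega = kron_sum(theta, psi)          # (p*q) x (p*q) precision matrix

# The sparsity pattern of the Kronecker sum is the union of the two
# per-graph edge patterns, whereas the Kronecker product multiplies
# the patterns together and is typically denser.
print(np.count_nonzero(omega), np.count_nonzero(np.kron(theta, psi)))  # 16 20
```

This is why the Kronecker sum is described as having a more intuitive sparse structure: each nonzero in the joint precision matrix corresponds directly to an edge in either the feature graph or the sample graph.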

EiGLasso: Scalable Estimation of Cartesian Product of Sparse Inverse Covariance Matrices
EiGLasso is introduced, a highly efficient optimization method for estimating the Kronecker-sum-structured inverse covariance matrix from matrix-variate data; an alternative, simpler approach to handling the non-identifiability of the parameters than the one used in previous work is also described.
Covariance Estimation in High Dimensions Via Kronecker Product Expansions
The results establish that PRLS converges significantly faster than the standard sample covariance matrix (SCM) estimator, show that a class of block-Toeplitz covariance matrices can be approximated with low separation rank, and give bounds on the minimal separation rank r that ensures a given level of bias.
Tensor graphical lasso (TeraLasso)
The paper introduces a multiway tensor generalization of the bigraphical lasso, which uses a two-way sparse Kronecker-sum multivariate normal model for the precision matrix to model the data parsimoniously.
QUIC: quadratic approximation for sparse inverse covariance estimation
This work proposes a novel algorithm for solving the resulting optimization problem, a regularized log-determinant program, based on Newton's method with a quadratic approximation, modified to leverage the structure of the sparse Gaussian MLE problem.
BIG & QUIC: Sparse Inverse Covariance Estimation for a Million Variables
An algorithm, BIGQUIC, is developed that can solve one-million-dimensional ℓ1-regularized Gaussian MLE problems on a single machine with bounded memory, and can achieve super-linear or even quadratic convergence rates.
Exact Covariance Thresholding into Connected Components for Large-Scale Graphical Lasso
For a range of values of λ, this proposal splits a large graphical lasso problem into smaller tractable problems, making it possible to solve an otherwise infeasible large-scale problem.
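The thresholding idea summarized above can be sketched in a few lines: drop entries of the sample covariance with magnitude at most λ, and the graphical lasso solution is block-diagonal over the connected components of the remaining graph. A minimal NumPy illustration follows; the matrix S and threshold are made-up values, not from the paper:

```python
import numpy as np

def thresholded_components(S, lam):
    """Connected components of the graph with an edge (i, j) whenever
    |S_ij| > lam for i != j; the graphical lasso solution with penalty
    lam is block-diagonal over these vertex sets."""
    p = S.shape[0]
    adj = np.abs(S) > lam
    np.fill_diagonal(adj, False)
    labels = -np.ones(p, dtype=int)
    comp = 0
    for start in range(p):
        if labels[start] != -1:
            continue
        stack = [start]          # simple DFS over the thresholded graph
        labels[start] = comp
        while stack:
            u = stack.pop()
            for v in np.flatnonzero(adj[u]):
                if labels[v] == -1:
                    labels[v] = comp
                    stack.append(v)
        comp += 1
    return labels

# Illustrative 4x4 sample covariance: variables {0, 1} and {2, 3}
# decouple once entries below 0.2 are thresholded away.
S = np.array([[1.0, 0.6, 0.1, 0.0],
              [0.6, 1.0, 0.0, 0.1],
              [0.1, 0.0, 1.0, 0.7],
              [0.0, 0.1, 0.7, 1.0]])
print(thresholded_components(S, lam=0.2))  # [0 0 1 1]
```

Each component can then be handed to a graphical lasso solver independently, which is what makes otherwise infeasible large-scale problems tractable.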
Gemini: Graph estimation with matrix variate normal instances
This paper develops new methods for estimating the graphical structures and underlying parameters, namely the row and column covariance and inverse covariance matrices, from matrix-variate data, and provides simulation evidence that one can recover the graphical structures as well as estimate the precision matrices, as predicted by theory.
Sparse inverse covariance estimation with the graphical lasso.
Using a coordinate descent procedure for the lasso, a simple algorithm is developed that solves a 1000-node problem in at most a minute and is 30-4000 times faster than competing methods.
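As a concrete sketch of the objective these graphical lasso methods minimize, the following NumPy snippet evaluates the ℓ1-penalized negative Gaussian log-likelihood at a candidate precision matrix; it is a minimal evaluation of the objective, not the coordinate-descent solver itself:

```python
import numpy as np

def graphical_lasso_objective(theta, S, lam):
    """Objective minimized by the graphical lasso:
    -log det(Theta) + tr(S @ Theta) + lam * ||Theta||_1 (off-diagonal)."""
    sign, logdet = np.linalg.slogdet(theta)
    assert sign > 0, "Theta must be positive definite"
    off_l1 = np.abs(theta).sum() - np.abs(np.diag(theta)).sum()
    return -logdet + np.trace(S @ theta) + lam * off_l1

# At Theta = I with S = I, the log-det term is 0, the trace term is 3,
# and the off-diagonal penalty is 0.
print(graphical_lasso_objective(np.eye(3), np.eye(3), lam=0.1))  # 3.0
```

The off-diagonal ℓ1 penalty is what induces sparsity in the estimated precision matrix, i.e., a sparse conditional-independence graph.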
The Bigraphical Lasso
The bigraphical lasso is introduced, an estimator for precision matrices of matrix normals based on the Cartesian product of graphs, a prominent product in spectral graph theory, which has appealing properties for regression, enhanced sparsity, and interpretability.
Sparse Matrix Graphical Models
This article proposes a novel sparse matrix graphical model that characterizes the underlying conditional independence structure of matrix data, extending the sparse vector-variate graphical model by penalizing, respectively, the two precision matrices corresponding to the rows and columns.