# Tensor graphical lasso (TeraLasso)

```bibtex
@article{Greenewald2019TensorGL,
  title   = {Tensor graphical lasso (TeraLasso)},
  author  = {Kristjan H. Greenewald and Shuheng Zhou and Alfred O. Hero},
  journal = {Journal of the Royal Statistical Society: Series B (Statistical Methodology)},
  year    = {2019}
}
```

This paper introduces a multi-way tensor generalization of the Bigraphical Lasso (BiGLasso), which uses a two-way sparse Kronecker-sum multivariate-normal model for the precision matrix to parsimoniously model conditional dependence relationships of matrix-variate data based on the Cartesian product of graphs. We call this generalization the **Te**nsor g**ra**phical Lasso (TeraLasso). We demonstrate using theory and examples that the TeraLasso model can be accurately and scalably estimated…
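As a minimal illustration of the Kronecker-sum structure that BiGLasso uses in two ways and TeraLasso generalizes to tensors, the sketch below builds a joint precision matrix Ω = Ψ ⊕ Θ = Ψ ⊗ I_q + I_p ⊗ Θ from row and column precision factors. The factor values are made-up toy numbers, not from the paper:

```python
import numpy as np

def kronecker_sum(Psi, Theta):
    """Kronecker sum Psi ⊕ Theta = Psi ⊗ I_q + I_p ⊗ Theta."""
    p, q = Psi.shape[0], Theta.shape[0]
    return np.kron(Psi, np.eye(q)) + np.kron(np.eye(p), Theta)

# Toy row/column precision factors (illustrative values only).
Psi = np.array([[ 2.0, -0.5],
                [-0.5,  2.0]])
Theta = np.array([[1.5, 0.3],
                  [0.3, 1.5]])

Omega = kronecker_sum(Psi, Theta)  # 4 x 4 joint precision matrix
```

A useful property of this structure: the eigenvalues of Ψ ⊕ Θ are exactly the pairwise sums of the factors' eigenvalues, which is part of what makes Kronecker-sum models scalable to estimate.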

#### 13 Citations

EiGLasso for Scalable Sparse Kronecker-Sum Inverse Covariance Estimation

- Computer Science, Mathematics
- ArXiv
- 2021

EiGLasso is introduced, a highly scalable method for sparse Kronecker-sum inverse covariance estimation, based on Newton's method combined with eigendecomposition of the two graphs to exploit the structure of the Kronecker sum; it achieves two to three orders-of-magnitude speed-up compared to existing methods.

The Sylvester Graphical Lasso (SyGlasso)

- Computer Science, Mathematics
- AISTATS
- 2020

The Sylvester graphical lasso (SyGlasso) is introduced that captures multiway dependencies present in tensor-valued data and can simultaneously estimate both the brain connectivity and its temporal dependencies.

Jointly Modeling and Clustering Tensors in High Dimensions

- Mathematics
- 2021

We consider the problem of jointly modeling and clustering populations of tensors by introducing a high-dimensional tensor mixture model with heterogeneous covariances. To effectively tackle the high…

EiGLasso: Scalable Estimation of Cartesian Product of Sparse Inverse Covariance Matrices

- Computer Science
- UAI
- 2020

EiGLasso is introduced, a highly efficient optimization method for estimating the Kronecker-sum-structured inverse covariance matrix from matrix-variate data, together with an alternative, simpler approach for handling the non-identifiability of parameters than the one used in previous work.

Online Graph Topology Learning from Matrix-valued Time Series

- Computer Science, Mathematics
- ArXiv
- 2021

A novel multivariate autoregressive model is introduced to infer the graph topology encoded in the coefficient matrix which captures the sparse Granger causality dependency structure present in such matrix-valued time series.

SG-PALM: a Fast Physically Interpretable Tensor Graphical Model

- Computer Science, Mathematics
- ICML
- 2021

The scalability and accuracy of SG-PALM are demonstrated for an important but challenging climate prediction problem: spatio-temporal forecasting of solar flares from multimodal imaging data.

Kernel-Based Graph Learning From Smooth Signals: A Functional Viewpoint

- Mathematics, Computer Science
- IEEE Transactions on Signal and Information Processing over Networks
- 2021

A novel graph learning framework is proposed that incorporates prior information along the node and observation sides, in particular covariates that help explain the dependency structures in graph signals. A novel graph-based regularisation method is developed that enables the model to capture both the dependency explained by the graph and the dependency due to graph signals observed under different but related circumstances.

Structure in modern data and how to exploit it: some signal processing applications

- Computer Science
- 2020

This dissertation investigates the advantages of structure exploitation in three applications: (i) signal detection and classification under the union-of-subspaces model, (ii) learning product graphs underlying smooth graph signals, and (iii) distributed radar imaging under position errors and unsynchronized clocks.

Tensors in Modern Statistical Learning

- Computer Science
- 2021

This survey provides an overview of tensor analysis in modern statistical learning, including tensor supervised learning, tensor unsupervised learning, tensor reinforcement learning, and tensor deep learning.

Correlation Tensor Decomposition and Its Application in Spatial Imaging Data

- Physics
- Journal of the American Statistical Association
- 2021

Multi-dimensional tensor data has gained increasing attention in recent years, especially in biomedical imaging analyses. However, most existing tensor models are only based on the mean information...

#### References

Showing 1–10 of 78 references

On Convergence of Kronecker Graphical Lasso Algorithms

- Mathematics, Computer Science
- IEEE Transactions on Signal Processing
- 2013

This paper studies iteration convergence of Kronecker graphical lasso (KGLasso) algorithms for estimating the covariance of an i.i.d. Gaussian random sample under a sparse Kronecker-product covariance model, and establishes MSE convergence rates showing that KGLasso has significantly faster asymptotic convergence than GLasso and FF.

The Bigraphical Lasso

- Mathematics, Computer Science
- ICML
- 2013

The bigraphical lasso is introduced, an estimator for precision matrices of matrix normals based on the Cartesian product of graphs, a prominent product in spectral graph theory with appealing properties for regression, enhanced sparsity, and interpretability.

Equivariant and Scale-Free Tucker Decomposition Models

- Mathematics, Computer Science
- 2013

This article develops methodology to obtain low-rank model-based representations of continuous, discrete and ordinal data arrays using a semiparametric transformation model, and shows how orthogonally equivariant parameter estimates can be obtained from Bayesian procedures under invariant prior distributions.

Sparse permutation invariant covariance estimation

- Mathematics
- 2008

The paper proposes a method for constructing a sparse estimator for the inverse covariance (concentration) matrix in high-dimensional settings. The estimator uses a penalized normal likelihood…

Gemini: Graph estimation with matrix variate normal instances

- Mathematics
- 2014

Undirected graphs can be used to describe matrix variate distributions. In this paper, we develop new methods for estimating the graphical structures and underlying parameters, namely, the row and…

Tensor Decompositions and Sparse Log-Linear Models

- Mathematics, Medicine
- Annals of statistics
- 2017

A new collapsed Tucker class of tensor decompositions is proposed, bridging existing PARAFAC and Tucker decompositions and providing a more flexible framework for parsimoniously characterizing multivariate categorical data.

High-dimensional graphs and variable selection with the Lasso

- Mathematics
- 2006

The pattern of zero entries in the inverse covariance matrix of a multivariate normal distribution corresponds to conditional independence restrictions between variables. Covariance selection aims at…

Covariance Estimation in High Dimensions Via Kronecker Product Expansions

- Mathematics, Computer Science
- IEEE Transactions on Signal Processing
- 2013

The results establish that PRLS has significantly faster convergence than the standard sample covariance matrix (SCM) estimator, show that a class of block-Toeplitz covariance matrices can be approximated with low separation rank, and give bounds on the minimal separation rank r that ensures a given level of bias.
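The building block of such Kronecker-product expansions is the nearest-Kronecker-product problem, classically solved by the Van Loan–Pitsianis rearrangement: the best product approximation of a pq × pq matrix corresponds to the best rank-1 approximation of a rearranged p² × q² matrix. This sketch illustrates that step only (not the PRLS algorithm itself), on a toy exactly-separable example:

```python
import numpy as np

def nearest_kron_factors(Sigma, p, q):
    # Van Loan-Pitsianis rearrangement: reorder the entries of the
    # pq x pq matrix Sigma into a p^2 x q^2 matrix R whose best
    # rank-1 approximation yields the nearest Kronecker factors.
    R = (Sigma.reshape(p, q, p, q)
              .transpose(0, 2, 1, 3)
              .reshape(p * p, q * q))
    U, s, Vt = np.linalg.svd(R, full_matrices=False)
    A = np.sqrt(s[0]) * U[:, 0].reshape(p, p)
    B = np.sqrt(s[0]) * Vt[0].reshape(q, q)
    return A, B

# If Sigma is exactly a Kronecker product, the factors are recovered
# up to a shared sign/scale, so kron(A, B) reproduces Sigma.
A0 = np.array([[2.0, 0.5], [0.5, 1.0]])
B0 = np.array([[1.0, 0.2], [0.2, 3.0]])
Sigma = np.kron(A0, B0)
A, B = nearest_kron_factors(Sigma, 2, 2)
```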

High-dimensional Covariance Estimation Based On Gaussian Graphical Models

- Mathematics, Computer Science
- J. Mach. Learn. Res.
- 2011

It is shown that under suitable conditions, this approach yields consistent estimation in terms of graphical structure and fast convergence rates with respect to the operator and Frobenius norm for the covariance matrix and its inverse using the maximum likelihood estimator.

Sparse inverse covariance estimation with the graphical lasso.

- Mathematics, Medicine
- Biostatistics
- 2008

Using a coordinate descent procedure for the lasso, a simple algorithm is developed that solves a 1000-node problem in at most a minute and is 30–4000 times faster than competing methods.
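The graphical lasso described above is the single-graph method that TeraLasso generalizes. A minimal sketch of fitting it, using scikit-learn's `GraphicalLasso` implementation rather than the authors' original code (the chain-graph model and `alpha` value are illustrative choices, not from the paper):

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)

# Sample from a known sparse 4-variable model: a chain graph,
# whose precision matrix is tridiagonal.
prec_true = np.array([[ 2.0, -0.8,  0.0,  0.0],
                      [-0.8,  2.0, -0.8,  0.0],
                      [ 0.0, -0.8,  2.0, -0.8],
                      [ 0.0,  0.0, -0.8,  2.0]])
cov_true = np.linalg.inv(prec_true)
X = rng.multivariate_normal(np.zeros(4), cov_true, size=2000)

# L1-penalized maximum likelihood estimate of the precision matrix.
model = GraphicalLasso(alpha=0.05).fit(X)
prec_hat = model.precision_
```

Zeros in `prec_hat` correspond to estimated conditional independencies, i.e. missing edges in the recovered graph.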