Tensor graphical lasso (TeraLasso)

@article{Greenewald2019TensorGL,
  title={Tensor graphical lasso (TeraLasso)},
  author={Kristjan H. Greenewald and Shuheng Zhou and Alfred O. Hero},
  journal={Journal of the Royal Statistical Society: Series B (Statistical Methodology)},
  year={2019}
}
This paper introduces a multi-way tensor generalization of the Bigraphical Lasso (BiGLasso), which uses a two-way sparse Kronecker-sum multivariate-normal model for the precision matrix to parsimoniously model conditional dependence relationships of matrix-variate data based on the Cartesian product of graphs. We call this generalization the {\bf Te}nsor g{\bf ra}phical Lasso (TeraLasso). We demonstrate using theory and examples that the TeraLasso model can be accurately and scalably estimated…
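The Kronecker-sum structure at the heart of BiGLasso and TeraLasso is easy to sketch: for precision factors Ψ (p×p) and Θ (q×q), the joint precision is Ψ ⊕ Θ = Ψ ⊗ I_q + I_p ⊗ Θ. A minimal numpy illustration, with factor values made up purely for demonstration:

```python
import numpy as np

def kron_sum(psi, theta):
    # psi ⊕ theta = psi ⊗ I_q + I_p ⊗ theta
    p, q = psi.shape[0], theta.shape[0]
    return np.kron(psi, np.eye(q)) + np.kron(np.eye(p), theta)

# small illustrative precision factors (values invented for the demo)
psi = np.array([[2.0, -0.5],
                [-0.5, 2.0]])
theta = np.array([[1.5, 0.0, -0.3],
                  [0.0, 1.5, 0.0],
                  [-0.3, 0.0, 1.5]])

# 6x6 joint precision; sparsity of both factors is preserved,
# so the model has only p(p+1)/2 + q(q+1)/2 free parameters
omega = kron_sum(psi, theta)
print(omega.shape)  # (6, 6)
```

A useful property of this construction: the eigenvalues of Ψ ⊕ Θ are exactly the pairwise sums of the factor eigenvalues, which is what makes the model scalable to estimate.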
EiGLasso for Scalable Sparse Kronecker-Sum Inverse Covariance Estimation
TLDR
EiGLasso is introduced, a highly scalable method for sparse Kronecker-sum inverse covariance estimation, based on Newton’s method combined with eigendecomposition of the two graphs to exploit the structure of the Kronecker sum, and achieves two to three orders-of-magnitude speed-up compared to existing methods.
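The eigendecomposition trick that EiGLasso-style methods exploit rests on a standard identity: the eigenvalues of Ψ ⊕ Θ are the pairwise sums λ_i + μ_j of the factor eigenvalues, so quantities such as the log-determinant of the pq×pq precision follow from two small eigendecompositions. A sketch of the identity in numpy (illustrative only, not EiGLasso's actual implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_spd(n):
    # well-conditioned symmetric positive-definite test matrix
    a = rng.standard_normal((n, n))
    return a @ a.T + n * np.eye(n)

psi, theta = random_spd(4), random_spd(5)

# eigendecompose only the two small factors
lam = np.linalg.eigvalsh(psi)   # 4 eigenvalues
mu = np.linalg.eigvalsh(theta)  # 5 eigenvalues

# log-determinant of the 20x20 Kronecker sum, never formed explicitly:
# eig(psi ⊕ theta) = {lam_i + mu_j}
logdet_fast = np.log(lam[:, None] + mu[None, :]).sum()

# verify against the dense construction
omega = np.kron(psi, np.eye(5)) + np.kron(np.eye(4), theta)
sign, logdet_dense = np.linalg.slogdet(omega)
print(np.isclose(logdet_fast, logdet_dense))  # True
```

For p = q = 1000 this replaces an eigendecomposition of a 10⁶ × 10⁶ matrix with two 1000 × 1000 ones, which is where the orders-of-magnitude speed-up comes from.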
The Sylvester Graphical Lasso (SyGlasso)
TLDR
The Sylvester graphical lasso (SyGlasso) is introduced that captures multiway dependencies present in tensor-valued data and can simultaneously estimate both the brain connectivity and its temporal dependencies.
Jointly Modeling and Clustering Tensors in High Dimensions
We consider the problem of jointly modeling and clustering populations of tensors by introducing a high-dimensional tensor mixture model with heterogeneous covariances. To effectively tackle the high…
EiGLasso: Scalable Estimation of Cartesian Product of Sparse Inverse Covariance Matrices
TLDR
EiGLasso is introduced, a highly efficient optimization method for estimating the Kronecker-sum-structured inverse covariance matrix from matrix-variate data, and an alternative, simpler approach for handling the non-identifiability of parameters than the one used in previous work is described.
Online Graph Topology Learning from Matrix-valued Time Series
TLDR
A novel multivariate autoregressive model is introduced to infer the graph topology encoded in the coefficient matrix, which captures the sparse Granger-causality dependency structure present in such matrix-valued time series.
SG-PALM: a Fast Physically Interpretable Tensor Graphical Model
  • Yu Wang, A. Hero
  • Computer Science, Mathematics
  • ICML
  • 2021
TLDR
The scalability and accuracy of SG-PALM are demonstrated for an important but challenging climate-prediction problem: spatio-temporal forecasting of solar flares from multimodal imaging data.
Kernel-Based Graph Learning From Smooth Signals: A Functional Viewpoint
TLDR
A novel graph-learning framework is proposed that incorporates prior information along both the node and observation sides, in particular covariates that help explain the dependency structures in graph signals, together with a graph-based regularisation method that enables the model to capture both the dependency explained by the graph and the dependency due to graph signals observed under different but related circumstances.
Structure in modern data and how to exploit it: some signal processing applications
TLDR
This dissertation investigates the advantages of structure exploitation in three applications: (i) signal detection and classification under the union-of-subspaces model, (ii) learning product graphs underlying smooth graph signals, and (iii) distributed radar imaging under position errors and unsynchronized clocks.
Tensors in Modern Statistical Learning
TLDR
This survey provides an overview of tensor analysis in modern statistical learning, including tensor supervised learning, tensor unsupervised learning, tensor reinforcement learning, and tensor deep learning.
Correlation Tensor Decomposition and Its Application in Spatial Imaging Data
  • Yujia Deng, Xiwei Tang, A. Qu
  • Physics
  • Journal of the American Statistical Association
  • 2021
Multi-dimensional tensor data has gained increasing attention in recent years, especially in biomedical imaging analyses. However, most existing tensor models are only based on the mean information...

References

SHOWING 1-10 OF 78 REFERENCES
On Convergence of Kronecker Graphical Lasso Algorithms
TLDR
This paper studies iteration convergence of Kronecker graphical lasso (KGLasso) algorithms for estimating the covariance of an i.i.d. Gaussian random sample under a sparse Kronecker-product covariance model, and derives MSE convergence rates establishing that KGLasso has significantly faster asymptotic convergence than GLasso and FF.
The Bigraphical Lasso
TLDR
The bigraphical lasso is introduced, an estimator for precision matrices of matrix-normals based on the Cartesian product of graphs, a prominent product in spectral graph theory that has appealing properties for regression, enhanced sparsity and interpretability.
Equivariant and Scale-Free Tucker Decomposition Models
TLDR
This article develops methodology to obtain low-rank model-based representations of continuous, discrete and ordinal data arrays using a semiparametric transformation model, and shows how orthogonally equivariant parameter estimates can be obtained from Bayesian procedures under invariant prior distributions.
Sparse permutation invariant covariance estimation
The paper proposes a method for constructing a sparse estimator for the inverse covariance (concentration) matrix in high-dimensional settings. The estimator uses a penalized normal likelihood…
Gemini: Graph estimation with matrix variate normal instances
Undirected graphs can be used to describe matrix variate distributions. In this paper, we develop new methods for estimating the graphical structures and underlying parameters, namely, the row and…
Tensor Decompositions and Sparse Log-Linear Models
TLDR
A new collapsed Tucker class of tensor decompositions is proposed, which bridges existing PARAFAC and Tucker decompositions, providing a more flexible framework for parsimoniously characterizing multivariate categorical data.
High-dimensional graphs and variable selection with the Lasso
The pattern of zero entries in the inverse covariance matrix of a multivariate normal distribution corresponds to conditional independence restrictions between variables. Covariance selection aims at…
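The correspondence this line of work relies on, that a zero entry in the precision matrix encodes a conditional independence even when the marginal correlation is nonzero, can be checked numerically. The 3-variable precision below is an illustrative example of my own, not taken from the paper:

```python
import numpy as np

# precision matrix for 3 variables: X0–X1 and X1–X2 are linked,
# but omega[0, 2] = 0, so X0 is independent of X2 given X1
omega = np.array([[1.0, 0.4, 0.0],
                  [0.4, 1.0, 0.4],
                  [0.0, 0.4, 1.0]])
sigma = np.linalg.inv(omega)

# the marginal correlation between X0 and X2 is NOT zero
# (dependence flows through X1) ...
marginal = sigma[0, 2] / np.sqrt(sigma[0, 0] * sigma[2, 2])

# ... but the partial correlation given X1 vanishes exactly
partial = -omega[0, 2] / np.sqrt(omega[0, 0] * omega[2, 2])
print(abs(marginal) > 1e-6, partial == 0.0)  # True True
```

This is why sparse estimation is done on the inverse covariance rather than the covariance: zeros in the precision matrix are the conditional-independence graph.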
Covariance Estimation in High Dimensions Via Kronecker Product Expansions
TLDR
The results establish that PRLS has significantly faster convergence than the standard sample covariance matrix (SCM) estimator, and show that a class of block-Toeplitz covariance matrices is approximable by low separation rank, giving bounds on the minimal separation rank r that ensures a given level of bias.
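The rank-1 case of a separable Kronecker-product expansion can be sketched via the classic Van Loan–Pitsianis rearrangement: reshaping Σ turns the nearest-Kronecker-product problem min ‖Σ − A ⊗ B‖_F into a rank-1 SVD. A sketch under illustrative values (function names are mine, not from the paper):

```python
import numpy as np

def rearrange(sigma, p, q):
    # stack vec of each q×q block of the (pq × pq) matrix as a row -> (p², q²)
    rows = []
    for i in range(p):
        for j in range(p):
            block = sigma[i * q:(i + 1) * q, j * q:(j + 1) * q]
            rows.append(block.ravel())
    return np.array(rows)

def nearest_kron(sigma, p, q):
    # best Frobenius-norm A ⊗ B via rank-1 truncated SVD of the rearrangement
    r = rearrange(sigma, p, q)
    u, s, vt = np.linalg.svd(r, full_matrices=False)
    a = np.sqrt(s[0]) * u[:, 0].reshape(p, p)
    b = np.sqrt(s[0]) * vt[0].reshape(q, q)
    return a, b

# sanity check: an exact Kronecker product is recovered exactly
a0 = np.array([[2.0, 0.5], [0.5, 1.0]])
b0 = np.array([[1.0, -0.2, 0.0],
               [-0.2, 1.0, -0.2],
               [0.0, -0.2, 1.0]])
sigma = np.kron(a0, b0)
a, b = nearest_kron(sigma, 2, 3)
print(np.allclose(np.kron(a, b), sigma))  # True
```

Higher separation ranks keep further SVD terms of the rearranged matrix, which is the general form of the expansion studied in that line of work.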
High-dimensional Covariance Estimation Based On Gaussian Graphical Models
TLDR
It is shown that under suitable conditions, this approach yields consistent estimation in terms of graphical structure and fast convergence rates with respect to the operator and Frobenius norm for the covariance matrix and its inverse using the maximum likelihood estimator.
Sparse inverse covariance estimation with the graphical lasso.
TLDR
Using a coordinate descent procedure for the lasso, a simple algorithm is developed that solves a 1000-node problem in at most a minute and is 30-4000 times faster than competing methods.