Sparse plus low-rank autoregressive identification in neuroimaging time series

@inproceedings{Ligeois2015SparsePL,
  title={Sparse plus low-rank autoregressive identification in neuroimaging time series},
  author={Rapha{\"e}l Li{\'e}geois and Bamdev Mishra and Mattia Zorzi and Rodolphe Sepulchre},
  booktitle={2015 54th IEEE Conference on Decision and Control (CDC)},
  year={2015},
  pages={3965--3970}
}
This paper considers the problem of identifying multivariate autoregressive (AR) sparse plus low-rank graphical models. Based on a recent problem formulation, we use the alternating direction method of multipliers (ADMM) to solve it efficiently as a convex program for sizes encountered in neuroimaging applications. We apply this algorithm on synthetic and real neuroimaging datasets with a specific focus on the information encoded in the low-rank structure of our model. In particular, we… 
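The abstract states that the identification problem is solved as a convex program via the alternating direction method of multipliers (ADMM). As an illustration of the splitting idea only — not the paper's exact AR graphical-model formulation, which involves spectral-density constraints — here is a minimal ADMM sketch for the generic sparse-plus-low-rank decomposition min λ‖S‖₁ + ‖L‖₊ subject to S + L = M. All function names are illustrative.

```python
import numpy as np

def soft_threshold(X, tau):
    # Elementwise soft-thresholding: proximal operator of the l1 norm.
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svd_threshold(X, tau):
    # Singular-value soft-thresholding: proximal operator of the nuclear norm.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(soft_threshold(s, tau)) @ Vt

def sparse_plus_lowrank_admm(M, lam=0.1, rho=1.0, n_iter=200):
    """ADMM sketch for: min lam * ||S||_1 + ||L||_*  s.t.  S + L = M."""
    S = np.zeros_like(M)
    L = np.zeros_like(M)
    U = np.zeros_like(M)          # scaled dual variable
    for _ in range(n_iter):
        # Alternate the two proximal updates, then ascend on the dual.
        L = svd_threshold(M - S + U, 1.0 / rho)
        S = soft_threshold(M - L + U, lam / rho)
        U = U + M - L - S         # dual update on the constraint residual
    return S, L
```

The two subproblems both have closed-form proximal solutions (elementwise shrinkage for the ℓ1 term, singular-value shrinkage for the nuclear norm), which is what makes ADMM attractive at the problem sizes mentioned in the abstract.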

Sparse plus low-rank graphical models of time series for functional connectivity in MEG
This work introduces a method to learn Gaussian graphical models between multiple time series with latent processes, and allows for heterogeneity between different groups of MEG recordings by using a hierarchical penalty.
Sparse plus low rank network identification: A nonparametric approach
On the Identification of Sparse plus Low-rank Graphical Models
This thesis proposes an identification procedure for periodic, Gaussian, stationary reciprocal processes, under the assumption that the conditional dependence relations among the observed variables…
Robust Identification of “Sparse Plus Low-rank” Graphical Models: An Optimization Approach
An alternative optimization approach that appears suitable for dealing with robustness issues in the “Sparse Plus Low-rank” decomposition problem is proposed and discussed.
A Scalable Strategy for the Identification of Latent-variable Graphical Models
An identification method for latent-variable graphical models associated with autoregressive (AR) Gaussian stationary processes, exploiting the approximation of AR processes by stationary reciprocal processes and thus benefiting from the numerical advantages of dealing with block-circulant matrices.
Topology Identification under Spatially Correlated Noise
This article transforms the given LDM into an LDM with hidden nodes, where the hidden nodes are characterized using maximal cliques in the correlation graph and all the nodes are excited by uncorrelated noise.
Maximum Entropy Expectation-Maximization Algorithm for Fitting Latent-Variable Graphical Models to Multivariate Time Series
It is shown how an algorithm originally used for finding zeros in the inverse of the covariance matrix can be generalized to identify the sparsity pattern of the inverse of the spectral density matrix.
Empirical Bayesian learning in AR graphical models

References

SHOWING 1-10 OF 29 REFERENCES
Topology Selection in Graphical Models of Autoregressive Processes
An algorithm for topology selection in graphical models of autoregressive Gaussian time series is presented; the problem reduces to a convex optimization problem, and a large-scale algorithm that solves the dual problem via the gradient projection method is described.
AR Identification of Latent-Variable Graphical Models
The paper proposes an identification procedure for autoregressive Gaussian stationary stochastic processes under the assumption that the manifest (or observed) variables are nearly independent when…
Graphical interaction models for multivariate time series
A partial correlation graph for time series is defined, and the partial spectral coherence between two components given the remaining components is used to identify the edges of the graph.
Group Sparsity via Linear-Time Projection
An efficient spectral projected-gradient algorithm for optimization subject to a group ℓ1-norm constraint is presented, based on a novel linear-time algorithm for Euclidean projection onto the ℓ1- and group ℓ1-norm constraints.
Model Selection Through Sparse Maximum Likelihood Estimation for Multivariate Gaussian or Binary Data
This work considers the problem of estimating the parameters of a Gaussian or binary distribution in such a way that the resulting undirected graphical model is sparse, and presents two new algorithms for solving problems with at least a thousand nodes in the Gaussian case.
Independent Component Analysis
The standardization of the IC model is discussed and, on the basis of n independent copies of x, the aim is to find an estimate of an unmixing matrix Γ such that Γx has independent components.
Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
It is argued that the alternating direction method of multipliers is well suited to distributed convex optimization, and in particular to large-scale problems arising in statistics, machine learning, and related areas.
Complex brain networks: graph theoretical analysis of structural and functional systems
This article reviews studies investigating complex brain networks in diverse experimental modalities and provides an accessible introduction to the basic principles of graph theory and highlights the technical challenges and key questions to be addressed by future developments in this rapidly moving field.
Latent variable graphical model selection via convex optimization
The modeling framework can be viewed as a combination of dimensionality reduction and graphical modeling (to capture remaining statistical structure not attributable to the latent variables) and it consistently estimates both the number of hidden components and the conditional graphical model structure among the observed variables.
An Accelerated Linearized Alternating Direction Method of Multipliers
It is demonstrated that for solving a class of convex composite optimization with linear constraints, the rate of convergence of AADMM is better than that of linearized ADMM, in terms of their dependence on the Lipschitz constant of the smooth component.