Estimating Multiple Precision Matrices With Cluster Fusion Regularization

@article{Price2021EstimatingMP,
  title={Estimating Multiple Precision Matrices With Cluster Fusion Regularization},
  author={Bradley S. Price and Aaron J. Molstad and Ben Sherwood},
  journal={Journal of Computational and Graphical Statistics},
  year={2021},
  volume={30},
  pages={823--834}
}
Abstract
We propose a penalized likelihood framework for estimating multiple precision matrices from different classes. Most existing methods either incorporate no information on relationships between the precision matrices or require that this information be known a priori. The framework proposed in this article allows for simultaneous estimation of the precision matrices and of the relationships between them. Sparse and nonsparse estimators are proposed, both of which require solving a…
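The abstract describes a penalized likelihood objective with a fusion term linking precision matrices within a cluster. A minimal sketch of what such an objective might look like is below; the function name, the squared-Frobenius fusion form, and the cluster-label interface are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def joint_objective(precisions, sample_covs, n_samples, lam, clusters):
    """Sketch of a cluster-fusion penalized joint likelihood.

    precisions  : list of (p, p) positive-definite precision estimates, one per class
    sample_covs : list of (p, p) sample covariance matrices
    clusters    : hypothetical cluster label for each class
    """
    # Sum of per-class Gaussian negative log-likelihoods:
    #   n_c * ( tr(S_c Omega_c) - log det Omega_c )
    nll = sum(
        n * (np.trace(S @ Om) - np.linalg.slogdet(Om)[1])
        for n, S, Om in zip(n_samples, sample_covs, precisions)
    )
    # Fusion penalty: pull precision matrices in the same cluster toward each other
    fuse = 0.0
    labels = list(clusters)
    for c in set(labels):
        idx = [i for i, ci in enumerate(labels) if ci == c]
        for a in range(len(idx)):
            for b in range(a + 1, len(idx)):
                fuse += np.sum((precisions[idx[a]] - precisions[idx[b]]) ** 2)
    return nll + lam * fuse
```

With identical identity inputs the fusion term vanishes and the objective reduces to the summed per-class likelihood terms.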
On the Use of Minimum Penalties in Statistical Learning
Modern multivariate machine learning and statistical methodologies estimate parameters of interest while leveraging prior knowledge of the association between outcome variables. The methods that do…
Hierarchical learning of Hidden Markov Models with clustering regularization
TLDR: This paper proposes a novel tree-structured variational Bayesian method to learn the individual models and the group model simultaneously by treating the group models as the parents of the individual models, so that each individual model is learned from observations and regularized by its parent, and conversely each parent model is optimized to best represent its children.
Joint Gaussian Graphical Model Estimation: A Survey
TLDR: This manuscript surveys recent work on statistical inference of joint Gaussian graphical models, identifying model structures that are associated with various data generation processes.
Joint Inference of Multiple Graphs from Matrix Polynomials
TLDR: This work studies the problem of jointly inferring multiple graphs from the observation of signals at their nodes, which are assumed to be stationary in the sought graphs, and provides high-probability bounds on the recovery error as a function of the number of signals observed and other key problem parameters.

References

SHOWING 1-10 OF 81 REFERENCES
Joint Estimation of Precision Matrices in Heterogeneous Populations.
TLDR: To extend the applicability of the method to settings with unknown population structure, a Laplacian penalty based on hierarchical clustering is proposed, and conditions under which this data-driven choice results in consistent estimation of precision matrices in heterogeneous populations are discussed.
Iterative Thresholding Algorithm for Sparse Inverse Covariance Estimation
TLDR: This paper gives eigenvalue bounds for the G-ISTA iterates, providing a closed-form linear convergence rate, which is shown to be closely related to the condition number of the optimal point.
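G-ISTA-style methods alternate a gradient step on the smooth part of the sparse inverse covariance objective, tr(S Ω) − log det Ω, with soft-thresholding of the off-diagonal entries. The fixed-step sketch below is an illustrative assumption: the actual G-ISTA algorithm additionally uses a backtracking line search to keep the iterates positive definite and to obtain its convergence guarantees.

```python
import numpy as np

def soft_threshold(A, t):
    # Soft-threshold off-diagonal entries; the diagonal is left unpenalized
    out = np.sign(A) * np.maximum(np.abs(A) - t, 0.0)
    np.fill_diagonal(out, np.diag(A))
    return out

def ista_sparse_precision(S, lam, step=0.1, iters=500):
    # Minimize tr(S Omega) - log det(Omega) + lam * ||Omega||_1,offdiag
    Omega = np.eye(S.shape[0])
    for _ in range(iters):
        grad = S - np.linalg.inv(Omega)  # gradient of the smooth part
        Omega = soft_threshold(Omega - step * grad, step * lam)
    return Omega
```

For a sample covariance equal to the identity, the iteration leaves the identity fixed, which is also the exact minimizer in that case.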
Model Selection Through Sparse Maximum Likelihood Estimation for Multivariate Gaussian or Binary Data
TLDR: This work considers the problem of estimating the parameters of a Gaussian or binary distribution in such a way that the resulting undirected graphical model is sparse, and presents two new algorithms for solving problems with at least a thousand nodes in the Gaussian case.
Targeted Fused Ridge Estimation of Inverse Covariance Matrices from Multiple High-Dimensional Data Classes
TLDR: The result is a targeted fused ridge estimator that is of use when the precision matrices of the constituent classes are believed to chiefly share the same structure while potentially differing in a number of locations of interest.
Sparse permutation invariant covariance estimation
TLDR: A method is proposed for constructing a sparse estimator of the inverse covariance (concentration) matrix in high-dimensional settings using a penalized normal likelihood approach, with sparsity enforced by a lasso-type penalty.
Likelihood-Based Selection and Sharp Parameter Estimation
TLDR: Theoretically, it is shown that constrained L0 likelihood and its computational surrogate are optimal in that they achieve feature selection consistency and sharp parameter estimation, under one necessary condition required for any method to be selection consistent and to achieve sharp parameter estimation.
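The contrast between an L0 criterion and its convex L1 surrogate can be illustrated with scalar thresholding: hard thresholding (the L0 analogue) leaves surviving coefficients unbiased, while soft thresholding (the L1 analogue) also shrinks them. This is a toy sketch for intuition, not a construction from the paper.

```python
import numpy as np

def hard_threshold(x, t):
    # L0-style selection: zero out small entries, leave survivors unchanged
    return np.where(np.abs(x) > t, x, 0.0)

def soft_threshold(x, t):
    # L1-style selection: survivors are additionally shrunk toward zero by t
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

coefs = np.array([3.0, 0.5, -2.0])
hard = hard_threshold(coefs, 1.0)   # -> [3.0, 0.0, -2.0]
soft = soft_threshold(coefs, 1.0)   # -> [2.0, 0.0, -1.0]
```

The shrinkage bias of the soft rule on large coefficients is one reason L0-type criteria can yield sharper parameter estimates when selection is correct.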
Simultaneous Clustering and Estimation of Heterogeneous Graphical Models
TLDR: A non-asymptotic error bound is established for the output directly from the high-dimensional ECM algorithm, and it consists of two quantities: statistical error (statistical accuracy) and optimization error (computational complexity).
Scalable Computation of Regularized Precision Matrices via Stochastic Optimization
TLDR: A new algorithmic framework based on stochastic proximal optimization (on the primal problem) is proposed that can be used to obtain near-optimal solutions with substantial computational savings over deterministic algorithms.
Model selection and estimation in the Gaussian graphical model
TLDR: The implementation of the penalized likelihood methods for estimating the concentration matrix in the Gaussian graphical model is nontrivial, but it is shown that the computation can be done effectively by taking advantage of the efficient maxdet algorithm developed in convex optimization.
Covariance Estimation: The GLM and Regularization Perspectives
TLDR: A survey of the progress made in modeling covariance matrices from two relatively complementary perspectives: (1) generalized linear models (GLM), or parsimony and use of covariates in low dimensions, and (2) regularization, or sparsity for high-dimensional data.