Covariance Estimation in Decomposable Gaussian Graphical Models

@article{Wiesel2010CovarianceEI,
  title={Covariance Estimation in Decomposable Gaussian Graphical Models},
  author={Ami Wiesel and Yonina C. Eldar and Alfred O. Hero},
  journal={IEEE Transactions on Signal Processing},
  year={2010},
  volume={58},
  pages={1482-1492}
}
Graphical models are a framework for representing and exploiting prior conditional independence structures within distributions using graphs. In the Gaussian case, these models are directly related to the sparsity of the inverse covariance (concentration) matrix and allow for improved covariance estimation with lower computational complexity. We consider concentration estimation with the mean-squared error (MSE) as the objective, in a special type of model known as decomposable. This model… 
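
For context on the decomposable setting described above, the sketch below (a minimal NumPy illustration, not the paper's proposed MSE-oriented estimators) assembles the classical closed-form maximum likelihood estimate of the concentration matrix from clique and separator blocks of the sample covariance; the three-node chain graph and the sample size are arbitrary choices.

```python
# Classical MLE of the concentration matrix in a decomposable Gaussian
# graphical model: clique-wise inverses of the sample covariance are added and
# separator-wise inverses subtracted, each zero-padded to full size.
import numpy as np

def decomposable_mle(S, cliques, separators):
    """K_hat = sum_C [S_CC^{-1}]^0 - sum_Sep [S_SS^{-1}]^0."""
    K = np.zeros_like(S)
    for C in cliques:
        block = np.ix_(C, C)
        K[block] += np.linalg.inv(S[block])
    for sep in separators:
        block = np.ix_(sep, sep)
        K[block] -= np.linalg.inv(S[block])
    return K

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 3))      # 500 samples of a 3-dimensional vector
S = np.cov(X, rowvar=False)            # sample covariance
# Chain graph 0 - 1 - 2: cliques {0, 1}, {1, 2}, separator {1}
K_hat = decomposable_mle(S, cliques=[[0, 1], [1, 2]], separators=[[1]])
print(np.round(K_hat, 3))              # entry (0, 2) is exactly zero by construction
```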

Citations

Distributed Covariance Estimation in Gaussian Graphical Models
  • A. Wiesel, A. Hero
  • Computer Science, Mathematics
    IEEE Transactions on Signal Processing
  • 2012
TLDR
This work proposes to improve the MSE performance by introducing additional symmetry constraints using averaging and pseudolikelihood estimation approaches, and computes the proposed estimates using message-passing protocols that can be efficiently implemented in large-scale graphical models with many nodes.
Distributed Learning of Gaussian Graphical Models via Marginal Likelihoods
TLDR
This paper proposes a general framework for distributed estimation based on a maximum marginal likelihood (MML) approach, derives and solves a convex relaxation of the MML problem, and proves that the relaxed MML estimator is asymptotically consistent.
Multivariate Generalized Gaussian Distribution: Convexity and Graphical Models
TLDR
This work considers covariance estimation in the multivariate generalized Gaussian distribution (MGGD) and shows that the optimizations can be formulated as convex minimizations as long as the MGGD shape parameter is larger than one half and the sparsity pattern is chordal.
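
The chordality condition in that result is straightforward to check in practice. The snippet below is a hedged illustration (the example graph and the use of NetworkX are assumptions of this sketch, not code from the cited work):

```python
# Check whether a sparsity pattern (graph of allowed nonzero off-diagonal
# entries) is chordal; the edge list here is an arbitrary example.
import networkx as nx

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]  # a 4-cycle plus the chord (0, 2)
G = nx.Graph(edges)
print(nx.is_chordal(G))   # True; dropping the chord leaves a 4-cycle, which is not chordal
```
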
Distributionally Robust Inverse Covariance Estimation: The Wasserstein Shrinkage Estimator
The optimal solutions of many decision problems such as the Markowitz portfolio allocation and the linear discriminant analysis depend
Joint Learning of Multiple Sparse Matrix Gaussian Graphical Models
TLDR
The proposed approach borrows strength across the different graphical models, is based on maximum likelihood with penalized row and column precision matrices, and is more parsimonious and flexible than joint vector graphical models.
Iterative Reconstruction of High-Dimensional Gaussian Graphical Models Based on a New Method to Estimate Partial Correlations under Constraints
TLDR
This work presents a simple procedure, called PACOSE, to estimate partial correlations under the constraint that some of them are strictly zero, and shows on simulated and real data that the iterative variant iPACOSE has very interesting properties with regard to sensitivity, positive predictive value, and stability.
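
For orientation, the unconstrained quantity such methods build on, the matrix of partial correlations, can be read directly off the inverse covariance. The sketch below (plain NumPy on synthetic data, not the PACOSE implementation) applies the standard scaling rho_ij = -K_ij / sqrt(K_ii K_jj):

```python
# Partial correlations from the inverse of a covariance matrix.
import numpy as np

def partial_correlations(S):
    K = np.linalg.inv(S)                 # concentration (precision) matrix
    d = np.sqrt(np.diag(K))
    R = -K / np.outer(d, d)              # rho_ij = -K_ij / sqrt(K_ii * K_jj)
    np.fill_diagonal(R, 1.0)
    return R

rng = np.random.default_rng(4)
X = rng.standard_normal((400, 4))        # independent toy data; entries should be near zero
print(np.round(partial_correlations(np.cov(X, rowvar=False)), 2))
```
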
Regularized Estimation of High-dimensional Covariance Matrices.
TLDR
This dissertation attempts to develop necessary components for covariance estimation in the high-dimensional setting by introducing a state-of-the-art sampling system, the Modulated Wideband Converter (MWC), which is capable of achieving sub-Nyquist sampling for multiband signals with arbitrary carrier frequency over a wide bandwidth.
Approximate least squares parameter estimation with structured observations
  • Atulya Yellepeddi, J. Preisig
  • Mathematics, Computer Science
    2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2014
TLDR
This work considers the situation where it is not appropriate to assume a structure for the parameter, but the observations on which the estimate is based are structured; specifically, when the observations are parametrized by a decomposable graphical model.
…

References

SHOWING 1-10 OF 77 REFERENCES
FLEXIBLE COVARIANCE ESTIMATION IN GRAPHICAL GAUSSIAN MODELS
In this paper, we propose a class of Bayes estimators for the covariance matrix of graphical Gaussian models Markov with respect to a decomposable graph G. Working with the WPG family defined by
Regularized estimation of large covariance matrices
TLDR
If the population covariance is embeddable in that model and well-conditioned, then the banded approximations produce consistent estimates of the eigenvalues and associated eigenvectors of the covariance matrix.
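
As a rough sketch of the banding idea referred to in that summary (the bandwidth and the data are arbitrary choices, and this is not the cited paper's code), banding keeps only the sample-covariance entries within a fixed distance of the diagonal:

```python
# Banded approximation of a sample covariance matrix.
import numpy as np

def band(S, k):
    """Zero out all entries S_ij with |i - j| > k."""
    i, j = np.indices(S.shape)
    return np.where(np.abs(i - j) <= k, S, 0.0)

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 10))
S_banded = band(np.cov(X, rowvar=False), k=2)
```
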
Model selection and estimation in the Gaussian graphical model
TLDR
The implementation of the penalized likelihood methods for estimating the concentration matrix in the Gaussian graphical model is nontrivial, but it is shown that the computation can be done effectively by taking advantage of the efficient maxdet algorithm developed in convex optimization.
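
The same L1-penalized likelihood objective can now be solved with off-the-shelf software; the sketch below uses scikit-learn's GraphicalLasso (a different solver from the maxdet algorithm mentioned above), with an illustrative penalty value and synthetic data:

```python
# L1-penalized Gaussian maximum likelihood (graphical lasso) with scikit-learn.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(2)
X = rng.standard_normal((300, 5))           # toy data with no real dependence structure
model = GraphicalLasso(alpha=0.1).fit(X)    # alpha controls the sparsity of the estimate
print(np.round(model.precision_, 2))        # estimated (sparse) concentration matrix
```
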
Rethinking Biased Estimation: Improving Maximum Likelihood and the Cramér-Rao Bound
  • Yonina C. Eldar
  • Mathematics, Computer Science
    Found. Trends Signal Process.
  • 2007
TLDR
This survey introduces MSE bounds that are lower than the unbiased Cramér–Rao bound for all values of the unknowns, and presents a general framework for constructing biased estimators with smaller MSE than the standard maximum-likelihood (ML) approach, regardless of the true unknown values.
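
One widely used concrete instance of the biased-beats-unbiased idea is shrinkage of the sample covariance. The comparison below uses scikit-learn's Ledoit-Wolf estimator on synthetic data; it illustrates the general principle only and is not one of the estimators constructed in the cited survey.

```python
# Compare the MSE of the sample covariance with a (biased) shrinkage estimate
# in a small-sample regime; dimensions and sample size are illustrative.
import numpy as np
from sklearn.covariance import LedoitWolf

rng = np.random.default_rng(3)
p, n = 20, 30
true_cov = np.diag(np.linspace(1.0, 5.0, p))
X = rng.multivariate_normal(np.zeros(p), true_cov, size=n)
mse_sample = np.mean((np.cov(X, rowvar=False) - true_cov) ** 2)
mse_shrunk = np.mean((LedoitWolf().fit(X).covariance_ - true_cov) ** 2)
print(mse_sample, mse_shrunk)   # the shrinkage estimate typically attains the lower MSE
```
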
Model Selection Through Sparse Maximum Likelihood Estimation
TLDR
Two new algorithms for solving problems with at least a thousand nodes in the Gaussian case are presented, based on Nesterov's first order method, which yields a complexity estimate with a better dependence on problem size than existing interior point methods.
Inadmissibility of the Maximum Likelihood Estimator of Normal Covariance Matrices with the Lattice Conditional Independence
Lattice conditional independence (LCI) models introduced by S. A. Andersson and M. D. Perlman (1993, Ann. Statist. 21, 1318–1358) have the pleasant feature of admitting explicit maximum likelihood
Estimation of the multivariate normal precision and covariance matrices in a star-shape model
TLDR
It is shown that the best equivariant estimator with respect to $\mathcal{G}$ is a special case of the Bayesian estimators, which implies that the MLE of the precision matrix is inadmissible under either the entropy or the symmetric loss.
Model Selection Through Sparse Maximum Likelihood Estimation for Multivariate Gaussian or Binary Data
TLDR
This work considers the problem of estimating the parameters of a Gaussian or binary distribution in such a way that the resulting undirected graphical model is sparse, and presents two new algorithms for solving problems with at least a thousand nodes in the Gaussian case.
Archival Version Including Appendices: Experiments in Stochastic Computation for High-Dimensional Graphical Models
We discuss the implementation, development and performance of methods of stochastic computation in Gaussian graphical models. We view these methods from the perspective of high-dimensional model
Estimation of a Multivariate Normal Covariance Matrix with Staircase Pattern Data
In this paper, we study the problem of estimating a multivariate normal covariance matrix with staircase pattern data. Two kinds of parameterizations in terms of the covariance matrix are used. One
…