Contraction of a quasi-Bayesian model with shrinkage priors in precision matrix estimation

@article{ZhangYaoGhosh,
  title={Contraction of a quasi-Bayesian model with shrinkage priors in precision matrix estimation},
  author={Ruoyang Zhang and Yi-Bo Yao and Malay Ghosh},
  journal={Journal of Statistical Planning and Inference}
}

Quasi-Bayesian estimation of large Gaussian graphical models
Y. Atchadé. Journal of Multivariate Analysis, 2019.
Nearly optimal Bayesian Shrinkage for High Dimensional Regression
If the shrinkage prior has a heavy and flat tail, and allocates a sufficiently large probability mass in a very small neighborhood of zero, then its posterior properties are as good as those of the spike-and-slab prior.
Bayesian structure learning in graphical models
Bayesian shrinkage towards sharp minimaxity
This work discovers that, under the sparse normal means model, the polynomial order affects the multiplicative constant of the posterior contraction rate, and proposes a Beta-prior modeling such that the sharply minimax Bayesian procedure is adaptive to the unknown sparsity.
Posterior convergence rates for estimating large precision matrices using graphical models
A banding structure in the model is considered and a prior distribution on a banded precision matrix is induced through a Gaussian graphical model, where an edge is present only when two vertices are within a given distance.
A convex pseudolikelihood framework for high dimensional partial correlation estimation with convergence guarantees
This work proposes a new pseudolikelihood-based graphical model selection method that aims to overcome shortcomings of current methods while retaining their strengths, introducing a novel framework that leads to a convex formulation of the partial covariance regression graph problem, with an objective function composed of quadratic forms.
Ultra high-dimensional multivariate posterior contraction rate under shrinkage priors
Contraction properties of shrinkage priors in logistic regression
Bayesian Graphical Lasso Models and Efficient Posterior Computation
Hao Wang, 2012.
In terms of both covariance matrix estimation and graphical structure learning, the Bayesian adaptive graphical lasso appears to be the top overall performer among a range of frequentist and Bayesian methods.
Bayesian estimation of sparse signals with a continuous spike-and-slab prior
The main thrust of this paper is to introduce the family of Spike-and-Slab LASSO (SS-LASSO) priors, which form a continuum between the Laplace prior and the point-mass spike-and-slab prior, and to establish several appealing frequentist properties of SS-LASSO priors.