Corpus ID: 233393740

Contraction of a quasi-Bayesian model with shrinkage priors in precision matrix estimation

@inproceedings{Zhang2021ContractionOA,
  title={Contraction of a quasi-Bayesian model with shrinkage priors in precision matrix estimation},
  author={Ruoyang Zhang and Yi-Bo Yao and Malay Ghosh},
  year={2021}
}
Several Bayesian approaches are currently available for estimating large sparse precision matrices, including the Bayesian graphical lasso (Wang, 2012), Bayesian structure learning (Banerjee and Ghosal, 2015), and the graphical horseshoe (Li et al., 2019). Although these methods have shown good empirical performance, they are in general computationally expensive. Moreover, little is known about the theoretical properties, e.g., the posterior contraction rate, of the Bayesian graphical lasso and…
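To make the estimation target concrete, the sketch below simulates data from a Gaussian graphical model with a sparse precision matrix and recovers its support using the frequentist graphical lasso (scikit-learn's GraphicalLassoCV), the penalised counterpart of the Bayesian graphical lasso mentioned in the abstract. This is not the paper's quasi-Bayesian shrinkage method, and the tridiagonal Omega used here is only an illustrative assumption.

import numpy as np
from sklearn.covariance import GraphicalLassoCV

# Simulate n draws from N_p(0, Omega^{-1}) with a sparse tridiagonal
# precision matrix Omega (an illustrative assumption, not the paper's setup).
rng = np.random.default_rng(0)
n, p = 200, 20
Omega = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
Sigma = np.linalg.inv(Omega)
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)

# Frequentist graphical lasso: l1-penalised Gaussian log-likelihood,
# with the penalty level chosen by cross-validation.
fit = GraphicalLassoCV().fit(X)
Omega_hat = fit.precision_

# Estimated sparsity pattern (nonzero off-diagonal entries of Omega_hat),
# compared with the p-1 true nonzero off-diagonal pairs of Omega.
est_support = np.abs(Omega_hat) > 1e-3
print("estimated nonzero off-diagonal pairs:", (est_support.sum() - p) // 2)
print("true nonzero off-diagonal pairs:", p - 1)

In the Bayesian analogues discussed in the abstract, the l1 penalty is replaced by shrinkage priors on the off-diagonal entries, and a posterior contraction rate quantifies how quickly the posterior concentrates around the true precision matrix as n and p grow.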


References

Showing 1-10 of 83 references
Quasi-Bayesian estimation of large Gaussian graphical models
Y. Atchadé, J. Multivar. Anal., 2019
Nearly optimal Bayesian Shrinkage for High Dimensional Regression
During the past decade, shrinkage priors have received much attention in Bayesian analysis of high-dimensional data. In this paper, we study the problem for high-dimensional linear regression models.
Bayesian structure learning in graphical models
Bayesian shrinkage towards sharp minimaxity
Shrinkage priors are becoming increasingly popular in Bayesian modeling of high-dimensional sparse problems due to their computational efficiency. Recent work shows that a polynomially decaying prior…
Posterior convergence rates for estimating large precision matrices using graphical models
We consider Bayesian estimation of a $p\times p$ precision matrix, when $p$ can be much larger than the available sample size $n$. It is well known that consistent estimation in such ultra-high…
Ultra high-dimensional multivariate posterior contraction rate under shrinkage priors
Contraction properties of shrinkage priors in logistic regression
A convex pseudolikelihood framework for high dimensional partial correlation estimation with convergence guarantees
type="main" xml:id="rssb12088-abs-0001"> Sparse high dimensional graphical model selection is a topic of much interest in modern day statistics. A popular approach is to apply l 1 -penalties to
Bayesian Graphical Lasso Models and Efficient Posterior Computation
Recently, the graphical lasso procedure has become popular in estimating Gaussian graphical models. In this paper, we introduce a fully Bayesian treatment of graphical lasso models. We first…
Bayesian estimation of sparse signals with a continuous spike-and-slab prior
We introduce a new framework for estimation of sparse normal means, bridging the gap between popular frequentist strategies (LASSO) and popular Bayesian strategies (spike-and-slab). The main thrust…