Corpus ID: 236975878

Bayesian Inference using the Proximal Mapping: Uncertainty Quantification under Varying Dimensionality

@inproceedings{Xu2021BayesianIU,
  title={Bayesian Inference using the Proximal Mapping: Uncertainty Quantification under Varying Dimensionality},
  author={Maoran Xu and Hua Zhou and Yujie Hu and Leo L. Duan},
  year={2021}
}
In statistical applications, it is common to encounter parameters supported on a space of varying or unknown dimension. Examples include fused lasso regression and matrix recovery under an unknown low rank. Although a point estimate is easy to obtain via optimization, quantifying its uncertainty is much more challenging: in the Bayesian framework, a major difficulty is that if the prior is assigned under a p-dimensional measure, then there is zero posterior…
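
The proximal mapping in the title has a familiar closed form in the lasso case. As a minimal, illustrative sketch (not the authors' method), the Python snippet below computes the prox of the ℓ1 penalty; its soft-thresholding behavior is what sends a point onto a lower-dimensional subset, which is exactly the varying-dimensionality phenomenon at issue:

```python
import numpy as np

def prox_l1(x, lam):
    """Soft-thresholding: the proximal mapping of lam * ||.||_1.

    prox(x) = argmin_z 0.5 * ||z - x||^2 + lam * ||z||_1,
    which has the closed form sign(x) * max(|x| - lam, 0).
    """
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

# Coordinates whose magnitude falls below lam are set exactly to zero,
# so the mapping moves the point onto a lower-dimensional subset.
x = np.array([2.5, -0.3, 0.8, -1.7])
print(prox_l1(x, lam=1.0))  # -> [ 1.5 -0.   0.  -0.7]
```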


References

SHOWING 1-10 OF 36 REFERENCES

Bayesian cumulative shrinkage for infinite factorizations.

TLDR
This article proposes a novel increasing shrinkage prior, called the cumulative shrinkage process, for the parameters that control the dimension in overcomplete formulations, and shows that this formulation has theoretical and practical advantages relative to current competitors, including an improved ability to recover the model dimension.

Constrained Bayesian Inference through Posterior Projections

TLDR
This work formalizes posterior projections within a unifying Bayesian framework and gives a general formulation of the projected posterior that corresponds to a valid posterior distribution on the constrained space for particular classes of priors and likelihood functions.
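
As a toy illustration of the projection idea (not the paper's general construction), one can push unconstrained posterior draws through the Euclidean projection onto a constraint set; for the nonnegative orthant that projection is simply componentwise clipping. The data and constraint below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical unconstrained posterior draws for a 3-dimensional parameter.
draws = rng.normal(loc=[0.4, -0.2, 1.1], scale=0.3, size=(5000, 3))

def project(theta):
    """Euclidean projection onto the constraint set {theta >= 0}.

    For the nonnegative orthant the projection is componentwise clipping;
    other constraint sets would substitute their own projection operator.
    """
    return np.maximum(theta, 0.0)

projected = project(draws)     # draws from the projected posterior
print(projected.mean(axis=0))  # constrained posterior mean estimate
```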

Sparse Bayesian infinite factor models.

TLDR
This work proposes a multiplicative gamma process shrinkage prior on the factor loadings which allows introduction of infinitely many factors, with the loadings increasingly shrunk towards zero as the column index increases, and develops an efficient Gibbs sampler that scales well as data dimensionality increases.
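
For orientation, the multiplicative gamma process places, roughly (notation may differ from the paper), the following hierarchy on the loading λ_{jh} in row j and column h:

```latex
\lambda_{jh} \mid \phi_{jh}, \tau_h \sim \mathcal{N}\!\bigl(0,\ \phi_{jh}^{-1}\tau_h^{-1}\bigr), \qquad
\phi_{jh} \sim \mathrm{Ga}\!\left(\tfrac{\nu}{2}, \tfrac{\nu}{2}\right), \qquad
\tau_h = \prod_{l=1}^{h} \delta_l, \qquad
\delta_1 \sim \mathrm{Ga}(a_1, 1), \quad \delta_l \sim \mathrm{Ga}(a_2, 1) \ (l \ge 2).
```

With a_2 > 1 the precisions τ_h tend to grow with the column index h, so the loadings in later columns are shrunk increasingly toward zero.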

Bayesian Inference on Order‐Constrained Parameters in Generalized Linear Models

TLDR
A general Bayesian approach for inference on order‐constrained parameters in generalized linear models is proposed using an isotonic regression transformation, which allows flat regions over which increases in the level of a predictor have no effect.
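
The isotonic transformation underlying that approach is easy to demonstrate in isolation. A minimal sketch using scikit-learn's isotonic solver (not the paper's Bayesian machinery; the data here are synthetic):

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(1)

# Noisy observations of a nondecreasing trend with an initial flat region.
x = np.linspace(0.0, 1.0, 50)
signal = np.concatenate([np.zeros(20), np.linspace(0.0, 2.0, 30)])
y = signal + rng.normal(0.0, 0.2, size=50)

# The isotonic fit enforces monotonicity while allowing flat stretches
# where increases in the predictor have no effect on the response.
fit = IsotonicRegression(increasing=True).fit_transform(x, y)
assert np.all(np.diff(fit) >= 0.0)
```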

The Bayesian Lasso

The Lasso estimate for linear regression parameters can be interpreted as a Bayesian posterior mode estimate when the regression parameters have independent Laplace (i.e., double-exponential) priors.
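
The stated equivalence is worth writing out. Assuming a fixed noise variance σ² and Laplace scale τ, the posterior is

```latex
\pi(\beta \mid y) \;\propto\;
\exp\!\Bigl(-\tfrac{1}{2\sigma^2}\,\lVert y - X\beta \rVert_2^2\Bigr)
\prod_{j=1}^{p} \tfrac{1}{2\tau}\,\exp\!\Bigl(-\tfrac{|\beta_j|}{\tau}\Bigr),
\qquad\text{so}\qquad
\hat{\beta}_{\mathrm{MAP}}
= \arg\min_{\beta}\ \lVert y - X\beta \rVert_2^2 + \lambda \lVert \beta \rVert_1,
\quad \lambda = \frac{2\sigma^2}{\tau}.
```

That is, the posterior mode solves exactly the lasso problem, with the penalty level determined by the ratio of the noise variance to the prior scale.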

Bayesian Variable Selection in Linear Regression

Abstract. This article is concerned with the selection of subsets of predictor variables in a linear regression model for the prediction of a dependent variable. It is based on a Bayesian approach…

Sparse Bayesian time-varying covariance estimation in many dimensions

  • G. Kastner, Journal of Econometrics, 2019

A Method for Bayesian Monotonic Multiple Regression

Abstract. When applicable, an assumed monotonicity property of the regression function w.r.t. covariates has a strong stabilizing effect on the estimates. Because of this, other parametric or…

Functional Horseshoe Priors for Subspace Shrinkage

TLDR
A new shrinkage prior on function spaces, called the functional horseshoe (fHS) prior, is introduced that encourages shrinkage toward parametric classes of functions and achieves smaller estimation error and more accurate model selection than other procedures in several simulated and real examples.

Handling Sparsity via the Horseshoe

TLDR
This paper presents a general, fully Bayesian framework for sparse supervised-learning problems based on the horseshoe prior, which is a member of the family of multivariate scale mixtures of normals and closely related to widely used approaches for sparse Bayesian learning.
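
For reference, the horseshoe hierarchy is (with the global scale τ commonly given its own half-Cauchy prior, one standard choice):

```latex
\beta_i \mid \lambda_i, \tau \sim \mathcal{N}\!\bigl(0,\ \lambda_i^2 \tau^2\bigr), \qquad
\lambda_i \sim \mathrm{C}^{+}(0, 1), \qquad
\tau \sim \mathrm{C}^{+}(0, 1),
```

where C⁺(0, 1) denotes the standard half-Cauchy distribution. The heavy Cauchy tails of the local scales λ_i leave large signals essentially unshrunk, while the unbounded density at the origin shrinks small coefficients aggressively toward zero.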