Corpus ID: 15989325

Calibration and Empirical Bayes Variable Selection

@inproceedings{US1997CalibrationAE,
  title={Calibration and Empirical Bayes Variable Selection},
  author={Edward I. George and Dean P. Foster},
  year={1997}
}
For the problem of variable selection for the normal linear model, selection criteria such as AIC, Cp, BIC and RIC have fixed dimensionality penalties. Such criteria are shown to correspond to selection of maximum posterior models under implicit hyperparameter choices for a particular hierarchical Bayes formulation. Based on this calibration, we propose empirical Bayes selection criteria that use hyperparameter estimates instead of fixed choices. For obtaining these estimates, both marginal and… 
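The fixed dimensionality penalties mentioned in the abstract can be made concrete with a small sketch (not from the paper; the data, coefficients, and variable names below are hypothetical): for a model with k selected coefficients, AIC charges 2 per coefficient, BIC charges log n, and RIC charges 2 log p, each added to a goodness-of-fit term.

```python
# Illustrative sketch, not from the paper: exhaustive subset selection
# under the fixed-penalty criteria AIC, BIC, and RIC for a normal
# linear model. Data and coefficients here are hypothetical.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 4
X = rng.normal(size=(n, p))
beta = np.array([2.0, 0.0, 1.5, 0.0])  # two true signals, two noise terms
y = X @ beta + rng.normal(size=n)

def rss(cols):
    """Residual sum of squares of the OLS fit on the given columns."""
    if not cols:
        return float(y @ y)
    Xs = X[:, list(cols)]
    resid = y - Xs @ np.linalg.lstsq(Xs, y, rcond=None)[0]
    return float(resid @ resid)

# Fixed penalty per selected coefficient, applied to n*log(RSS/n).
penalties = {"AIC": 2.0, "BIC": np.log(n), "RIC": 2.0 * np.log(p)}

selections = {}
for name, pen in penalties.items():
    best = min(
        (s for k in range(p + 1)
         for s in itertools.combinations(range(p), k)),
        key=lambda s: n * np.log(rss(s) / n) + pen * len(s),
    )
    selections[name] = set(best)
    print(name, "selects predictors", sorted(best))
```

With strong signals the criteria agree on including the true predictors; they differ mainly in how often the lighter AIC penalty admits spurious ones, which is the behavior the paper's hierarchical Bayes calibration reinterprets through implicit hyperparameter choices.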


Empirical Bayes vs. Fully Bayes Variable Selection
A Hierarchical Bayes Approach to Variable Selection for Generalized Linear Models
For the problem of variable selection in generalized linear models, we develop various adaptive Bayesian criteria. Using a hierarchical mixture setup for model uncertainty, combined with an
Bayesian Shrinkage Estimation and Model Selection
The proposed method eliminates the need for combinatorial search techniques over a discrete model space, converting the model to a Bayes argument and looking at the problem from a penalized least squares estimation angle.
Constrained empirical Bayes priors on regression coefficients
In the context of model uncertainty and selection, empirical Bayes procedures can have undesirable properties such as extreme estimates of inclusion probabilities (Scott and Berger, 2010) or
Variable Selection Properties of L1 Penalized Regression in Generalized Linear Models
Under this Bayesian framework, empirical and fully Bayes variable selection procedures related to Least Absolute Selection and Shrinkage Operator (LASSO) are developed and consistency of Lp penalized estimators in GLMs is established under regularity conditions.
Performance of Variable Selection Methods in Regression Using Variations of the Bayesian Information Criterion
It is concluded that the version that includes the Fisher Information often favors regression models having more predictors, depending on the scale and correlation structure of the predictor matrix.
Approximate Bayesian Model Selection with the Deviance Statistic
Bayesian model selection poses two main challenges: the specification of parameter priors for all models, and the computation of the resulting Bayes factors between models. There is now a large
Posterior model consistency in variable selection as the model dimension grows
This paper analyzes the consistency of the posterior model probabilities when the number of potential regressors grows as the sample size grows and finds that some classes of priors typically used in variable selection yield posterior model inconsistency, while mixtures of these priors improve this undesirable behavior.
Priors for Bayesian Shrinkage and High-Dimensional Model Selection
This dissertation investigates the asymptotic form of the marginal likelihood based on the nonlocal priors and shows that it attains a unique penalty term that adapts to the signal strength of the corresponding variable in the model, and remarks that this term cannot be attained from local priors such as Gaussian prior densities.
Multiple Testing, Empirical Bayes, and the Variable-Selection Problem
This paper studies the multiplicity-correction effect of standard Bayesian variable-selection priors in linear regression. The first goal of the paper is to clarify when, and how, multiplicity

References

Showing 1–10 of 34 references
Nonparametric regression using Bayesian variable selection
The risk inflation criterion for multiple regression
A new criterion is proposed for the evaluation of variable selection procedures in multiple regression. This criterion, which we call the risk inflation, is based on an adjustment to the risk.
APPROACHES FOR BAYESIAN VARIABLE SELECTION
This paper describes and compares various hierarchical mixture prior formulations of variable selection uncertainty in normal linear regression models. These include the nonconjugate SSVS formulation
Empirical Bayes Estimation in Wavelet Nonparametric Regression
This chapter uses an Empirical Bayes approach to estimate the hyperparameters for each level of the wavelet decomposition, bypassing the usual difficulty of hyperparameter specification in the hierarchical model.
Bayes Factors and Choice Criteria for Linear Models
SUMMARY Global and local Bayes factors are defined and their respective roles examined as choice criteria among alternative linear models. The global Bayes factor is seen to function, in appropriate
Fractional Bayes factors for model comparison
Bayesian comparison of models is achieved simply by calculation of posterior probabilities of the models themselves. However, there are difficulties with this approach when prior information about
Bayesian Model Averaging for Linear Regression Models
Abstract We consider the problem of accounting for model uncertainty in linear regression models. Conditioning on a single selected model ignores model uncertainty, and thus leads to the
Variable selection via Gibbs sampling
Abstract A crucial problem in building a multiple regression model is the selection of predictors to include. The main thrust of this article is to propose and develop a procedure that uses
The Schwarz criterion and related methods for normal linear models
In this paper we derive Schwarz's information criterion and two modifications for choosing fixed effects in normal linear mixed models. The first modification allows an arbitrary, possibly
Multiple shrinkage and subset selection in wavelets
This paper discusses Bayesian methods for multiple shrinkage estimation in wavelets. Wavelets are used in applications for data denoising, via shrinkage of the coefficients towards zero, and for data