Penalized complexity priors for degrees of freedom in Bayesian P-splines

@article{Ventrucci2015PenalizedCP,
  title={Penalized complexity priors for degrees of freedom in Bayesian P-splines},
  author={Massimo Ventrucci and H{\aa}vard Rue},
  journal={Statistical Modelling},
  year={2015},
  volume={16},
  pages={429--453}
}
Abstract: Bayesian penalized splines (P-splines) assume an intrinsic Gaussian Markov random field prior on the spline coefficients, conditional on a precision hyper-parameter τ. Prior elicitation of τ is difficult. To overcome this issue, we aim to build priors on an interpretable property of the model that indicates the complexity of the smooth function to be estimated. Following this idea, we propose penalized complexity (PC) priors for the number of effective degrees of freedom. We present…
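The abstract's notion of "effective degrees of freedom" can be made concrete with a small numerical sketch. The snippet below is illustrative only (the function names and the identity-basis simplification are mine, not the paper's): it computes edf(τ) = tr[(I + τDᵀD)⁻¹], the trace of the smoother matrix for the B = I special case of a P-spline (a Whittaker-type smoother). As the precision τ grows, the effective dimension shrinks from n towards the dimension of the penalty's null space (2 for second-order differences), which is the interpretable quantity the PC prior is placed on.

```python
import numpy as np

def second_diff_matrix(n):
    """Second-order difference matrix D of shape (n-2, n)."""
    return np.diff(np.eye(n), n=2, axis=0)

def effective_df(tau, n):
    """Effective degrees of freedom tr[(I + tau * D'D)^{-1}] for a
    Whittaker-type smoother, the B = I special case of a P-spline.
    (Illustrative helper, not from the paper.)"""
    D = second_diff_matrix(n)
    S = np.linalg.inv(np.eye(n) + tau * (D.T @ D))
    return np.trace(S)

n = 50
for tau in [0.0, 1.0, 100.0, 1e6]:
    # edf decreases monotonically from n (no smoothing) towards 2
    # (the null space of second differences: linear functions).
    print(f"tau = {tau:>9}: edf = {effective_df(tau, n):6.2f}")
```

This makes the mapping τ ↦ edf(τ) explicit, which is exactly the reparameterization the paper exploits when eliciting a prior.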

Citations

Posterior Concentration Rates for Bayesian Penalized Splines
TLDR
This work studies posterior concentration rates for Bayesian penalized splines in a Gaussian nonparametric regression model and shows that posterior concentration at a near-optimal rate can be achieved if the hyperprior on the smoothing variance strikes a fine balance between oversmoothing and undersmoothing.
Bayesian Inference of Local Projections with Roughness Penalty Priors
TLDR
This study develops a fully Bayesian approach that can be used to estimate local projections using roughness penalty priors and applies the proposed approach to an analysis of monetary policy in the United States, showing that the roughness penalty priors successfully estimate the impulse response functions and improve the predictive accuracy of local projections.
Predictive Complexity Priors
TLDR
Predictive complexity priors are proposed: a functional prior, originally defined on the model outputs, that is obtained by comparing the model's predictions to those of a reference function via a change of variables.
Laplacian‐P‐splines for Bayesian inference in the mixture cure model
TLDR
Results show that the proposed Laplacian‐P‐splines mixture cure (LPSMC) methodology is an appealing alternative to MCMC for approximate Bayesian inference in standard mixture cure models.
PC priors for residual correlation parameters in one-factor mixed models
Lack of independence in the residuals from linear regression motivates the use of random effect models in many applied fields. We start from the one-way ANOVA model and extend it to a general class
Bayesian Computing with INLA: A Review
TLDR
The reasons for the success of the INLA approach, the R-INLA package, why it is so accurate, why the approximations are very quick to compute, and why LGMs make such a useful concept for Bayesian computing are discussed.
Bayesian Smooth-and-Match strategy for ordinary differential equations models that are linear in the parameters
TLDR
This work focuses on the class of techniques that use smoothing to avoid direct integration and, in particular, on a Bayesian Smooth-and-Match strategy that allows one to obtain the ODEs' solution while performing inference on models that are linear in the parameters.
...
...

References

SHOWING 1-10 OF 40 REFERENCES
Robust specification of the roughness penalty prior distribution in spatially adaptive Bayesian P-splines models
Bayesian P-Splines
P-splines are an attractive approach for modeling nonlinear smooth effects of covariates within the additive and varying coefficient models framework. In this article, we first develop a Bayesian
Penalising Model Component Complexity: A Principled, Practical Approach to Constructing Priors
TLDR
A new concept for constructing prior distributions is proposed that is invariant to reparameterisations, has a natural connection to Jeffreys’ priors, appears to have excellent robustness properties, and allows this approach to define default prior distributions.
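For a single precision parameter τ, the PC-prior construction summarized in this reference has a well-known closed form: an exponential prior on the distance scale d = σ = τ^(-1/2) induces, by a change of variables, π(τ) = (λ/2) τ^(-3/2) exp(-λ τ^(-1/2)), with λ set through a user statement P(σ > U) = α. A minimal sketch (the function name and the (U, α) calling convention here are illustrative, not a library API):

```python
import numpy as np

def pc_prior_precision(tau, U, alpha):
    """PC prior density for a precision tau: an Exponential(lam) prior
    on the distance scale d = sigma = tau^{-1/2}, changed to the tau
    scale; lam is calibrated so that P(sigma > U) = alpha.
    (Illustrative helper following the closed form in the PC-prior
    literature, not code from the cited paper.)"""
    lam = -np.log(alpha) / U
    return 0.5 * lam * tau ** (-1.5) * np.exp(-lam / np.sqrt(tau))

# Example: "the marginal standard deviation exceeds 1 with prob. 1%"
for tau in [0.1, 1.0, 10.0]:
    print(f"pi(tau={tau}) = {pc_prior_precision(tau, U=1.0, alpha=0.01):.4f}")
```

The (U, α) statement is what makes the prior interpretable: it is phrased on the standard-deviation scale rather than on the precision directly.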
Penalized Structured Additive Regression for Space-Time Data: A Bayesian Perspective
TLDR
Extensions of penalized spline generalized additive models for analyzing space-time regression data are proposed and studied from a Bayesian perspective using MCMC techniques.
Propriety of posteriors in structured additive regression models: Theory and empirical evidence
Conservative prior distributions for variance parameters in hierarchical models
Bayesian hierarchical models typically involve specifying prior distributions for one or more variance components. This is rather removed from the observed data, so specification based on expert
Prior distributions for variance parameters in hierarchical models (comment on article by Browne and Draper)
Various noninformative prior distributions have been suggested for scale parameters in hierarchical models. We construct a new folded-noncentral-t family of conditionally conjugate priors for
Flexible smoothing with B-splines and penalties
TLDR
The authors propose using a relatively large number of knots together with a difference penalty on the coefficients of adjacent B-splines, and show connections to the familiar spline penalty on the integral of the squared second derivative.
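The Eilers and Marx recipe summarized above, a rich B-spline basis plus a difference penalty on adjacent coefficients, amounts to a few lines of penalized least squares. A sketch assuming SciPy's `BSpline.design_matrix` for the basis (the helper name, knot placement, and default λ are my choices, not the paper's):

```python
import numpy as np
from scipy.interpolate import BSpline

def fit_pspline(x, y, n_knots=20, lam=1.0, degree=3):
    """Penalized least-squares P-spline fit in the Eilers & Marx style:
    many knots, cubic B-splines, second-order difference penalty on
    adjacent coefficients. (Illustrative sketch, not library code.)"""
    a, b = x.min(), x.max()
    # Knot vector: equally spaced interior knots, boundary knots repeated.
    t = np.r_[[a] * degree, np.linspace(a, b, n_knots), [b] * degree]
    B = BSpline.design_matrix(x, t, degree).toarray()   # (n, K) basis
    K = B.shape[1]
    D = np.diff(np.eye(K), n=2, axis=0)                 # second differences
    A = B.T @ B + lam * (D.T @ D)
    coef = np.linalg.solve(A, B.T @ y)
    fitted = B @ coef
    # Effective degrees of freedom: trace of the hat matrix.
    edf = np.trace(B @ np.linalg.solve(A, B.T))
    return fitted, edf

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.2, size=x.size)
fitted, edf = fit_pspline(x, y, lam=1.0)
print(f"effective df: {edf:.2f}")
```

Because the penalty acts on coefficient differences rather than on a derivative integral, the penalty matrix DᵀD is banded and cheap to form, which is the practical appeal of the approach.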
Bayesian inference for generalized linear mixed models.
TLDR
It is concluded that Bayesian inference is now practically feasible for GLMMs and provides an attractive alternative to likelihood-based approaches such as penalized quasi-likelihood.
Improved auxiliary mixture sampling for hierarchical models of non-Gaussian data
TLDR
An improved method of auxiliary mixture sampling that uses a bounded number of latent variables per observation leads to a substantial increase in sampling efficiency for highly structured models.
...
...