Parsimonious Parameterization of Age-Period-Cohort Models by Bayesian Shrinkage

@article{Venter2017PARSIMONIOUSPO,
  title={Parsimonious Parameterization of Age-Period-Cohort Models by Bayesian Shrinkage},
  author={Gary Venter and Sule {\"O}nsel Sahin},
  journal={ASTIN Bulletin},
  year={2017},
  volume={48},
  pages={89--110}
}
Abstract Age-period-cohort models used in life and general insurance can be over-parameterized, and actuaries have used several methods to avoid this, such as cubic splines. Regularization is a statistical approach for avoiding over-parameterization, and it can reduce estimation and predictive variances compared to MLE. In Markov Chain Monte Carlo (MCMC) estimation, regularization is accomplished by the use of mean-zero priors, and the degree of parsimony can be optimized by numerically… 
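The mechanics of mean-zero-prior shrinkage can be sketched outside the paper's MCMC setting: for a Gaussian likelihood, the MAP estimate under independent mean-zero normal priors is ridge regression, which pulls noisy coefficient estimates toward zero relative to MLE. The toy data and the penalty value below are illustrative assumptions, not the paper's model.

```python
import numpy as np

# Toy illustration (not the paper's model): regularization as MAP estimation.
# For a linear model y = X b + noise, a mean-zero normal prior on b with
# precision lam turns the MLE (OLS) into ridge regression, shrinking b toward 0.

rng = np.random.default_rng(0)
n, p = 50, 10
X = rng.normal(size=(n, p))
b_true = np.zeros(p)
b_true[:3] = [2.0, -1.0, 0.5]                  # only a few real effects
y = X @ b_true + rng.normal(scale=1.0, size=n)

b_mle = np.linalg.lstsq(X, y, rcond=None)[0]   # no prior (MLE)
lam = 5.0                                      # prior precision (assumed value)
b_map = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)  # mean-zero-prior MAP

# Shrinkage pulls the noisy coefficient vector toward zero:
assert np.linalg.norm(b_map) < np.linalg.norm(b_mle)
```

The closed-form ridge solution stands in for the MCMC posterior mean only as intuition; in the paper the degree of shrinkage (here `lam`) is what gets optimized numerically.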

Parameter Shrinkage for Joint Age-Period-Cohort Modeling of Related Datasets

TLDR
Joint modeling is done by shrinking the differences between corresponding parameters across datasets, increasing model parsimony and reducing prediction error as measured by penalized log-likelihood.

Regularized Regression for Reserving and Mortality Models

G. Venter, Asia-Pacific Journal of Risk and Insurance, 2018
Abstract Bayesian regularization, a relatively new method for estimating model parameters, shrinks estimates towards the overall mean. It has been proven to lower …

Modeling Mortality of Related Populations via Parameter Shrinkage

Parameter shrinkage is known to reduce fitting and prediction errors in linear models. When the variables are dummies for age, period, etc., shrinkage is more commonly applied to differences between …
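Why one might shrink differences rather than levels can be shown with a toy smoothing sketch: putting mean-zero priors on the first differences of a sequence of age effects pulls the fitted curve toward a constant without dragging its overall level toward zero. The difference-matrix setup and penalty value below are assumptions for illustration, not the paper's model.

```python
import numpy as np

# Sketch (assumed notation): age effects a_1..a_p observed with noise.
# Shrinking first differences a_{j+1} - a_j (rather than the levels a_j)
# smooths the curve; the MAP estimate solves (I + lam D'D) a = y.

rng = np.random.default_rng(1)
p = 40
ages = np.arange(p)
a_true = 5.0 + np.sin(ages / 6.0)              # smooth underlying age curve
y = a_true + rng.normal(scale=0.5, size=p)     # noisy per-age estimates

D = np.diff(np.eye(p), axis=0)                 # first-difference matrix, (p-1, p)
lam = 20.0                                     # difference-prior precision (assumed)
a_map = np.linalg.solve(np.eye(p) + lam * D.T @ D, y)

# The differences are shrunk, so the fit is smoother than the raw estimates:
assert np.sum(np.diff(a_map) ** 2) < np.sum(np.diff(y) ** 2)
```

Note the overall level (around 5.0 here) is untouched by the penalty, since a constant vector lies in the null space of `D`.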

Semiparametric Regression for Dual Population Mortality

Parameter shrinkage applied optimally can always reduce error and projection variances from those of maximum likelihood estimation. Many variables that actuaries use are on numerical scales, like age …

Bayesian model averaging for mortality forecasting using leave-future-out validation

Regularized Age-Period-Cohort Modeling of Opioid Mortality Rates

Opioid mortality rates have been increasing sharply, but not uniformly by age. Peak ages have recently dropped from the mid-40s to the mid-30s. There are two age peaks that have been moving up …

Generational Cohort Effects on Trends in the Drug Overdose Mortality Epidemic in the United States

TLDR
Bayesian Lasso is an effective way to parameterize APC mortality trends, and the inclusion of cohort effects provides an improved account of the trends in overdose mortality rates across this population, compared to a model with age and period effects only.
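The Bayesian lasso mentioned here corresponds to a Laplace (double-exponential) prior on each coefficient; in the special case of an orthonormal design its MAP estimate is soft-thresholding of the least-squares estimate, which zeros out small effects entirely. The function name and toy values below are illustrative, not from the paper.

```python
import numpy as np

# Sketch (assumed setup): Laplace-prior MAP shrinkage under an orthonormal
# design reduces to soft-thresholding the unpenalized estimates.

def soft_threshold(b, t):
    """Shrink each coefficient toward zero by t, snapping small ones to 0."""
    return np.sign(b) * np.maximum(np.abs(b) - t, 0.0)

b_mle = np.array([2.0, -0.3, 0.05, -1.5])   # unpenalized estimates (toy values)
b_lasso = soft_threshold(b_mle, t=0.5)      # shrink and sparsify

# Small coefficients become exactly zero; large ones move toward zero:
assert np.allclose(b_lasso, [1.5, 0.0, 0.0, -1.0])
```

This exact-zero behavior is what distinguishes lasso-type shrinkage from the ridge-type shrinkage of a normal prior, which shrinks but never eliminates parameters.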

Self-Assembling Insurance Claim Models Using Regularized Regression and Machine Learning

TLDR
The lasso performs well in modelling, identifying known features in the synthetic data, and tracking them accurately, despite complexity in those features that would challenge, and possibly defeat, most loss reserving alternatives.

Loss Reserving Using Estimation Methods Designed for Error Reduction

TLDR
The focus is on methodology, so projection to fill out the triangle is not addressed; this is usually straightforward, and software packages for Bayesian regularization make it easy to fit more complex models.

Parameter Shrinkage for Age-Period-Cohort Modeling of Opioid Mortality Rates

TLDR
This paper discusses an approach to doing Bayesian parameter shrinkage for age-period-cohort models, and applies it to fitting opioid mortality rates with a generalization of the Lee-Carter model including cohorts.

References

Showing 1-10 of 42 references

Bayesian Poisson log-bilinear models for mortality projections with multiple populations

Life insurers, pension funds, health care providers and social security institutions face increasing expenses due to continuing improvements of mortality rates. The actuarial and demographic …

A General Procedure for Constructing Mortality Models

Recently a large number of new mortality models have been proposed to analyze historic mortality rates and project them into the future. Many of these suffer from being over-parametrized or have …

A unified approach to mortality modelling using state-space framework: characterisation, identification, estimation and forecasting

TLDR
This paper explores alternative statistical representations and estimation approaches for dynamic mortality models, developing a class of Bayesian state-space models that incorporate a priori beliefs about mortality model characteristics as well as more flexible and appropriate assumptions about the heteroscedasticity present in observed mortality data.

The Application of Affine Processes in Multi-Cohort Mortality Model

Cohort effects have been identified in many countries. However, some mortality models only consider the modelling and projection of age-period effects. Others that incorporate cohort effects do not …

Modeling and forecasting U.S. mortality

Abstract Time series methods are used to make long-run forecasts, with confidence intervals, of age-specific mortality in the United States from 1990 to 2065. First, the logs of the age-specific …

Practical Bayesian model evaluation using leave-one-out cross-validation and WAIC

TLDR
An efficient computation of LOO is introduced using Pareto-smoothed importance sampling (PSIS), a new procedure for regularizing importance weights, and it is demonstrated that PSIS-LOO is more robust in the finite case with weak priors or influential observations.

A Quantitative Comparison of Stochastic Mortality Models Using Data From England and Wales and the United States

Abstract We compare quantitatively eight stochastic models explaining improvements in mortality rates in England and Wales and in the United States. On the basis of the Bayes Information Criterion …

On Measuring and Correcting the Effects of Data Mining and Model Selection

TLDR
The concept of generalized degrees of freedom (GDF) offers a unified framework under which complex and highly irregular modeling procedures can be analyzed in the same way as classical linear models, and many difficult problems can be solved easily.

A cohort-based extension to the Lee-Carter model for mortality reduction factors

Model Determination Using Predictive Distributions with Implementation via Sampling-Based Methods

TLDR
Model determination is divided into the issues of model adequacy and model selection, and it is proposed to validate conditional predictive distributions arising from single-point deletion against observed responses.