gk: An R Package for the g-and-k and Generalised g-and-h Distributions

@article{Prangle2020gkAR,
  title={gk: An R Package for the g-and-k and Generalised g-and-h Distributions},
  author={Dennis Prangle},
  journal={R J.},
  year={2020},
  volume={12},
  pages={7}
}
The g-and-k and (generalised) g-and-h distributions are flexible univariate distributions which can model highly skewed or heavy-tailed data through only four parameters: location and scale, and two shape parameters influencing the skewness and kurtosis. These distributions have the unusual property that they are defined through their quantile function (inverse cumulative distribution function) and their density is unavailable in closed form, which makes parameter inference complicated. This… 
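The quantile-function definition described above is easy to make concrete. The following is a minimal, self-contained Python sketch of the standard g-and-k quantile function (with the conventional constant c = 0.8) and inversion sampling; it is an illustration of the construction, not code from the gk package itself:

```python
import math
import random
from statistics import NormalDist

def gk_quantile(u, A, B, g, k, c=0.8):
    """g-and-k quantile function Q(u) for u in (0, 1).

    A and B are location and scale; g and k are the shape parameters
    controlling skewness and kurtosis; c = 0.8 is the usual convention.
    """
    z = NormalDist().inv_cdf(u)  # standard normal quantile of u
    skew = 1 + c * (1 - math.exp(-g * z)) / (1 + math.exp(-g * z))
    return A + B * skew * (1 + z * z) ** k * z

def gk_sample(n, A, B, g, k):
    """Draw n variates by inversion: apply Q to uniform draws."""
    return [gk_quantile(random.random(), A, B, g, k) for _ in range(n)]
```

With g = k = 0 the skewness and kurtosis adjustments vanish and Q(u) reduces to the N(A, B²) quantile function, which provides a quick sanity check. Simulation is cheap even though the density has no closed form, which is exactly why the simulation-based inference methods listed below apply.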

Testing a parameter restriction on the boundary for the g-and-h distribution: a simulated approach

We develop a likelihood-ratio test for discriminating between the g-and-h and the g distribution, which is a special case of the former obtained when the parameter h is equal to zero.


Estimating large losses in insurance analytics and operational risk using the g-and-h distribution

This paper introduces two estimation methods: a numerical maximum likelihood technique, and an indirect inference approach with a bootstrap weighting scheme that is computationally more efficient and provides better estimates than the maximum likelihood method in the case of extreme features in the data.

Statistical Inference for Generative Models with Maximum Mean Discrepancy

This work studies the theoretical properties of a class of minimum distance estimators for intractable generative models (statistical models whose likelihood is intractable but from which simulation is cheap), showing that they are consistent, asymptotically normal, and robust to model misspecification.
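The minimum-distance idea behind this line of work can be illustrated with the squared maximum mean discrepancy (MMD) between two samples under a Gaussian kernel. The sketch below is a generic, biased V-statistic estimate of MMD², not the specific estimator studied in the paper:

```python
import math

def gaussian_kernel(x, y, bandwidth=1.0):
    """Gaussian (RBF) kernel between two scalars."""
    return math.exp(-((x - y) ** 2) / (2 * bandwidth ** 2))

def mmd2(xs, ys, bandwidth=1.0):
    """Biased estimate of squared MMD between samples xs and ys.

    MMD^2 = E[k(x, x')] + E[k(y, y')] - 2 E[k(x, y)], with each
    expectation replaced by an average over all sample pairs.
    """
    k = lambda a, b: gaussian_kernel(a, b, bandwidth)
    xx = sum(k(a, b) for a in xs for b in xs) / len(xs) ** 2
    yy = sum(k(a, b) for a in ys for b in ys) / len(ys) ** 2
    xy = sum(k(a, b) for a in xs for b in ys) / (len(xs) * len(ys))
    return xx + yy - 2 * xy
```

A minimum-MMD estimator then picks the parameter whose simulated sample minimises this quantity against the observed data; identical samples give MMD² = 0 and well-separated samples give a large value.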

Bayes Calculations from Quantile Implied Likelihood

This article introduces a likelihood function that approximates the exact likelihood through its quantile function, and is defined by an asymptotic chi-square distribution based on confidence distribution theory.

Robust Bayesian Inference for Simulator-based Models via the MMD Posterior Bootstrap

This paper proposes a novel algorithm based on the posterior bootstrap and maximum mean discrepancy estimators that leads to a highly parallelisable Bayesian inference algorithm with strong robustness properties for simulators.

Componentwise approximate Bayesian computation via Gibbs-like steps

This work explores a Gibbs version of the ABC approach that runs component-wise approximate Bayesian computation steps aimed at the corresponding conditional posterior distributions, based on summary statistics of reduced dimension.

The Fixed Landscape Inference MethOd (flimo): an alternative to Approximate Bayesian Computation, faster by several orders of magnitude

The Fixed Landscape Inference MethOd (flimo) is introduced, a new likelihood-free inference method for continuous state-space stochastic models that applies deterministic gradient-based optimization algorithms to obtain a point estimate of the parameters by minimizing the distance between observed data and simulations, measured through prescribed summary statistics.
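The "fixed landscape" trick is that the random draws used by the simulator are generated once and reused for every candidate parameter, so the objective becomes a deterministic function of the parameter and ordinary optimizers apply. The sketch below illustrates this with a toy N(theta, 1) simulator (sampled by inversion) and a plain grid search in place of a gradient-based optimizer; all names are illustrative, not the flimo API:

```python
import random
from statistics import NormalDist, mean

def fixed_landscape_estimate(observed, n_sim=200, grid=None, seed=0):
    """Fixed-landscape sketch: draw the uniforms once, up front,
    and reuse them for every candidate theta, so the objective
    (squared distance between simulated and observed means) is
    deterministic in theta and can be minimized directly."""
    rng = random.Random(seed)
    zs = [NormalDist().inv_cdf(rng.random()) for _ in range(n_sim)]
    obs_mean = mean(observed)

    def objective(theta):
        sims = [theta + z for z in zs]   # simulate with the FIXED draws
        return (mean(sims) - obs_mean) ** 2

    grid = grid or [i / 10 for i in range(-50, 51)]
    return min(grid, key=objective)
```

Because the simulation noise is frozen, repeated calls with the same seed return exactly the same estimate, which is what allows deterministic optimization routines to be used at all.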

Discrepancy-based Inference for Intractable Generative Models using Quasi-Monte Carlo

The main results are sample complexity bounds demonstrating that, under smoothness conditions on the generator, QMC can significantly reduce the number of samples required to obtain a given level of accuracy when using three of the most common discrepancies: the maximum mean discrepancy, the Wasserstein distance, and the Sinkhorn divergence.

From Denoising Diffusions to Denoising Markov Models

A general framework is proposed which not only unifies and generalizes this approach to a wide class of spaces but also leads to an original extension of score matching.

References

Showing 1-10 of 27 references

Numerical maximum likelihood estimation for the g-and-k and generalized g-and-h distributions

Results indicate that sample sizes significantly larger than 100 should be used to obtain reliable estimates through maximum likelihood, and the appropriateness of using asymptotic methods is examined.

Bayesian estimation of g-and-k distributions using MCMC

MCMC, however, provides a simulation-based alternative for obtaining the maximum likelihood estimates of parameters of these distributions or for deriving posterior estimates of the parameters through a Bayesian framework.

Bayesian estimation of quantile distributions

Approximate Bayesian computation provides an alternative approach requiring only a sampling scheme for the distribution of interest, enabling easier use of quantile distributions under the Bayesian framework.
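The ABC approach mentioned here needs only the ability to simulate from the distribution of interest, which quantile distributions provide via inversion. A generic rejection-ABC sketch follows, using a toy N(theta, 1) simulator and the sample median as summary statistic; the function names and choices are illustrative, not the cited method:

```python
import random
import statistics

def rejection_abc(observed, simulate, prior_sample, n_draws, tol):
    """Minimal rejection ABC: keep prior draws whose simulated
    summary statistic (the sample median here) falls within tol
    of the observed one."""
    obs_stat = statistics.median(observed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()
        sim = simulate(theta, len(observed))
        if abs(statistics.median(sim) - obs_stat) <= tol:
            accepted.append(theta)
    return accepted

# Toy usage: infer the location of a N(theta, 1) model.
rng = random.Random(1)
observed = [rng.gauss(2.0, 1.0) for _ in range(50)]
post = rejection_abc(
    observed,
    simulate=lambda t, n: [rng.gauss(t, 1.0) for _ in range(n)],
    prior_sample=lambda: rng.uniform(-5.0, 5.0),
    n_draws=500, tol=0.5,
)
```

The accepted draws form an approximate posterior sample; the same loop works unchanged for a quantile distribution by swapping in its inversion sampler as `simulate`.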

Likelihood-free Bayesian estimation of multivariate quantile distributions

Estimating Quantile Families of Loss Distributions for Non-Life Insurance Modelling via L-Moments

A novel, efficient, and robust procedure based on L-moments is developed for estimating the parameters of this family of Tukey transform models; it is shown to be more efficient than current state-of-the-art estimation methods for such families of loss models while remaining simple to implement in practice.

EasyABC: performing efficient approximate Bayesian computation sampling schemes using R

This work introduces the R package ‘EasyABC’ that enables one to launch a series of simulations from the R platform and to retrieve the simulation outputs in an appropriate format for post‐processing, and implements several efficient parameter sampling schemes to speed up the ABC procedure.

An adaptive Metropolis algorithm

This paper introduces an adaptive Metropolis (AM) algorithm in which the Gaussian proposal distribution is updated along the process using the full information accumulated so far, and establishes that the algorithm has the correct ergodic properties.
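The adaptation described above can be sketched in one dimension: after a burn-in period, the Gaussian proposal scale is set from the empirical variance of the whole chain so far, scaled by 2.38² (the classic optimal-scaling constant) plus a small constant to keep the proposal non-degenerate. This is a simplified illustration of the AM idea, not the paper's exact algorithm:

```python
import math
import random

def adaptive_metropolis(log_target, x0, n_steps, adapt_after=100):
    """1-D adaptive Metropolis sketch: the proposal standard deviation
    is recomputed each step from the empirical variance of the chain
    history once adapt_after iterations have passed."""
    chain = [x0]
    x, lp = x0, log_target(x0)
    scale = 1.0
    for t in range(n_steps):
        if t > adapt_after:
            m = sum(chain) / len(chain)
            var = sum((c - m) ** 2 for c in chain) / len(chain)
            scale = math.sqrt(2.38 ** 2 * var + 1e-6)  # keep non-degenerate
        prop = random.gauss(x, scale)
        lp_prop = log_target(prop)
        if math.log(random.random()) < lp_prop - lp:  # MH accept/reject
            x, lp = prop, lp_prop
        chain.append(x)
    return chain
```

Because the proposal depends on the whole history, the chain is no longer Markovian, which is why establishing ergodicity (here and in the unbounded-domain paper below) requires a dedicated argument.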

Constructing Summary Statistics for Approximate Bayesian Computation: Semi-automatic ABC

This work shows how to construct appropriate summary statistics for ABC in a semi-automatic manner, and shows that the optimal summary statistics are the posterior means of the parameters, although these cannot be calculated analytically.

On the ergodicity of the adaptive Metropolis algorithm on unbounded domains

This paper describes sufficient conditions to ensure the correct ergodicity of the Adaptive Metropolis (AM) algorithm of Haario, Saksman, and Tamminen (9), for target distributions with a non-compact…