# Information-theoretic convergence of extreme values to the Gumbel distribution

@article{Johnson2020InformationtheoreticCO, title={Information-theoretic convergence of extreme values to the Gumbel distribution}, author={Oliver Johnson}, journal={ArXiv}, year={2020}, volume={abs/2007.03569} }

We show how convergence to the Gumbel distribution in an extreme value setting can be understood in an information-theoretic sense. We introduce a new type of score function which behaves well under the maximum operation, and which implies simple expressions for entropy and relative entropy. We show that, assuming certain properties of the von Mises representation, convergence to the Gumbel can be proved in the strong sense of relative entropy.
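The classical result behind this setting can be checked numerically: for i.i.d. Exp(1) variables, the maximum centred by log n converges in distribution to the standard Gumbel law, F(x) = exp(-exp(-x)). The sketch below is a minimal Monte Carlo illustration of that convergence (it is not the paper's information-theoretic argument; the sample sizes and grid are arbitrary choices):

```python
import math
import random

def normalized_max_exponential(n, rng):
    """Max of n iid Exp(1) draws, centred by log n (Gumbel normalization)."""
    m = max(rng.expovariate(1.0) for _ in range(n))
    return m - math.log(n)

def gumbel_cdf(x):
    """Standard Gumbel distribution function F(x) = exp(-exp(-x))."""
    return math.exp(-math.exp(-x))

def empirical_vs_gumbel(n=1000, trials=5000, seed=0):
    """Largest gap between the empirical CDF of the normalized maxima
    and the Gumbel CDF, evaluated on a coarse grid in [-2, 4]."""
    rng = random.Random(seed)
    samples = sorted(normalized_max_exponential(n, rng) for _ in range(trials))
    gap = 0.0
    for k in range(13):
        x = -2.0 + 0.5 * k
        emp = sum(1 for s in samples if s <= x) / trials  # empirical CDF
        gap = max(gap, abs(emp - gumbel_cdf(x)))
    return gap
```

For the exponential case the agreement is already close at moderate n, since (1 - e^{-x}/n)^n approaches e^{-e^{-x}} quickly; the residual gap is dominated by Monte Carlo error of order 1/sqrt(trials). The paper's contribution is to quantify this kind of convergence in the stronger sense of relative entropy rather than in distribution.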

## References

Showing 1–10 of 23 references

Maximal Correlation and the Rate of Fisher Information Convergence in the Central Limit Theorem

- Mathematics · IEEE Transactions on Information Theory
- 2020

It is proved that, assuming the eigenvalue of the operator associated with the Hirschfeld–Gebelein–Rényi maximal correlation satisfies a strict inequality, a rate of convergence and a strengthened form of monotonicity hold.

Rates of convergence for Rényi entropy in extreme value theory

- Computer Science, Mathematics
- 2010

The rate of convergence for Rényi entropy is studied in the case of linearly normalized partial maxima of i.i.d. random variables with continuously differentiable density.

An entropic view of Pickands’ theorem

- Mathematics · 2008 IEEE International Symposium on Information Theory
- 2008

It is shown that distributions arising in the Rényi–Tsallis maximum entropy setting are related to the generalized Pareto distributions (GPD) that are widely used for modeling the tails of distributions.…

Local limit theorem for sample extremes

- Mathematics
- 1982

Assuming von Mises type conditions, we prove that the density of the normalized maximum of i.i.d. random variables converges to the density of the appropriate extreme value distribution in the Lp…

A note on entropies of l-max stable, p-max stable, generalized Pareto and generalized log-Pareto distributions

- Mathematics
- 2012

Limit laws of partial maxima of independent, identically distributed random variables under linear normalization are called extreme value laws or l-max stable laws, and those under power normalization…

Some Inequalities Satisfied by the Quantities of Information of Fisher and Shannon

- Mathematics · Inf. Control.
- 1959

Rates of convergence towards the Fréchet distribution

- Mathematics
- 2013

We develop Stein's method for the Fréchet distribution and apply it to compute rates of convergence in distribution of renormalized sample maxima to the Fréchet distribution.

A bound for the error in the normal approximation to the distribution of a sum of dependent random variables

- Mathematics
- 1972

This paper has two aims, one fairly concrete and the other more abstract. In Section 3, bounds are obtained under certain conditions for the departure of the distribution of the sum of n terms of a…

Entropy and the Central Limit Theorem

- Mathematics
- 1986

An argument of Brown (1982) is extended to show that the Fisher information converges to the reciprocal of the variance.

Extreme Values, Regular Variation, and Point Processes

- Mathematics
- 1987

Contents: Preface * Preliminaries * Domains of Attraction and Norming Constants * Quality of Convergence * Point Processes * Records and Extremal Processes * Multivariate Extremes * References *…