• Corpus ID: 220381165

Information-theoretic convergence of extreme values to the Gumbel distribution

@article{Johnson2020InformationtheoreticCO,
  title={Information-theoretic convergence of extreme values to the Gumbel distribution},
  author={Oliver Johnson},
  journal={ArXiv},
  year={2020},
  volume={abs/2007.03569}
}
  • O. Johnson
  • Published 7 July 2020
  • Mathematics, Computer Science
  • ArXiv
We show how convergence to the Gumbel distribution in an extreme value setting can be understood in an information-theoretic sense. We introduce a new type of score function which behaves well under the maximum operation, and which implies simple expressions for entropy and relative entropy. We show that, assuming certain properties of the von Mises representation, convergence to the Gumbel can be proved in the strong sense of relative entropy. 
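
As a concrete illustration of the convergence described above (a minimal numerical sketch, not code from the paper), the following Python snippet computes the relative entropy D(f_n || g) between the exact density f_n of the normalized maximum of n i.i.d. Exp(1) variables and the standard Gumbel density g(x) = exp(-x - e^{-x}). The exponential is a classic member of the Gumbel domain of attraction, and M_n - log(n) has the exact distribution function (1 - e^{-x}/n)^n.

```python
import numpy as np
from scipy.integrate import quad

def max_exp_density(x, n):
    # Exact density of M_n - log(n), where M_n is the maximum of n i.i.d.
    # Exp(1) variables: f_n(x) = e^{-x} (1 - e^{-x}/n)^{n-1} for x > -log(n).
    if x <= -np.log(n):
        return 0.0
    t = np.exp(-x)
    return t * (1.0 - t / n) ** (n - 1)

def relative_entropy_to_gumbel(n):
    # Numerically integrate D(f_n || g) = \int f_n log(f_n / g) dx, where
    # g(x) = exp(-x - e^{-x}) is the standard Gumbel density.
    def integrand(x):
        f = max_exp_density(x, n)
        if f <= 0.0:
            return 0.0
        log_g = -x - np.exp(-x)  # log of the standard Gumbel density
        return f * (np.log(f) - log_g)
    value, _abserr = quad(integrand, -np.log(n), np.inf)
    return value

# Relative entropy to the Gumbel shrinks towards zero as n grows.
for n in (10, 100, 1000, 10000):
    print(f"n = {n:6d}   D(f_n || Gumbel) = {relative_entropy_to_gumbel(n):.3e}")
```

The computed values decrease towards zero as n grows, which is exactly the strong (relative entropy) mode of convergence the paper targets.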

References

SHOWING 1-10 OF 23 REFERENCES

Rates of convergence for Rényi entropy in extreme value theory

  • Ali Saeb
  • Computer Science, Mathematics
  • 2014
TLDR
The rate of convergence of Rényi entropy for linearly normalized partial maxima is studied; it is proved that the Rényi entropy of order β (β > 1) of the linearly normalized maximum of i.i.d. random variables with continuously differentiable density converges to that of the corresponding max-stable law.
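
For orientation, the Rényi differential entropy of order β referred to here has the standard definition (supplied for context, not quoted from the paper):

```latex
% Rényi differential entropy of order \beta (\beta > 0, \beta \neq 1);
% it recovers Shannon differential entropy in the limit \beta \to 1.
h_\beta(f) = \frac{1}{1 - \beta} \log \int f(x)^\beta \, dx,
\qquad
h_\beta(f) \to -\int f(x) \log f(x) \, dx \quad \text{as } \beta \to 1.
```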

Maximal Correlation and the Rate of Fisher Information Convergence in the Central Limit Theorem

  • O. Johnson
  • Mathematics
    IEEE Transactions on Information Theory
  • 2020
TLDR
It is proved that, assuming the relevant eigenvalue of the operator associated with the Hirschfeld–Gebelein–Rényi maximal correlation satisfies a strict inequality, a rate of convergence and a strengthened form of monotonicity hold.
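
The Hirschfeld–Gebelein–Rényi maximal correlation mentioned in this summary is the standard quantity (definition supplied for context):

```latex
% Maximal correlation of X and Y: the supremum runs over measurable
% f, g with 0 < \operatorname{Var} f(X), \operatorname{Var} g(Y) < \infty.
\rho_{\max}(X, Y) = \sup_{f, g} \operatorname{Corr}\bigl( f(X), g(Y) \bigr).
```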

Rates of convergence for Rényi entropy in extreme value theory

  • Ali Saeb
  • Computer Science, Mathematics
  • 2010
TLDR
The rate of convergence for Rényi entropy is studied in the case of linearly normalized partial maxima of i.i.d. random variables with continuously differentiable density.

Fisher information inequalities and the central limit theorem

We give conditions for an O(1/n) rate of convergence of Fisher information and relative entropy in the Central Limit Theorem. We use the theory of projections in L2 spaces and Poincaré inequalities.
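
For context, the two quantities whose O(1/n) decay is studied have the standard forms below, for a zero-mean, unit-variance random variable X with density f and the standard Gaussian density φ (definitions supplied here, not quoted from the abstract):

```latex
% Fisher information distance from the standard Gaussian, and
% relative entropy (Kullback–Leibler divergence) from the Gaussian.
J(X) = \int f(x) \left( \frac{f'(x)}{f(x)} + x \right)^{2} dx,
\qquad
D(f \,\|\, \varphi) = \int f(x) \log \frac{f(x)}{\varphi(x)} \, dx.
```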

An entropic view of Pickands’ theorem

It is shown that distributions arising in the Rényi–Tsallis maximum entropy setting are related to the generalized Pareto distributions (GPD) that are widely used for modeling the tails of distributions.
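
The generalized Pareto family referred to here has the standard form (supplied for context), with support x ≥ 0 for ξ ≥ 0 and 0 ≤ x ≤ -σ/ξ for ξ < 0:

```latex
% Generalized Pareto distribution with shape \xi and scale \sigma > 0;
% the \xi = 0 case is the exponential limit.
G_{\xi,\sigma}(x) = 1 - \left( 1 + \frac{\xi x}{\sigma} \right)^{-1/\xi},
\qquad
G_{0,\sigma}(x) = 1 - e^{-x/\sigma}.
```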

Local limit theorem for sample extremes

Assuming von Mises type conditions, we prove that the density of the normalized maximum of i.i.d. random variables converges to the density of the appropriate extreme value distribution in the Lp sense.

Information Theory and the Central Limit Theorem

This book provides a comprehensive description of a new method of proving the central limit theorem, through the use of apparently unrelated results from information theory. It gives a basic introduction to the concepts of entropy and Fisher information, and collects together standard results concerning their behaviour.

The convolution inequality for entropy powers

TLDR
An improved version of Stam's proof of Shannon's convolution inequality for entropy powers is presented; the n-dimensional case is obtained by mathematical induction from the one-dimensional case.
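
The inequality in question is Shannon's entropy power inequality; in its standard form, for independent random vectors X and Y with differential entropies h(X) and h(Y):

```latex
% Entropy power N(X) and Shannon's entropy power inequality.
N(X) = \frac{1}{2\pi e} \, e^{2 h(X)},
\qquad
N(X + Y) \ge N(X) + N(Y).
```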

Extreme value theory: an introduction

This treatment of extreme value theory is unique in book literature in that it focuses on some beautiful theoretical results along with applications. All the main topics covering the heart of the subject are introduced in a systematic fashion.

A note on entropies of l-max stable, p-max stable, generalized Pareto and generalized log-Pareto distributions

Limit laws of partial maxima of independent, identically distributed random variables under linear normalization are called extreme value laws or l-max stable laws, and those under power normalization are called p-max stable laws.
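
For context, the Fisher–Tippett–Gnedenko theorem says the l-max stable laws are, up to location and scale, of exactly three types (α > 0):

```latex
% The three extreme value types: Fréchet, Weibull and Gumbel.
\Phi_\alpha(x) = \exp(-x^{-\alpha}), \; x > 0; \qquad
\Psi_\alpha(x) = \exp\!\bigl(-(-x)^{\alpha}\bigr), \; x \le 0; \qquad
\Lambda(x) = \exp(-e^{-x}).
```

The Gumbel law Λ is the case treated information-theoretically in the main paper above.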