• Corpus ID: 220381165

Information-theoretic convergence of extreme values to the Gumbel distribution

@article{Johnson2020InformationtheoreticCO,
  title={Information-theoretic convergence of extreme values to the Gumbel distribution},
  author={Oliver Johnson},
  journal={ArXiv},
  year={2020},
  volume={abs/2007.03569}
}
  • O. Johnson
  • Published 7 July 2020
  • Mathematics, Computer Science
  • ArXiv
We show how convergence to the Gumbel distribution in an extreme value setting can be understood in an information-theoretic sense. We introduce a new type of score function which behaves well under the maximum operation, and which implies simple expressions for entropy and relative entropy. We show that, assuming certain properties of the von Mises representation, convergence to the Gumbel can be proved in the strong sense of relative entropy. 
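The abstract's central objects can be made concrete. As a sketch from standard facts about the Gumbel law (not taken from the paper itself), the limit distribution and the relative-entropy sense of convergence it refers to are:

```latex
% Standard Gumbel distribution (location 0, scale 1): CDF and density.
\[
  F(x) = \exp\!\left(-e^{-x}\right), \qquad
  f(x) = e^{-x}\exp\!\left(-e^{-x}\right), \qquad x \in \mathbb{R}.
\]
% Its differential entropy, with Euler--Mascheroni constant \gamma:
\[
  h(f) = 1 + \gamma \approx 1.5772.
\]
% Writing g_n for the density of the normalized sample maximum,
% convergence "in the strong sense of relative entropy" means
\[
  D(g_n \,\|\, f)
  = \int_{-\infty}^{\infty} g_n(x)\,\log\frac{g_n(x)}{f(x)}\,dx
  \;\longrightarrow\; 0 \quad \text{as } n \to \infty.
\]
```

Convergence in relative entropy is strictly stronger than convergence in distribution, by Pinsker's inequality.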

References

SHOWING 1-10 OF 23 REFERENCES
Maximal Correlation and the Rate of Fisher Information Convergence in the Central Limit Theorem
  • O. Johnson
  • Mathematics
    IEEE Transactions on Information Theory
  • 2020
TLDR
It is proved that, assuming the eigenvalue of the operator associated with the Hirschfeld–Gebelein–Rényi maximal correlation satisfies a strict inequality, a rate of convergence and a strengthened form of monotonicity hold.
Rates of convergence for Rényi entropy in extreme value theory
  • Ali Saeb
  • Computer Science, Mathematics
  • 2010
TLDR
The rate of convergence for Rényi entropy is studied in the case of linearly normalized partial maxima of i.i.d. random variables with continuously differentiable density.
An entropic view of Pickands’ theorem
It is shown that distributions arising in the Rényi–Tsallis maximum entropy setting are related to the generalized Pareto distributions (GPD) that are widely used for modeling the tails of distributions.
Local limit theorem for sample extremes
Assuming von Mises type conditions, we can prove that the density of the normalized maximum of i.i.d. random variables converges to the density of the appropriate extreme value distribution in the Lp sense.
A note on entropies of l-max stable, p-max stable, generalized Pareto and generalized log-Pareto distributions
Limit laws of partial maxima of independent, identically distributed random variables under linear normalization are called extreme value laws or l-max stable laws, and those under power normalization are called p-max stable laws.
Rates of convergence towards the Fréchet distribution
We develop Stein's method for the Fréchet distribution and apply it to compute rates of convergence in distribution of renormalized sample maxima to the Fréchet distribution.
A bound for the error in the normal approximation to the distribution of a sum of dependent random variables
This paper has two aims, one fairly concrete and the other more abstract. In Section 3, bounds are obtained under certain conditions for the departure of the distribution of the sum of n terms of a
ENTROPY AND THE CENTRAL LIMIT THEOREM
An argument of Brown (1982) is extended to show that the Fisher information converges to the reciprocal of the variance.
Extreme Values, Regular Variation, and Point Processes
Contents: Preface * Preliminaries * Domains of Attraction and Norming Constants * Quality of Convergence * Point Processes * Records and Extremal Processes * Multivariate Extremes * References