Information properties of order statistics and spacings


We explore properties of the entropy, Kullback–Leibler information, and mutual information for order statistics. The probability integral transformation plays a pivotal role in developing our results. We provide bounds for the entropy of order statistics and results that relate the entropy ordering of order statistics to other well-known orderings of random variables. We show that the discrimination information between order statistics and the data distribution, the discrimination information among the order statistics, and the mutual information between order statistics are all distribution-free and computable from the distributions of the order statistics of samples from the uniform distribution. We also discuss information properties of spacings for uniform and exponential samples and provide a large-sample distribution-free result on the entropy of spacings. The results reveal interesting symmetries in the information orderings among order statistics.
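The distribution-free results rest on the probability integral transformation: the i-th order statistic of n i.i.d. Uniform(0,1) variables is Beta(i, n-i+1) distributed, so information quantities for uniform order statistics reduce to Beta-distribution computations. The following minimal sketch illustrates this (the values of n, i, and the Monte Carlo sample size are illustrative choices, not values from the paper):

```python
import numpy as np
from scipy.stats import beta

# Key fact: the i-th order statistic of n i.i.d. Uniform(0,1)
# variables follows a Beta(i, n - i + 1) distribution.
# n, i, and the Monte Carlo sample size below are illustrative.
n, i = 5, 2
rng = np.random.default_rng(0)

# Draw 200,000 uniform samples of size n and keep the i-th smallest.
u = np.sort(rng.random((200_000, n)), axis=1)[:, i - 1]

# Monte Carlo estimate of the differential entropy of the order
# statistic: the empirical mean of -log f under the Beta density.
h_mc = -np.mean(beta.logpdf(u, i, n - i + 1))

# Exact differential entropy of Beta(i, n - i + 1).
h_exact = beta.entropy(i, n - i + 1)

print(abs(h_mc - h_exact) < 0.05)
```

Agreement between the two values is a numerical check that entropy computations for uniform order statistics can be carried out entirely through the Beta distribution, which is the mechanism behind the distribution-free results stated above.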

DOI: 10.1109/TIT.2003.821973
