Maximal Correlation and the Rate of Fisher Information Convergence in the Central Limit Theorem

  • O. Johnson
  • Published 28 May 2019
  • Mathematics
  • IEEE Transactions on Information Theory
We consider the behaviour of the Fisher information of scaled sums of independent and identically distributed random variables in the Central Limit Theorem regime. We show how this behaviour can be related to the second-largest non-trivial eigenvalue of the operator associated with the Hirschfeld–Gebelein–Rényi maximal correlation. We prove that, assuming this eigenvalue satisfies a strict inequality, an $O(1/n)$ rate of…
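The $O(1/n)$ rate can be seen in closed form for one concrete family (a worked illustration of our own choosing, not taken from the paper): with $X_i \sim \mathrm{Exp}(1)$, the sum $S_n \sim \mathrm{Gamma}(n,1)$ has location score $\rho(x) = (n-1)/x - 1$, Fisher information $1/(n-2)$ for $n > 2$, and hence standardized Fisher information distance $J_{st}(S_n) = \mathrm{Var}(S_n)\cdot I(S_n) - 1 = n/(n-2) - 1 = 2/(n-2) = O(1/n)$:

```python
def standardized_fisher_distance(n: int) -> float:
    """J_st(S_n) = Var(S_n) * I(S_n) - 1 = n/(n-2) - 1 = 2/(n-2)
    for S_n a sum of n i.i.d. Exp(1) variables (requires n > 2)."""
    return n / (n - 2) - 1

# n * J_st(S_n) tends to the constant 2, i.e. the distance decays like 2/n
for n in (10, 100, 1000):
    print(n, standardized_fisher_distance(n), n * standardized_fisher_distance(n))
```

Note that $J_{st}$ is invariant under the scaling $S_n/\sqrt{n}$, so the same values apply to the standardized sums considered in the abstract.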

Rates of Fisher information convergence in the central limit theorem for nonlinear statistics

We develop a general method to study the Fisher information distance in the central limit theorem for nonlinear statistics. We first construct explicit representations for the score functions. We then use…

Information-theoretic convergence of extreme values to the Gumbel distribution

It is shown that, assuming certain properties of the von Mises representation, convergence to the Gumbel can be proved in the strong sense of relative entropy, and a new type of score function is introduced which behaves well under the maximum operation.

A mutual information inequality and some applications

An inequality relating linear combinations of mutual information between subsets of mutually independent random variables and an auxiliary random variable is derived.

Information Fusion and Its Intelligent Sensing for Learning Intervention Model of Educational Big Data

  • Ying Pei, Gang Li
  • Computer Science
    Wireless Communications and Mobile Computing
  • 2021
This paper proposes using information fusion and intelligent sensing technology to exploit learning analytics: collecting, organizing, analyzing, and guiding the learning data that students generate during the learning process, and then generating interventions that can affect learning and improve students' learning methods.

Fisher information inequalities and the central limit theorem

We give conditions for an O(1/n) rate of convergence of Fisher information and relative entropy in the Central Limit Theorem. We use the theory of projections in L2 spaces and Poincaré…

Generalized Entropy Power Inequalities and Monotonicity Properties of Information

A simple proof of the monotonicity of information in central limit theorems is obtained, both in the setting of independent and identically distributed (i.i.d.) summands and in the more general setting of independent summands with variance-standardized sums.

Fisher information and the central limit theorem

An Edgeworth-type expansion is established for the relative Fisher information distance to the class of normal distributions of sums of i.i.d. random variables, satisfying moment conditions. The…

Entropy jumps for isotropic log-concave random vectors and spectral gap

We prove a quantitative dimension-free bound in the Shannon–Stam entropy inequality for the convolution of two log-concave distributions in dimension d, in terms of the spectral gap of the density.

An Information-Theoretic Proof of the Central Limit Theorem with Lindeberg Conditions

The proof is based on the "information functional" $I(X) = \int_{-\infty}^{+\infty} p(x)\log p(x)\,dx + \tfrac{1}{2}\log \mathbf{D}(X)$, $p(x)$ being the density of the random variable $X$. Some new…
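A quick numerical sanity check on this functional (our own illustration, not from the paper): for a Gaussian density the two terms combine to the constant $-\tfrac{1}{2}\log(2\pi e)$ regardless of the variance, which is the functional's minimal value, since the Gaussian maximizes entropy among densities with a given variance:

```python
import math

def information_functional_gaussian(sigma2: float) -> float:
    # integral of p*log(p) for N(mu, sigma2) equals -0.5*log(2*pi*e*sigma2)
    neg_entropy = -0.5 * math.log(2 * math.pi * math.e * sigma2)
    # add 0.5 * log of the variance D(X) = sigma2
    return neg_entropy + 0.5 * math.log(sigma2)

# the sigma2-dependence cancels: every Gaussian gives -0.5*log(2*pi*e)
print([round(information_functional_gaussian(s), 4) for s in (0.5, 1.0, 4.0)])
```

This scale-invariance is what makes the functional a natural distance-like quantity for the standardized sums in the CLT.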

Monotonicity of Entropy and Fisher Information: A Quick Proof via Maximal Correlation

A simple proof is given for the monotonicity of entropy and Fisher information associated to sums of i.i.d. random variables. The proof relies on a characterization of maximal correlation for partial…
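The characterization of maximal correlation for partial sums referred to here is, to our understanding, the Dembo–Kagan–Shepp result: for partial sums $S_m, S_n$ ($m \le n$) of i.i.d. variables with finite variance, the maximal correlation equals $\sqrt{m/n}$ and is attained by the linear (Pearson) correlation. A Monte Carlo sketch (our own illustration; the choice of Exp(1) summands and sample size is arbitrary):

```python
import math
import random

rng = random.Random(0)  # seeded for reproducibility
m, n, trials = 5, 10, 20000
pairs = []
for _ in range(trials):
    xs = [rng.expovariate(1.0) for _ in range(n)]
    pairs.append((sum(xs[:m]), sum(xs)))  # (S_m, S_n)

def pearson(pairs):
    mean_a = sum(a for a, _ in pairs) / len(pairs)
    mean_b = sum(b for _, b in pairs) / len(pairs)
    cov = sum((a - mean_a) * (b - mean_b) for a, b in pairs)
    var_a = sum((a - mean_a) ** 2 for a, _ in pairs)
    var_b = sum((b - mean_b) ** 2 for _, b in pairs)
    return cov / math.sqrt(var_a * var_b)

# both values should be close to sqrt(m/n) = sqrt(1/2) ≈ 0.707
print(pearson(pairs), math.sqrt(m / n))
```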

Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem

An Edgeworth-type expansion is established for the entropy distance to the class of normal distributions of sums of i.i.d. random variables or vectors, satisfying minimal moment conditions.

Information Theory And The Central Limit Theorem

This book provides a comprehensive description of a new method of proving the central limit theorem, through the use of apparently unrelated results from information theory. It gives a basic…

Monotonic Decrease of the Non-Gaussianness of the Sum of Independent Random Variables: A Simple Proof

A simplified proof is given using the relationship between non-Gaussianness and minimum mean-square error (MMSE) in Gaussian channels; the more general setting of non-identically distributed random variables is also treated.

Existence of Stein Kernels under a Spectral Gap, and Discrepancy Bound

Two general properties of the Stein discrepancy are established, holding whenever a Stein kernel exists: the Stein discrepancy is strictly decreasing along the CLT, and it controls the skewness of a random vector.