
Maximal correlation and monotonicity of free entropy
Benjamin Dadoun and Pierre Youssef
arXiv: Operator Algebras
We introduce the maximal correlation coefficient $R(M_1,M_2)$ between two noncommutative probability subspaces $M_1$ and $M_2$ and show that the maximal correlation coefficient between the subalgebras generated by $s_n:=x_1+\ldots+x_n$ and $s_m:=x_1+\ldots+x_m$ equals $\sqrt{m/n}$ for $m\le n$, where $(x_i)_{i\in\mathbb{N}}$ is a sequence of free and identically distributed noncommutative random variables. This is the free-probability analogue of a result by Dembo--Kagan--Shepp in classical probability.
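The classical Dembo--Kagan--Shepp value $\sqrt{m/n}$ cited in the abstract can be checked numerically. The sketch below (not from the paper; function names are illustrative) computes the Pearson correlation of the partial sums $S_m$ and $S_n$ of i.i.d. variables, which attains the maximal correlation in the classical setting, and compares it with a Monte Carlo estimate:

```python
import math
import random

# Classical setting: for i.i.d. square-integrable X_1, X_2, ... with
# variance sigma^2, Cov(S_m, S_n) = m*sigma^2, Var(S_m) = m*sigma^2 and
# Var(S_n) = n*sigma^2 (m <= n), so corr(S_m, S_n) = m/sqrt(m*n) = sqrt(m/n).

def pearson_partial_sums(m: int, n: int) -> float:
    """Exact Pearson correlation of S_m and S_n for i.i.d. variables."""
    assert m <= n
    return m / math.sqrt(m * n)

def empirical_corr(m: int, n: int, trials: int = 50_000, seed: int = 0) -> float:
    """Monte Carlo estimate of corr(S_m, S_n) with i.i.d. uniform steps."""
    rng = random.Random(seed)
    xs, ys = [], []
    for _ in range(trials):
        steps = [rng.random() for _ in range(n)]
        xs.append(sum(steps[:m]))   # S_m
        ys.append(sum(steps))       # S_n
    mx, my = sum(xs) / trials, sum(ys) / trials
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / trials
    vx = sum((a - mx) ** 2 for a in xs) / trials
    vy = sum((b - my) ** 2 for b in ys) / trials
    return cov / math.sqrt(vx * vy)

print(pearson_partial_sums(2, 8))  # → 0.5, i.e. sqrt(2/8)
print(empirical_corr(2, 8))        # close to 0.5
```

The empirical value converges to $\sqrt{m/n}$ regardless of the step distribution, since only second moments enter the computation.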


Operator-valued distributions. I. Characterizations of freeness
Let $M$ be a $B$-probability space. Assume that $B$ itself is a $D$-probability space; then $M$ can be viewed as a $D$-probability space as well. Let $X$ be in $M$. We look at the question of relating…
A free analogue of Shannon's problem on monotonicity of entropy
We prove a free probability analogue of a result of [S. Artstein, K. Ball, F. Barthe, A. Naor, Solution of Shannon's problem on monotonicity of entropy, J. Amer. Math. Soc. 17 (2004) 975–982].
Solution of Shannon's problem on the monotonicity of entropy
It is shown that if $X_1, X_2, \ldots$ are independent and identically distributed square-integrable random variables, then the entropy of the normalized sum, $\mathrm{Ent}\big((X_1+\cdots+X_n)/\sqrt{n}\big)$, is an increasing function of $n$.
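In the notation of the entry above, and under the standing assumption that the $X_i$ are i.i.d. and square-integrable, the monotonicity statement can be displayed as:

```latex
\mathrm{Ent}\!\left(\frac{X_1+\cdots+X_n}{\sqrt{n}}\right)
\;\le\;
\mathrm{Ent}\!\left(\frac{X_1+\cdots+X_{n+1}}{\sqrt{n+1}}\right)
\qquad \text{for all } n \ge 1.
```

Here $\mathrm{Ent}$ denotes differential entropy; by the central limit theorem the normalized sums converge to a Gaussian, which maximizes entropy under a variance constraint.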
Monotonicity of Entropy and Fisher Information: A Quick Proof via Maximal Correlation
  • T. Courtade, Commun. Inf. Syst., 2016
A simple proof is given for the monotonicity of entropy and Fisher information associated to sums of i.i.d. random variables. The proof relies on a characterization of maximal correlation for partial…
On the multiplication of free N-tuples of noncommutative random variables
The semicircle law, free random variables, and entropy
Contents: Overview; Probability laws and noncommutative random variables; The free relation; Analytic function theory and infinitely divisible laws; Random matrices and the asymptotically free relation; Large…
The analogues of entropy and of Fisher's information measure in free probability theory, I
Analogues of the entropy and Fisher information measure for random variables in the context of free probability theory are introduced. Monotonicity properties and an analogue of the Cramér-Rao…
Remarks on the maximum correlation coefficient
The maximum correlation coefficient between partial sums of independent and identically distributed random variables with finite second moment equals the classical (Pearson) correlation coefficient.
Free probability and random matrices
In these lecture notes we present and focus on free probability as a toolbox for studying the spectra of polynomials in several (possibly random) matrices, and provide some applications.