The final form of Tao's inequality relating conditional expectation and conditional mutual information
Recently, Terence Tao approached Szemerédi's Regularity Lemma from the perspectives of Probability Theory and Information Theory instead of Graph Theory, and found a stronger variant of this lemma, which involves a new parameter. To pass from an entropy formulation to an expectation formulation he found the following: let $Y$ and $X, X'$ be discrete random variables taking values in $\mathcal{Y}$ and $\mathcal{X}$, respectively, where $\mathcal{Y} \subset [-1, 1]$, and with $X' = f(X)$ for a (deterministic) function $f$. Then we have
$$\mathbb{E}\bigl(\lvert \mathbb{E}(Y \mid X') - \mathbb{E}(Y \mid X)\rvert\bigr) \leq 2\, I(X \wedge Y \mid X')^{\frac{1}{2}}.$$
We show that the constant $2$ can be improved to $(2\ln 2)^{\frac{1}{2}}$ and that this is the best possible constant.
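As an illustrative sanity check (not part of the paper), the Python sketch below verifies the improved inequality numerically on a small toy distribution. It assumes the conditional mutual information $I(X \wedge Y \mid X')$ is measured in bits, which is consistent with the constant $(2\ln 2)^{\frac{1}{2}}$; the particular joint distribution, the function $f$, and all variable names are invented for illustration only.

```python
import numpy as np

# Check E|E(Y|X') - E(Y|X)| <= (2 ln 2)^{1/2} * I(X ^ Y | X')^{1/2}
# on a toy joint distribution, with mutual information in bits.
rng = np.random.default_rng(0)

# X takes values 0..3, Y takes values in {-1, +1} (so Y lies in [-1, 1]),
# and X' = f(X) = X // 2 is a deterministic function of X.
pxy = rng.random((4, 2))
pxy /= pxy.sum()                      # joint distribution p(x, y), all entries > 0
y_vals = np.array([-1.0, 1.0])
xprime = np.arange(4) // 2            # f(x) for each x

px = pxy.sum(axis=1)                  # marginal p(x)
pxp = np.array([px[xprime == v].sum() for v in (0, 1)])  # marginal p(x')
ey_x = pxy @ y_vals / px              # E[Y | X = x]
ey_xp = np.array([(px * ey_x)[xprime == v].sum() / pxp[v]
                  for v in (0, 1)])   # E[Y | X' = x'], pooled over the fibre of f

# Left-hand side: E |E(Y|X') - E(Y|X)|
lhs = (px * np.abs(ey_xp[xprime] - ey_x)).sum()

# I(X ^ Y | X') in bits; since X' = f(X), p(x, y | x') = p(x, y)/p(x')
# whenever f(x) = x'.
info = 0.0
for x in range(4):
    v = xprime[x]
    for j in range(2):
        p_xy_given_xp = pxy[x, j] / pxp[v]
        p_x_given_xp = px[x] / pxp[v]
        p_y_given_xp = pxy[xprime == v, j].sum() / pxp[v]
        info += pxy[x, j] * np.log2(p_xy_given_xp / (p_x_given_xp * p_y_given_xp))

rhs = np.sqrt(2 * np.log(2) * info)
print(f"LHS = {lhs:.4f},  (2 ln 2)^(1/2) * I^(1/2) = {rhs:.4f}")
assert lhs <= rhs + 1e-12
```

Any joint distribution with $\mathcal{Y} \subset [-1, 1]$ and deterministic $X' = f(X)$ can be substituted above; the assertion should hold in every case if the stated constant is correct.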