Jun 24, 2019
University of Paris-Diderot
We first recall the definition of normality, which is a (very) weak form of randomness. We then consider dependencies among the digits of a real number's expansion in a given integer base, and we quantify precisely how much digit dependence can be allowed so that, still, almost all real numbers are normal. In some cases, we are even able to prove that, still, almost all real numbers are absolutely normal.
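As an illustrative sketch (not part of the talk itself): a real number is normal in base b when every block of k digits occurs in its expansion with asymptotic frequency 1/b^k. The snippet below, with function names of our own choosing, estimates single-digit frequencies for a prefix of Champernowne's constant 0.123456789101112..., which Champernowne proved is normal in base 10; the empirical frequencies approach 1/10 slowly.

```python
from collections import Counter

def champernowne_prefix(last_int):
    """Digits of Champernowne's constant 0.123456789101112..., obtained by
    concatenating the decimal expansions of 1, 2, ..., last_int."""
    return "".join(str(k) for k in range(1, last_int + 1))

def block_frequencies(digits, block_len=1):
    """Empirical frequency of every length-block_len block in the digit string."""
    total = len(digits) - block_len + 1
    counts = Counter(digits[i:i + block_len] for i in range(total))
    return {block: c / total for block, c in counts.items()}

# Concatenating 1..99999 gives 488,889 digits of the expansion.
digits = champernowne_prefix(99_999)
freqs = block_frequencies(digits, block_len=1)

# For a number normal in base 10, each single digit has limiting frequency 1/10.
deviation = max(abs(f - 1 / 10) for f in freqs.values())
print(f"max single-digit deviation from 1/10: {deviation:.4f}")
```

The same `block_frequencies` call with `block_len=2` would test the 100 two-digit blocks against the limiting frequency 1/100; normality requires this for every block length simultaneously.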