# Entropy polarization

Sometimes you see quantum popping up everywhere. I just did the opposite and gave a classical talk at a quantum workshop, part of an AMS meeting held at Northeastern University, which poured yet another avalanche of talks onto the Boston area. I spoke about the complexity of distributions, also featured in an earlier post, including a result I posted two weeks ago which gives a boolean function $f:\{0,1\}^{n}\to \{0,1\}$ such that the output distribution of any AC$^{0}$ circuit has statistical distance $1/2-1/n^{\omega (1)}$ from $(Y,f(Y))$ for uniform $Y\in \{0,1\}^{n}$. In particular, no AC$^{0}$ circuit can compute $f$ much better than guessing at random even if the circuit is allowed to sample the input itself. The slides for the talk are here.

The new technique that enables this result is one I've called entropy polarization. Roughly, for every AC$^{0}$ circuit mapping any number $L$ of input bits into $n$ output bits, there exists a small set $S$ of restrictions such that:

(1) the restrictions preserve the output distribution, and

(2) for every restriction $r\in S$, the output distribution of the circuit restricted by $r$ has min-entropy either $0$ or at least $n^{0.9}$. Whence polarization: the entropy becomes either very small or very large.

Such a result is trivial to prove, and useless, with $|S|=2^{n}$; the critical feature is that one can obtain a much smaller set $S$, of size $2^{n-n^{\Omega (1)}}$.
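The polarization statement above can be written schematically as follows. The weighting of the restrictions in item (1) and the suppressed dependence on circuit depth and size are my gloss, not spelled out in this post:

```latex
% Schematic form of the polarization lemma; the weighting of the
% restrictions and the hidden parameters are a gloss, not from the post.
\textbf{Lemma (entropy polarization, informal).}
\emph{Let $C\colon\{0,1\}^{L}\to\{0,1\}^{n}$ be an \textup{AC}$^{0}$ circuit.
There is a set $S$ of restrictions with $|S|\le 2^{\,n-n^{\Omega(1)}}$
such that}
\begin{enumerate}
  \item \emph{the output distribution of $C$ on a uniform input equals
        the distribution obtained by first sampling $r\in S$ (with
        suitable weights) and then running the restricted circuit
        $C|_{r}$ on a uniform input, and}
  \item \emph{for every $r\in S$, the output distribution of $C|_{r}$
        has min-entropy either $0$ or at least $n^{0.9}$.}
\end{enumerate}
```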

Entropy polarization can be used in conjunction with a previous technique of mine, which works for distributions of high min-entropy, to obtain the sampling lower bound above.

It would be interesting to see whether any of this machinery can yield a separation between quantum and classical sampling for constant-depth circuits, which is probably a reason why I was invited to give this talk.