Exercise, diet, and sleep improve brain skills (and health)

This semester I am teaching Theory of Computation to 80 undergraduates. I love the material, and so every minute is precious, but I decided to sacrifice a few of them for a quick illustration of the title. After all, I thought to myself, it *is* my job to know, use, and disseminate teaching techniques that improve the students’ performance. So why shouldn’t I tell them about the benefits of cardio exercise for learning? So this morning I scrambled together a few slides which you can see here.

I plan to add much more in future versions, but it is an understatement to say that I am not an expert in these areas. So I’d very much appreciate any pointers, especially to the landmark papers.

Paper X, on the ArXiv, keeps getting rejected

Paper X, on the ArXiv, keeps getting rejected. Years later, Paper Y comes along and makes progress on X, or does something closely related to X. Y may or may not cite X. Y gets published. Now X cannot get published, because the referees do not see what the contribution of X is: Y has been published, and in light of Y, X is not new.

The solution, in my opinion, following a series of earlier posts the last one of which is this, is to move the emphasis away from publication and towards ArXiv appearance. Citations should refer to the first available version, often the ArXiv one. Journals/conferences can still exist in this new paradigm: their main job would be to assign badges to ArXiv papers.

Obviously, this plan does not work for the entities behind current journals/conferences. So they enforce the status quo, and in the most degrading way: by forcing authors to fish out, maintain, and format useless references.

Hokuto No Ken and growing up in Italy

It seems that the Hokuto No Ken videogame that should have been made decades ago is finally being made. Thanks to Marco Genovesi for sending me this link. (More about Marco later on this blog.)

I consider watching the Hokuto No Ken series (excluding the more recent garbage) one of the most significant artistic experiences of my life, something that also makes me understand how some people can be so passionate about Dante or Homer. And if you grew up in Italy there is a special treat for you. You can watch a version where the words are dubbed, but the soundtrack and the screams are from the original Japanese. By contrast, the English-speaking audience can either watch the Japanese version with subtitles — and I always hate subtitles — or watch an English dubbed version. I once happened to get a glimpse of the latter and I was horrified: the masterful soundtrack had been replaced by a very cheap synth, not to mention the screams. Compare this to this.


This system provides an objective ranking of CS departments. It is the revamped version of a previous system, which I followed also because it did not include NEU. The new one does. But the reasons why I think having this information around is very valuable are slightly more subtle than the fact that NEU ranks 9th in “theory” (or 11th if you remove “logic and verification”), in both cases beating institutions which sit way above NEU in other rankings. Unobjective rankings tend to be sticky, and fail to reflect recent decay or growth. And if one still wants to ignore data, at least now one knows exactly what data is being ignored.

One dimension where I would like the system to be more flexible is the choice of venues to include in the count. For example, I think journals should be added. Among the conferences, CCC should be included, as the leading conference specializing in computational complexity. Ideally, the user should be allowed to select any weighted subset of journals and conferences.


I strongly recommend playing Oiligarchy, a brilliant online game about American politics from the point of view of the oil industry. You should play to the end to see which ending you get: there are four, though one of them may be unattainable. But at the very least make sure to get to a presidential election. Its depiction is memorable. (Incidentally, many other games on the website are well worth a look.)

Turning to what is unfortunately not a videogame, this time we have a candidate, let us call them candidate X, who many think is exceptionally unqualified to be President. I disagree in an unimportant way. I am not sure X really is more unqualified, or more dangerous, than some of the other candidates in recent history, including at least one who actually became President for two terms. The one distinctive feature of X seems to be that X is more colorful and more openly arrogant than other candidates. But I don’t find this to be a substantial difference. I certainly wouldn’t mind it too much if what X said made any sense at all.

I also suspect that X and their entourage know well that the chance of X winning the election is negligible, but want to rake in and maximize publicity, according to the old dictum that there is no bad publicity.

Bounded indistinguishability

Countless papers study the properties of k-wise independent distributions, which are distributions where any k bits are uniform and independent. One property of interest is which computational models are fooled by such distributions, in the sense that they cannot distinguish any such distribution from a uniformly random one. Recently, Bazzi’s breakthrough, discussed earlier on this blog, showed that k = polylog(n) independence fools any polynomial-size DNF on n bits.
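As a concrete warm-up (my own illustration, not taken from any of the papers above), here is the classic construction of a pairwise independent distribution: n = 2^m − 1 output bits generated from only m truly random seed bits, with a brute-force verification that every pair of output bits is jointly uniform.

```python
# A pairwise (2-wise) independent distribution on n = 2^m - 1 bits from
# only m random seed bits: index the output bits by the nonzero vectors
# v in {0,1}^m and set bit_v = <seed, v> mod 2.
from itertools import product

m = 3
nonzero = [v for v in product([0, 1], repeat=m) if any(v)]  # 7 bit indices

def sample(seed):
    # Inner product of the seed with each nonzero index vector, mod 2.
    return [sum(s * x for s, x in zip(seed, v)) % 2 for v in nonzero]

samples = [sample(seed) for seed in product([0, 1], repeat=m)]  # all 8 seeds

# Check: every pair of output bits is jointly uniform over {0,1}^2,
# i.e. each of the 4 patterns appears for exactly 1/4 of the seeds.
for i in range(len(nonzero)):
    for j in range(i + 1, len(nonzero)):
        for pattern in product([0, 1], repeat=2):
            count = sum(1 for s in samples if (s[i], s[j]) == pattern)
            assert count == len(samples) // 4
print("all", len(nonzero), "bits are pairwise independent")
```

Note that no pair of bits betrays the dependence, yet the whole string is very far from uniform: the distribution is supported on only 2^m of the 2^n possible strings.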

Let us change the question. Let us say that instead of one distribution we have two, and we know that any k bits are distributed identically, but not necessarily uniformly. We call such distributions k-wise indistinguishable. (Bounded independence is the special case when one distribution is uniform.) Can a DNF distinguish the two distributions? In fact, what about a single Or gate?
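To make the definition concrete, here is a minimal toy example of my own: already for k = 1 there are two 1-wise indistinguishable distributions on 2 bits that a single Or gate tells apart with advantage 1/2.

```python
# D1 is uniform on {00, 11}; D2 is uniform on {01, 10}.
# Each single bit is uniform under both distributions (1-wise
# indistinguishable), yet Or distinguishes them with advantage 1/2.
from fractions import Fraction

D1 = {(0, 0): Fraction(1, 2), (1, 1): Fraction(1, 2)}
D2 = {(0, 1): Fraction(1, 2), (1, 0): Fraction(1, 2)}

# 1-wise indistinguishable: the marginal of each bit matches.
for i in range(2):
    m1 = sum(p for x, p in D1.items() if x[i] == 1)
    m2 = sum(p for x, p in D2.items() if x[i] == 1)
    assert m1 == m2 == Fraction(1, 2)

# But Or tells them apart:
or1 = sum(p for x, p in D1.items() if any(x))  # Pr[Or = 1] under D1 is 1/2
or2 = sum(p for x, p in D2.items() if any(x))  # Pr[Or = 1] under D2 is 1
print("advantage of Or:", or2 - or1)  # prints 1/2
```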

This is the question that we address in a paper with Bogdanov, Ishai, and Williamson. A big thank you goes to my student Chin Ho Lee for connecting researchers who were working on the same problems on different continents. Here at NEU the question was asked to me by my neighbor Daniel Wichs.

The question turns out to be equivalent to threshold/approximate degree, an influential complexity measure that goes back to the works by Minsky and Papert and by Nisan and Szegedy. The equivalence is a good example of the usefulness of duality theory, and is as follows. For any boolean function f on n bits the following two are equivalent:

1. There exist two k-wise indistinguishable distributions that f tells apart with advantage e;

2. No degree-k real polynomial can approximate f to pointwise error at most e/2.
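The easy direction can be checked in miniature: if two distributions are k-wise indistinguishable, then every degree-k monomial, and hence every degree-k polynomial, has the same expectation under both; so a polynomial that pointwise approximates f to error e/2 would cap f’s distinguishing advantage at e. Here is a small numerical sanity check of this direction (my own sketch, for f = Or on 2 bits and k = 1):

```python
# D1 and D2 are 1-wise indistinguishable, so every degree-1 polynomial
# a0 + a1*x1 + a2*x2 has the same expectation under both. Since Or has
# advantage 1/2 between them, no degree-1 polynomial can approximate Or
# pointwise to error below 1/4.
import random

D1 = [(0, 0), (1, 1)]  # uniform over these points
D2 = [(0, 1), (1, 0)]  # uniform over these points

def expectation(poly, D):
    a0, a1, a2 = poly  # coefficients of a0 + a1*x1 + a2*x2
    return sum(a0 + a1 * x1 + a2 * x2 for x1, x2 in D) / len(D)

random.seed(0)
for _ in range(1000):
    poly = tuple(random.uniform(-10, 10) for _ in range(3))
    # Matching degree-1 moments force matching expectations.
    assert abs(expectation(poly, D1) - expectation(poly, D2)) < 1e-9
print("degree-1 expectations always agree")
```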

I have always liked this equivalence, but at times I felt slightly worried that it could be considered too “simple.” But hey, I hope my co-authors don’t mind if I disclose that the paper has been to four different conferences, and not one reviewer filed a complaint about that.

From the body of work on approximate degree one readily sees that bounded indistinguishability behaves very differently from bounded independence. For example, one needs k = Ω(√n) to fool an Or gate, and that is tight. Yes, to spell this out: there exist two distributions which are 0.001√n-wise indistinguishable, yet Or tells them apart with probability 0.999. But obviously even constant independence fools Or.

The biggest gap is achieved by the Majority function: constant independence suffices, by this, while linear indistinguishability is required by Paturi’s lower bound.

In the paper we apply this equivalence in various settings, and here I am just going to mention the design of secret-sharing schemes. Previous schemes like Shamir’s required the computation of things like parity, while the new schemes use different types of functions, for example functions computable in constant depth. Here we also rely on the amazing ability of constant-depth circuits to sample distributions, also pointed out earlier on this blog, and apply some expander tricks to trade alphabet size for other parameters.
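For contrast, here is a standard textbook sketch of Shamir’s t-out-of-n scheme over a prime field (my own toy code, not code from the paper): reconstruction requires Lagrange interpolation, a computation well beyond what constant-depth circuits can do.

```python
# Shamir t-out-of-n secret sharing over GF(p): shares are evaluations of
# a random degree-(t-1) polynomial whose constant term is the secret;
# any t shares recover it by Lagrange interpolation at 0.
import random

p = 2**13 - 1  # a small prime (8191) defining the field GF(p)

def share(secret, t, n):
    coeffs = [secret] + [random.randrange(p) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, p) for i, c in enumerate(coeffs)) % p)
            for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0; division is by Fermat inverse.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % p
                den = den * (xi - xj) % p
        secret = (secret + yi * num * pow(den, p - 2, p)) % p
    return secret

random.seed(1)
shares = share(1234, t=3, n=5)
assert reconstruct(shares[:3]) == 1234   # any 3 shares suffice
assert reconstruct(shares[1:4]) == 1234
print("reconstructed:", reconstruct(shares[2:5]))  # prints 1234
```

Any t − 1 shares of a random degree-(t − 1) polynomial reveal nothing about the constant term, while any t shares determine it, which is exactly the threshold behavior a secret-sharing scheme needs.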

The birthday paradox

The birthday paradox is the fact that if you sample t independent variables, each uniform in {1, 2, …, n}, then the probability that two will be equal is at least a constant independent of n when t ≥ √n. The word “paradox” refers to the fact that t can be as small as √n, as opposed to being closer to n. (Here I am not interested in the precise value of this constant as a function of t.)

The Wikipedia page lists several proofs of the birthday paradox where it is not hard to see why the √n bound arises. Nevertheless, I find the following two-stage approach more intuitive.

Divide the random variables into two sets of 0.5√n each. If two variables in the first set are equal, then we are done. So we can condition on this event not happening, which means that the variables in the first set are all distinct. Now take any variable in the second set. The probability that it is equal to some variable in the first set is 0.5√n∕n = 0.5∕√n. Hence, the probability that all the variables in the second set are different from those in the first is

(1 − 0.5∕√n)^(0.5√n) ≤ e^(−0.25) < 1.
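For the skeptical reader, here is a quick Monte Carlo sanity check (assuming nothing beyond the statement above) that with t = √n samples the collision probability is a constant, roughly 0.4, essentially independent of n:

```python
# Estimate the probability of a collision among t = sqrt(n) samples
# drawn uniformly and independently from {1, ..., n}.
import math
import random

random.seed(0)

def collision_prob(n, t, trials=5000):
    hits = 0
    for _ in range(trials):
        seen = set()
        for _ in range(t):
            v = random.randrange(n)
            if v in seen:  # two samples are equal
                hits += 1
                break
            seen.add(v)
    return hits / trials

for n in (10**2, 10**4, 10**6):
    print(n, collision_prob(n, math.isqrt(n)))  # each estimate is roughly 0.4
```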

You do not need to leave your room

You do not need to leave your room. Remain sitting at your table and listen. Do not even listen, simply wait. Do not even wait, be quite still and solitary. The world will freely offer itself to you to be unmasked, it has no choice, it will roll in ecstasy at your feet.

In this time of emphasis on collaborative, interdisciplinary, cross-fertilizing research, I find these words by Kafka refreshing.

How to buy a house

When I asked for a mortgage pre-approval the bank advised me to first find a house that I like, and then get back to them to prepare a pre-approval letter for the exact amount that I was going to offer. At the time I took the advice seriously, but now I can finally laugh with them at my naivety.

My idea of buying a house was that it is a slow process that works roughly as follows. First, you get pre-approved. Then you go to open houses, and you start making offers. If one is accepted you have a house inspection. Depending on the results, you go back to the negotiating table. After that is done, you get your mortgage, and finally you get the property. In a few years, you have made a lot of money.

This idea is completely wrong, and not only in the area where I have been looking.

First, if you need a mortgage you are already in big, big trouble. Most of the decent houses are bought for cash, even if another offer with a mortgage goes a lot higher. With a mortgage you only have access to a limited pool of problematic properties that nobody wants.

Second, the process is extremely fast. If you would like to sleep on it, the house will remain your dream house. Sales occur days or even hours after the listing is posted.

Third, if you need an inspection you are also already in trouble. Perhaps you worry a little about structural damage, asbestos, termites, radon gas, or an oil tank or other toxic waste buried in the yard? Then perhaps you should not be on the market. And don’t even think about lead paint, not to mention cell towers beaming right into your bedroom (more about this later on this blog). And if you ask to bring an inspector to the open house the answer is “no — we don’t want the inspector to disturb other potential buyers.” If you really want an expert opinion, short of becoming an inspector yourself, your only option seems to be to bring an inspector incognito: someone who pretends to be your sibling by engaging in casual conversation about how that picture over there looks just like that common ancestor of yours, but who later whispers to you estimates for repairing the water pump they spotted in a corner of the basement.

A good model for today’s buyer seems to be that of a derelict in a toxic wasteland who by chance suddenly becomes a billionaire and so decides to buy real estate in various “hot spots” of which they know nothing, just in case they happen to be there one day.

Here is how the only offer I made went. First, there was absolutely no way to visit the property before the open house. Then, at the open house, after one hour the seller’s broker announced that bids would close in thirty minutes. I actually scrambled together an offer. So hastily was this done that in hindsight I am very thankful that the seller did not even want to talk to us, despite the fact that we went over the asking price by 20%.

I have heard many calculations of the money you supposedly save by buying a house with a mortgage instead of renting. But all of these seem to omit an important point. Even if you do get the house, you will have to pay a lot more than if you had cash.

From the Simons Institute

Last year I had a great sabbatical at Harvard University, and this semester, thanks to the flexibility of the NEU administration, I am at the Simons Institute in Berkeley for their program on fine-grained algorithms and complexity. The institute is amazing, and there is an avalanche of interesting talks which is overwhelming, even by Boston standards.

As with all great things, one can wonder if the institute could be made even better. Here are two small thoughts about this. The first is about housing. I think it would be nice to have a block of houses managed by the institute. And I think I would think so even if my experience with finding housing hadn’t been what it has been. The starting point is that I could not find anything that suited me. Whatever the reason for this, it was definitely *not* a lack of support from the institute. At some point a good listing did come out, and I immediately grabbed it, but after a long exchange the landlord changed their mind. By that time it was very late in the game, and it was harder to find anything. Since I was about to spend more time looking for an apartment than actually living in it, I decided to go all-out and pay a rent that is ridiculous even for someone who has rented in New York City. Naturally, I ended up in a terrible place. For one thing, I happen to disagree with the landlord’s calculation that a human being needs less than 100 square feet to survive. Shame on me for not inquiring about the size of the unit. So, even though the market looked completely hopeless, I had to get back into it. Here again the institute’s staff was extremely supportive, and in a reversal of fortune they spotted an ad for a place which turned out to be fantastic.

The second is equipment. I was given a nice corner office, substantial financial support, a bike, a keyboard/mouse set, and a monitor, but not a computer. Admittedly, this is not a great loss, because setting up a new machine is almost as much hassle as bringing your own. This brings me to a point which has nothing to do with the institute, but which I’ll make here. What I would like are machines where you just need to push a button to have your exact workbench appear. This would be great even if it took a little installation time, as long as everything was automated, because you could just push that button ahead of time. Of course we do have Chromebooks, rollApp, Remote Desktop, et cetera, but none of this comes close to having your own settings. Indeed, today’s solution to the problem amounts to travelling with your own laptop — something which brings to my mind the expression “downfall of mankind.”

I am reminded of a Java project that I did during my undergraduate studies at La Sapienza. We had to program an application at home, and then show it to the professors in the department. My group was perhaps the only one which did not drag their own desktop computer in to run the application. So much for Java. We were quite proud of showing up with just a floppy disk, until our application did in a few minutes something that it had never done in days of intense domestic beta-testing.

Going back to the talks that we had, Zuckerman gave a talk about his impressive 2-source extractor with Eshan Chattopadhyay. One component of their construction is an extractor for n-bit sources where n − n^0.99 bits are polylog(n)-wise independent, and the other n^0.99 are an arbitrary function of the former, which answers a question raised in a paper of mine. Alexander Kulikov spoke about his better-than-3n lower bound for the circuit complexity of an explicit function, with Magnus Gausdal Find, Alexander Golovnev, and Edward Hirsch. Their strongest lower bounds are for computing extractors for degree-2 GF(2) polynomials, an object that is not currently available. And I talked about the Local Reductions developed with Jahanjou and Miles.