A couple of days ago I started and ended my workday attending very interesting seminars. At 9am I was watching Zeev Dvir talk about his new paper with Manik Dhar proving the Kakeya conjecture over Z/mZ for m squarefree. At 5, as the sun set, Bjorn Poonen was talking at IAS (ok, at his office at MIT, but virtually at IAS) about his work with Kedlaya, Kolpakov and Rubinstein on space vectors forming rational angles. Those two things sound pretty different, but to my surprise, a very similar question appeared at the heart of both papers! Here it is, phrased a bit vaguely:

Describe the N x N matrices with rank much less than N whose entries are all 0 or roots of unity.

Such matrices exist, of course; A could be 0, or A could be the rank 1 matrix $vw^T$, where v and w are vectors with root-of-unity entries. Or you could just make a block-diagonal matrix with each block of rank 1 as above. But are there more interesting cases? And what does this have to do with the results above? (**Update**: it occurs to me that you could pithily phrase the question as: what can we say about matrices of low rank and low height?)
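A quick numerical sketch of these "easy" constructions (my own toy example — the sizes and the choice of 6th roots of unity are arbitrary): a rank 1 matrix $vw^T$ with root-of-unity entries, and a block-diagonal matrix built from two such blocks.

```python
import numpy as np

def root_of_unity_vector(exponents, order):
    """Vector whose entries are order-th roots of unity zeta^e."""
    zeta = np.exp(2j * np.pi / order)
    return zeta ** np.array(exponents)

v = root_of_unity_vector([0, 1, 2], 6)
w = root_of_unity_vector([3, 4, 5], 6)
A = np.outer(v, w)  # rank 1, every entry a 6th root of unity

# Block-diagonal matrix with two rank-1 blocks: rank 2, entries 0 or roots of unity.
B = np.block([[A, np.zeros((3, 3))], [np.zeros((3, 3)), A]])

print(np.linalg.matrix_rank(A))   # 1
print(np.linalg.matrix_rank(B))   # 2
print(np.allclose(np.abs(A), 1))  # True: every entry of A lies on the unit circle
```

The interesting question, of course, is whether anything essentially different from such block constructions can occur.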

Let’s start with Poonen, where the connection is a little more direct. He and his collaborators are interested in tetrahedra whose face angles are all rational (that is, rational multiples of π). The unit vectors normal to those faces will thus have the property that

$u \cdot v = \cos(\pi q)$ for some rational q
for every pair of vectors u, v. Bjorn & co. decide to go further, asking for a classification of *all* sets of unit vectors in R^3 such that every pair forms a rational angle. If you have a set of k such vectors, you can put them together into a k × 3 matrix U, and the condition above says that the matrix UU^T (the so-called Gram matrix) has all entries of the form $\cos(\pi q)$ with q rational. But more is true: UU^T visibly has rank at most 3! Now UU^T doesn’t have entries that are roots of unity, but its entries are drawn from a very similar countable subset. Conversely, if you have a matrix with 1’s on the diagonal, all other entries rational cosines, and rank at most r, then you have found a subset of S^{r-1} which forms only rational angles. So this has the same flavor as the question above. And indeed one way you could obtain such a matrix is by finding a matrix A of rank less than r/2 with all entries roots of unity and then averaging A with its conjugate: the entries $(\zeta + \zeta^{-1})/2 = \cos(\pi q)$ are rational cosines, and the rank at most doubles.
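Here's a toy check of the Gram-matrix observation (my own example, not from the paper): five unit vectors in R^3 whose pairwise angles are all rational multiples of π — four in the xy-plane at multiples of π/6, plus a fifth orthogonal to them all. The 5 × 5 Gram matrix has entries of the form cos(πq) but rank only 3.

```python
import numpy as np

# Four unit vectors in the xy-plane at angles k * pi/6, plus e_3.
# All pairwise angles are rational multiples of pi by construction.
angles = [0, 1, 2, 5]
U = np.array([[np.cos(k * np.pi / 6), np.sin(k * np.pi / 6), 0.0] for k in angles]
             + [[0.0, 0.0, 1.0]])

G = U @ U.T  # 5x5 Gram matrix; entries are cos(pi * q) for rational q

print(G.shape)                   # (5, 5)
print(np.linalg.matrix_rank(G))  # 3, even though G is 5x5
```

The rank bound is just the factorization $G = UU^T$ through the 3-column matrix U, exactly as in the post.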

Now what about Zeev? He’s working on one of my favorite problems, Kakeya over rings. Of course it was Zeev himself who famously proved the Kakeya conjecture over finite fields, showing that if S is a subset of F_q^n containing a line in every direction, then |S| must be at least c_n q^n. This is meant to be an analogue of the original Kakeya problem from harmonic analysis, which asks whether a subset of **R**^n containing a unit line segment in every direction has to have full Hausdorff dimension. Dvir’s theorem proves this over finite fields; but actually, it proves more — that the Kakeya set has *positive measure* (namely, measure at least c_n, normalizing the whole space to have measure 1). That’s a problem if you like analogies, because the analysis analogue of that statement isn’t true! A famous example of Besicovitch shows that a set can have a unit line segment in every direction but have measure 0. (It’s hard to draw. Very spiky.) **R** and F_p are different!

One reason they’re different is that **R** has *scales* and F_p does not. Two distinct real numbers can be very close together or very far apart. In a finite field, two numbers are either the same or they’re not. A better analogue for **R** (of course I’d think this, I’m a number theorist) is the ring of p-adic integers, Z_p, which metrically looks a lot more like the reals. Even the finite ring Z/p^k Z has scales, though only k of them, not infinitely many. (For more about this, see this old post.)
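The "scales" in Z/p^k Z are measured by the p-adic valuation: two elements x, y are close when p divides x − y to a high power. A minimal illustration (my own sketch) of the k + 1 possible "distances" in Z/p^k Z:

```python
# p-adic valuation in Z/p^k Z: v_p(x) is the number of times p divides x,
# capped at k (the value we assign to 0 mod p^k).  Distinct elements x, y
# can be "close" (v_p(x - y) large) or "far" (v_p(x - y) = 0) -- unlike
# in F_p, where nonzero differences are all alike.
def vp(x, p, k):
    """p-adic valuation of x in Z/p^k Z (returns k if x == 0 mod p^k)."""
    x %= p ** k
    if x == 0:
        return k
    v = 0
    while x % p == 0:
        x //= p
        v += 1
    return v

p, k = 3, 4  # the ring Z/81Z
print(sorted({vp(x, p, k) for x in range(p ** k)}))  # [0, 1, 2, 3, 4]: k+1 scales
```

Over F_p the analogous set of valuations would be just {0, 1} — same or different, nothing in between.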

Dhar and Dvir don’t prove Kakeya over Z/p^k Z, a long-time goal of mine, but they *do* prove it over Z/mZ for m squarefree, which is very cool, and they lay out a very appealing strategy for proving it over R = Z/p^k Z, which I’ll now explain. Suppose S is a subset of R^n which contains a line in every direction. (By a “direction” let’s just mean a nonzero vector y in R^n; we could identify vectors that differ by a scalar but I don’t think that makes a real difference.) For each direction y, choose a line L_y in that direction whose p^k points are all contained in S. Now we have a linear map F_S from the C-vector space spanned by directions to the C-vector space spanned by S, defined by

$F_S([y]) = \sum_{s \in L_y} [s].$

And we have a map G from C[S] to the space spanned by R^n, which is just the discrete Fourier transform

$G([s]) = \sum_{v \in R^n} \zeta^{s \cdot v} [v],$

with $\zeta$ a primitive p^k-th root of unity in C.

What happens if we compose F_S with G? The points of L_y are just x + ty for some x in R^n, with t ranging over R. So

$G(F_S([y])) = \sum_{v \in R^n} \sum_{t \in R} \zeta^{(x + ty) \cdot v} [v] = \sum_{v \in R^n} \zeta^{x \cdot v} \left( \sum_{t \in R} \zeta^{t (y \cdot v)} \right) [v].$

When $y \cdot v$ is anything but 0, the sum over t cancels everything and you get a coefficient of 0. When $y \cdot v = 0$, on the other hand, the dependence on t in the summand disappears and you just get $p^k \zeta^{x \cdot v}$. Write M for the matrix expressing the linear transformation $G \circ F_S$.

To sum up: this composition has rank at most |S|, because it factors through C[S]. On the other hand, expressed as a matrix, M has (y, v) entry 0 whenever y and v are not orthogonal, and p^k times some p^k-th root of unity whenever y and v are orthogonal (the scalar p^k doesn’t affect the rank, so you can think of the entries as 0s and roots of unity). Dhar and Dvir ask, in the spirit of the question we started with: how small can the rank of such a matrix actually be? Any lower bound on the rank of M is a lower bound on the size of the Kakeya set S.
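Here's the whole construction in miniature (my own toy instance, with k = 1, so really the finite-field case): over R = Z/5Z in n = 2 dimensions, take for each direction y the line L_y through the origin (any choice of offsets x_y would do), and form the matrix M directly from its definition.

```python
import numpy as np
from itertools import product

p = 5
zeta = np.exp(2j * np.pi / p)
directions = [(1, b) for b in range(p)] + [(0, 1)]  # one y per direction: p + 1 of them
points = list(product(range(p), repeat=2))          # all of R^2

# M_{y,v} = sum over points s of L_y of zeta^(s.v)
M = np.zeros((len(directions), len(points)), dtype=complex)
for i, y in enumerate(directions):
    line = [((t * y[0]) % p, (t * y[1]) % p) for t in range(p)]  # L_y through 0
    for j, v in enumerate(points):
        M[i, j] = sum(zeta ** ((s[0] * v[0] + s[1] * v[1]) % p) for s in line)

# Structure check: entry is 0 unless y.v = 0, in which case it's p times a root of unity.
for i, y in enumerate(directions):
    for j, v in enumerate(points):
        dot = (y[0] * v[0] + y[1] * v[1]) % p
        if dot != 0:
            assert abs(M[i, j]) < 1e-9
        else:
            assert abs(abs(M[i, j]) - p) < 1e-9

print(np.linalg.matrix_rank(M))  # 6 = p + 1: here |S| >= rank(M) = number of directions
```

In this tiny example the rank bound only recovers |S| ≥ p + 1, of course; the game Dhar and Dvir play is to squeeze much stronger lower bounds out of the constrained 0/root-of-unity structure of M.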

*Is* that in the spirit of the question we started with? The fact that the order of the root of unity is known makes it feel a little different from the question arising for Bjorn, where bounding the order is part of the game. Your allowable entries are drawn from a finite set instead of an infinite one. And of course in this case you’re specifying *which* matrix entries are roots of unity and which are zero.

Still, I found it an enjoyably thematic day of math!

Hi Jordan,

Thanks for coming to the talk and for the nice post!

Another (very famous) problem related to these questions is the “Log-Rank Conjecture”. See here for a great survey. It basically asks if any low rank real matrix with 0/1 entries can be written as a sum of a small number of rank one matrices with entries also in 0/1 (in this case, monochromatic rectangles). Your post makes me wonder if the same should be true for complex matrices with entries either 0 or in some finite group G…

-zeev