And I have to be honest, whatever this may say about me: I felt an incredible warmth and safety and satisfaction, standing there, being clapped for and adored by a recording of a crowd. Reader, I stayed for a second cycle.

on the National Express.

Bristol, besides having lots of great mathematicians to talk to, is much lovelier than I knew. There’s lots of terrain! It seems every time you turn a corner there’s another fine vista of pastel-painted row houses and the green English hills far away. There’s a famous bridge. I walked across it, then sat on a bench at the other side doing some math, in the hopes I’d think of something really good, because I’ve always wanted to scratch some math on a British bridge, William Rowan Hamilton-style. Didn’t happen. There was a bus boycott in Bristol for civil rights because the bus companies didn’t allow black or Indian drivers; the bus lines gave in to the boycotters and integrated on the same day Martin Luther King, Jr. was saying “I have a dream” in Washington, DC. There’s a chain of tea shops in Bristol called Boston Tea Party. I think it’s slightly weird to have a commercial operation named after an anti-colonial uprising against your own country, but my colleagues said no one there really thinks of it that way. The University of Bristol, by the way, is sort of the Duke of the UK, in that it was founded by a limitless bequest from the biggest tobacco family in the country, the Willses. Bristol also has this clock:

Anyway, the point is, this instinctive response is wrong! At least it’s wrong if you interpret the question the way I have in mind, which is to ask: given a random curve X of genus g over F_q, *with g growing as q stays fixed*, is there a limiting probability that X has ordinary Jacobian? And this might not be 1, in the same way that the probability that a random polynomial over F_q is squarefree is not 1, but 1-1/q.
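If you want to convince yourself of the squarefree fact, it’s a five-minute brute-force computation; here’s a sketch in Python, using my own throwaway encoding (a polynomial over F_2 is an int whose bits are its coefficients — none of this comes from any paper mentioned here). For degree at least 1, f is squarefree iff gcd(f, f′) = 1, which remains true in characteristic 2.

```python
# Check numerically: the proportion of squarefree monic polynomials of
# degree n over F_q tends to 1 - 1/q. Here q = 2, and a polynomial is
# encoded as an int whose bit i is the coefficient of x^i.

def pdiv_rem(a, b):
    """Remainder of a mod b in F_2[x]."""
    while a and a.bit_length() >= b.bit_length():
        a ^= b << (a.bit_length() - b.bit_length())
    return a

def pgcd(a, b):
    """Euclidean algorithm in F_2[x]."""
    while b:
        a, b = b, pdiv_rem(a, b)
    return a

def deriv(f):
    """Formal derivative in F_2[x]: only odd-degree terms survive."""
    g, i = 0, 1
    while (1 << i) <= f:
        if f & (1 << i):
            g |= 1 << (i - 1)
        i += 2
    return g

def squarefree_fraction(n):
    """Fraction of the 2^n monic degree-n polynomials over F_2 that are squarefree."""
    count = 0
    for low in range(1 << n):
        f = (1 << n) | low
        if pgcd(f, deriv(f)) == 1:  # squarefree iff gcd(f, f') = 1, for deg f >= 1
            count += 1
    return count / (1 << n)
```

For every n ≥ 2 this returns exactly 0.5 = 1 − 1/2, matching the count q^n − q^(n−1) of squarefree monic polynomials of degree n.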

Bryden Cais, David Zureick-Brown and I worked out some heuristic guesses for this problem several years ago, based on the idea that the Dieudonné module for a random curve might be a random Dieudonné module, and then working out in some detail what in the Sam Hill one might mean by “random Dieudonné module.” Then we did some numerical experiments which showed that our heuristic looked basically OK for plane curves of high degree, but pretty badly wrong for hyperelliptic curves of high genus. But there was no family of curves for which one could *prove* either that our heuristic was right or that it was wrong.

Now there is, thanks to my Ph.D. student Soumya Sankar. Unfortunately, there are still no families of curves for which our heuristics are provably right. But there are now several for which they are provably wrong!

15.7% of Artin-Schreier curves over F_2 (that is: Z/2Z-covers of P^1/F_2) are ordinary. (The heuristic proportion given in my paper with Cais and DZB is about 42%, which matches data drawn from plane curves reasonably well.) The reason Sankar can prove this is that, for Artin-Schreier curves, you can test ordinarity (or, more generally, compute the a-number) in terms of the numerical invariants of the ramification points; the a-number doesn’t care *where* the ramification points are, which would be a more difficult question.

On the other hand, 0% of Artin-Schreier curves over F_q are ordinary for any finite field F_q of odd characteristic! What’s going on? It turns out that it’s only in characteristic 2 that the Artin-Schreier locus is irreducible; in larger characteristics, the locus has irreducible components whose number grows with genus, and the ordinary curves live on only one of these components. This “explains” the rarity of ordinarity (though this fact alone doesn’t prove that the proportion of ordinary curves goes to 0; Sankar does that another way). Natural question: if you just look at the ordinary component, does the proportion of ordinary curves approach a limit? Sankar shows this proportion is bounded away from 0 in characteristic 3, but in larger characteristics the combinatorics get complicated! (All this stuff, you won’t be surprised to hear, relies on Rachel Pries’s work on the interaction of special loci in M_g with the Newton stratification.)

Sankar also treats the case of superelliptic curves y^n = f(x) in characteristic 2, which turns out to be like that of Artin-Schreier curves in odd characteristic: lots of components, only one with ordinary points, and the probability of ordinarity going to zero.

Really nice paper which raises lots of questions! What about more refined invariants, like the shape of the Newton polygon? What about other families of curves? I’d be particularly interested to know what happens with trigonal curves, which (at least in characteristic not 2 or 3, and maybe even then) feel more “generic” to me than curves with extra endomorphisms. Is there any hope for our poor suffering heuristics in a family like that?

The pitching is terrible. 6.15 ERA in the early going, a half-run worse than anyone else in the league; flashes of goodness from Hess, Cashner, and Means, all of whom *could* be OK, but there’s no real reason for confidence that any of them will be. And of course the team could make the choice, as they did last year, to flip Mancini, Means, and anybody else who’s producing for prospects at midsummer and lose their last 70 games; who knows? But for now: why not?

From the publisher it was $41, with free shipping.

I think it really did use to be true that the Amazon price was basically certain to be the best price. Not anymore. Shop around!

Everybody says rural America is collapsing. But I keep going to places with more moral coherence and social commitment than we have in booming urban areas. These visits prompt the same question: How can we spread the civic mind-set they have in abundance?

For example, I spent this week in Nebraska, in towns like McCook and Grand Island. These places are not rich. At many of the schools, 50 percent of the students receive free or reduced-cost lunch. But they don’t have the pathologies we associate with poverty.

Maybe that’s because those places aren’t high in poverty! The poverty rate in McCook is 9.6%; in Grand Island it’s 15%. The national rate is 12.3%. Here’s a Census page with those numbers. What about the lunches? 50 percent of students receiving free or reduced-price lunch sounds like a lot, unless you know that slightly more than half of *all* US public school students are eligible for free and reduced-price lunch. (Brooks says “receive,” not “are eligible for,” but it’s the latter statistics that are widely reported and I’m guessing that’s what he means; apologies if I’m wrong.)

Crime is low. Many people leave their homes and cars unlocked.

Is it? And do they? I didn’t immediately find city-level crime data that looked rock solid to me, but if you trust city-data.com, crime in Grand Island roughly tracks national levels while crime in McCook is a little lower. And long-time Grand Island resident Gary Christensen has a different take than Brooks does:

Gary Christensen, a Grand Island resident for over 68 years, says times are changing.

“It was a community that you could leave you doors open leave the keys in your car and that kind of thing, and nobody ever bothered it. But those days are long gone,” said Gary Christensen, resident.

One way you can respond to this is to say I’m missing the point of Brooks’s article. Isn’t he just saying civic involvement is important and it’s healthy when people feel a sense of community with their neighbors? Are the statistics really that important?

Yes. They’re important. Because what Brooks is really doing here is inviting us to lower ourselves into a warm comfortable stereotype: that the places where the civic virtues are to be found in full bloom, where people are “just folks,” are the rural parts of Nebraska, not New Orleans, or Seattle, or Laredo, or Madison, and *most definitely* not Brooklyn or Brookline or Bethesda. But he can’t just say “you know how those people are.” There needs to be some vaguely evidentiary throat-clearing before you launch into what you were going to say anyway.

Which is that Nebraska people are simple dewy real Americans, not like *you,* urbanized coastal reader of the New York Times. I don’t buy it. McCook, Nebraska sounds nice; but it sounds nice in the same way that urbanized coastal communities are nice. You go someplace and talk to a guy who’s on the city council, you’re gonna be talking to a guy who cares about his community and thinks a lot about how to improve it. Even in Bethesda.

Constantly they are thinking: Does this help my town or hurt it? And when you tell them that this pervasive civic mind-set is an unusual way to be, they look at you blankly because they can’t fathom any other.

There’s Brooks in a nutshell. The only good people are the people who *don’t know any better than to be good.* By saying so, he condescends to his subjects, his readers, and himself all at once. I don’t buy it. I’ll bet people in southwest Nebraska can fathom a lot more than Brooks thinks they can. I think they probably fathom David Brooks better than he fathoms them.

Get the bullshit out of the camp and set it on fire. Not every parsha offers such actionable and contemporary advice!

This works pretty well! The main axis of variation (horizontal here) is Soglin vote, which is higher on the left and lower on the right; this vector is negatively weighted on Rhodes-Conway and Shukla but doesn’t pay much attention to Cheeks. The vertical axis mostly ignores Shukla and represents Cheeks taking votes from Rhodes-Conway at the top, and losing votes to Rhodes-Conway at the bottom. You can see a nice cluster of Isthmus and Near West wards in the lower right; Rhodes-Conway did really well there. 57 and 48 are off by themselves in the upper right corner; those are student wards, distinguished in the vote count by Grumpy Old Incumbent Paul Soglin getting next to no votes. And I mean “next to no” in the literal sense; he got one vote in each of those wards!

You can also do some off-the-shelf k-means clustering of those vectors in R^4 and you get meaningful results there. Essentially arbitrarily I broke the wards into 5 clusters and got:

[28, 29, 30, 32, 39, 40, 41, 42, 44, 45, 51, 52, 53, 62, 63, 64, 65, 81, 82, 105]

(east side Isthmus and near West)

[3, 4, 5, 7, 9, 10, 11, 17, 18, 22, 23, 24, 26, 38, 75, 88, 89, 90, 94, 96, 106, 107, 110, 111]

(far east and far west)

[15, 43, 46, 47, 48, 49, 50, 55, 56, 57, 58, 59, 60, 61, 66, 68, 69]

(campus and south Park)

[2, 12, 13, 14, 16, 21, 31, 33, 34, 35, 36, 37, 67, 80, 83, 84, 85, 86, 87, 93, 108, 109]

(west side, Hill Farms, north side, east of Monona)

[1, 6, 8, 19, 20, 25, 70, 71, 72, 73, 74, 76, 77, 78, 79, 91, 92, 95, 97, 98, 99, 100, 101, 102, 103, 104]

(southwest Madison and south of Monona)

Now what would be interesting is to go back and compare this with the ward-by-ward results of the gubernatorial primary last August! But I have other stuff to do today. Here’s some code so I remember it; this stuff is all simple and I have made no attempt to make the analysis robust.

**Update:** I did the comparison with the August primary; interestingly, I didn’t see very many strong relationships. Soglin-for-mayor wards were typically also Soglin-for-governor wards. Wards that were strong for Kelda Helen Roys were also strong for Raj Shukla and weak for Soglin, but there wasn’t a strong relationship between Roys vote and Rhodes-Conway vote. On the other hand, Rhodes-Conway’s good wards also tended to be good ones for… Mike McCabe??

import csv
import numpy as np
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

S = csv.reader(open('MadisonPrimaryFeb19.csv', 'r'))
Wards = [s for s in S]

# votes for Rhodes-Conway, Soglin, Shukla, Cheeks in that order,
# with the ward number appended as a label in the last column
Alabel = np.array([[int(s[1]), int(s[2]), int(s[4]), int(s[5]), i + 1] for i, s in enumerate(Wards[11:122])])

# strip out small districts (fewer than 20 total votes)
indices = [i for (i, row) in enumerate(Alabel) if sum(row[0:4]) < 20]
Astripped = np.delete(Alabel, indices, 0)

# scale each ward's vote vector to unit length (the label column stays out of the norm)
Anorm = np.array([row[0:4] / (sum(row[0:4]**2)**0.5) for row in Astripped])

# here's the PCA
wards2d = np.transpose(PCA(n_components=2).fit_transform(Anorm))
plt.scatter(wards2d[0], wards2d[1], s=0)
for i in range(len(Astripped)):
    plt.annotate(Astripped[i][4], (wards2d[0][i], wards2d[1][i]))
plt.show()

# and here's the kmeans stuff
kmeans = KMeans(n_clusters=5).fit(Anorm)
kmeans.labels_

Here’s the deal. You want to build some component out of metal, which metal is to be carved out of a solid block. So you can think of the problem as: you start with a region V in R^3, and your component is going to be some subregion W of V. For each choice of W there’s some measure of “compliance” which you want to minimize; maybe it’s fragility, maybe it’s flexibility, I dunno, depends on the problem. (Sidenote: I think lay English speakers would want “compliance” to refer to something you’d like to *maximize*, but I’m told this usage is standard in engineering.) (Subsidenote: I looked into this and now I get it: compliance literally refers to flexibility; it is the inverse of stiffness, just like in the lay sense. If you’re a doctor you want your patient to comply with their medication schedule, thus bending to outside pressure, but bending to outside pressure is precisely what you do *not* want your metal widget to do.)

So you want to minimize compliance, but you also want to minimize the weight of your component, which means you want vol(W) to be as small as possible. These goals are in conflict. Little lacy structures are highly compliant.

It turns out you can estimate compliance by breaking W up into a bunch of little hexahedral regions, computing compliance on each one, and summing. For reasons beyond my knowledge you definitely don’t want to restrict to chopping uniformly into cubes. So a priori you have millions and millions of differently shaped hexahedra. And part of the source of Suresh’s speedup is to gather these into approximate congruence classes so you can do a compliance computation for a whole bunch of nearly congruent hexahedra at once. And here’s where the solid geometry comes in; an old theorem of Cauchy tells you that if you know what a convex polyhedron’s 1-skeleton looks like as a graph, and you know the congruence classes of all the faces, you know the polyhedron up to rigid motion. In particular, you can just triangulate each face of the hexahedron with a diagonal, and record the congruence class by 18 numbers (the 12 edge lengths and the 6 face diagonals), which you can then record in a hash table. You sort the hashes and then you can instantly see your equivalence classes of hexahedra.
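Here’s a minimal sketch of that hashing step. Everything here is my own guess at an implementation, not Suresh’s actual code: the vertex-ordering convention, the tolerance, and the function names are all invented for illustration.

```python
import numpy as np
from collections import defaultdict

# Hypothetical vertex-ordering convention for a hexahedron:
# bottom face 0-1-2-3, top face 4-5-6-7, with vertex i+4 above vertex i.
EDGES = [(0, 1), (1, 2), (2, 3), (3, 0),   # bottom face edges
         (4, 5), (5, 6), (6, 7), (7, 4),   # top face edges
         (0, 4), (1, 5), (2, 6), (3, 7)]   # vertical edges
# one fixed diagonal per face triangulates all six faces
FACE_DIAGONALS = [(0, 2), (4, 6), (0, 5), (1, 6), (2, 7), (3, 4)]

def congruence_key(verts, tol=1e-6):
    """The 18 edge and face-diagonal lengths, quantized to tol, as a hashable key.
    By Cauchy's theorem these determine the hexahedron up to rigid motion."""
    v = np.asarray(verts, dtype=float)
    lengths = [np.linalg.norm(v[a] - v[b]) for a, b in EDGES + FACE_DIAGONALS]
    return tuple(int(round(L / tol)) for L in lengths)

def bucket_hexahedra(hex_list, tol=1e-6):
    """Group hexahedra (each a list of 8 vertices) into approximate congruence classes."""
    buckets = defaultdict(list)
    for i, verts in enumerate(hex_list):
        buckets[congruence_key(verts, tol)].append(i)
    return dict(buckets)
```

Two hexahedra that differ by a rigid motion (or a small-enough perturbation) get the same key and land in the same bucket, so one compliance computation can serve the whole class.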

(Related: the edge lengths of a tetrahedron determine its volume but the areas of the faces don’t.)
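The first half of that parenthetical is the Cayley-Menger determinant in action: for a tetrahedron, 288 V^2 equals the determinant of the bordered matrix of pairwise squared distances. A quick sketch (the function name is mine):

```python
import numpy as np

def tet_volume_from_edges(d2):
    """Volume of a tetrahedron from its 4x4 matrix d2 of pairwise squared
    distances, via the Cayley-Menger determinant: 288 V^2 = det(CM)."""
    cm = np.ones((5, 5))
    cm[0, 0] = 0.0
    cm[1:, 1:] = d2  # border of ones around the squared-distance matrix
    return np.sqrt(np.linalg.det(cm) / 288.0)
```

For example, the tetrahedron with vertices at the origin and the three standard basis vectors has squared distances 1 (to the origin) and 2 (between basis vectors), and volume 1/6.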
