A post on Decision Science News about a problem of Uri Wilensky’s has been making the rounds:

Imagine a room full of 100 people with 100 dollars each. With every tick of the clock, every person with money gives a dollar to one randomly chosen other person. After some time progresses, how will the money be distributed?

People often expect the distribution to be close to uniform. But this isn’t right; the simulations in the post show clearly that inequality of wealth rapidly appears and then persists (though each individual person bobs up and down from rich to poor). What’s going on? Why would this utterly fair and random process generate winners and losers?
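The post’s simulations are easy to reproduce in a few lines; here is a minimal sketch (my own code and parameter choices, not the post’s):

```python
import random

# A minimal sketch of the game (my own code, not the post's simulations):
# N people with D dollars each; on every tick, everyone who has money
# simultaneously gives $1 to a uniformly chosen other person.
random.seed(0)
N, D, TICKS = 100, 100, 5000
wealth = [D] * N

for _ in range(TICKS):
    delta = [0] * N
    for i in range(N):
        if wealth[i] > 0:
            j = random.randrange(N - 1)   # pick a recipient other than i
            if j >= i:
                j += 1
            delta[i] -= 1
            delta[j] += 1
    for i in range(N):                    # apply all gifts simultaneously
        wealth[i] += delta[i]

mean = sum(wealth) / N                    # money is conserved, so this is D
spread = (sum((w - mean) ** 2 for w in wealth) / N) ** 0.5
# spread ends up far from 0: the wealth distribution is nowhere near uniform
```

Sorting `wealth` after the run shows a handful of players holding several hundred dollars while many hold almost nothing, even though every player follows the same rule.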

Here’s one way to think about it. The possible states of the system are the tuples of nonnegative integers (m_1, …, m_100) summing to 10,000; if you like, the lattice points inside a simplex. (From now on, let’s write N for 100, because who cares if it’s 100?)

The process is a random walk on a graph G, whose vertices are these states and where two vertices are connected if you can get from one to the other by taking a dollar from one person and giving it to another. We are asking: when you run the random walk for a long time, where are you on this graph? Well, we know what the stationary distribution for a random walk on an undirected graph is; it gives each vertex a probability proportional to its degree. On a regular graph, you get the uniform distribution.
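As a sanity check on the degree-proportional stationary distribution, here is a toy version of the state graph with two players and four dollars (my own toy example, not from the post); the claimed stationary measure satisfies pi = pi P exactly:

```python
from fractions import Fraction

# Toy case (my own example): 2 players, 4 dollars.  A state is m, the
# number of dollars held by player 1.  Neighbors: m-1 (if player 1 has
# money) and m+1 (if player 2 does); the walk moves to a uniform neighbor.
T = 4
states = range(T + 1)
neighbors = {m: [x for x in (m - 1, m + 1) if 0 <= x <= T] for m in states}

# Claimed stationary distribution: proportional to degree.
deg = {m: len(neighbors[m]) for m in states}
total = sum(deg.values())
pi = {m: Fraction(deg[m], total) for m in states}

# Verify pi P = pi exactly: probability flowing into each state m.
flow_in = {m: sum(pi[x] * Fraction(1, deg[x])
                  for x in states if m in neighbors[x]) for m in states}
assert flow_in == pi
```

Here the end states (0, 4) and (4, 0) have degree 1 and the interior states have degree 2, so the stationary distribution is (1, 2, 2, 2, 1)/8, exactly proportional to degree.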

Our state graph G isn’t regular, but it’s close: a state’s degree is (N-1) times the number of people who still have money, so the top degree N(N-1) belongs exactly to the states in which nobody’s out of money. The number of states is

$\binom{N^2+N-1}{N-1}$

and the states where nobody’s out of money are those where, after each person keeps a dollar, the remaining N^2 – N dollars are distributed freely; there are

$\binom{N^2-1}{N-1}$

of those. So the proportion of states where someone’s out of money is about

$1 - \binom{N^2-1}{N-1} \Big/ \binom{N^2+N-1}{N-1} \approx 1 - (1-1/N)^{N-1} \approx 1 - 1/e.$
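The binomial counts are easy to check numerically; this quick verification (mine, not the post’s) confirms the 1 - 1/e figure for N = 100:

```python
from math import comb, e

# Exact count of states with and without a broke player (my own check).
N = 100
total_states = comb(N * N + N - 1, N - 1)   # compositions of N^2 into N parts
nobody_broke = comb(N * N - 1, N - 1)       # everyone keeps at least $1
frac_broke = 1 - nobody_broke / total_states
# frac_broke comes out near 0.63, close to 1 - 1/e = 0.632...
```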

So, apart from those states where somebody’s broke, in the long run *every possible state is equally likely;* we are just as likely to see $9,901 in one person’s hands and everybody else with $1 as we are to see exact equidistribution again.

What is a random lattice point in this simplex like? Good question! An argument just like the one above shows that the probability that nobody goes below $c is on the order of e^{-c}, at least when c is small relative to N; in other words, it’s highly likely that somebody’s very nearly out of money.
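The same count with $c in place of $1 checks the e^{-c} estimate; a quick verification sketch (my own, not from the post): the states where everyone holds at least $c are counted by handing each person $c up front and distributing the remaining N^2 - cN dollars.

```python
from math import comb, exp

# Exact proportion of states where everyone has at least $c (my own check).
N = 100
total_states = comb(N * N + N - 1, N - 1)
ratios = {}
for c in range(1, 6):
    everyone_ge_c = comb(N * N - c * N + N - 1, N - 1)
    ratios[c] = everyone_ge_c / total_states
# each ratios[c] stays within about 10% of exp(-c) for these small c
```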

If X is the maximal amount of money held by any player, what’s the distribution of X? I didn’t immediately see how to figure this out. You might consider the continuous version, where you pick a point at random from the *real* simplex

$\{(x_1, \ldots, x_N) : x_i \geq 0, \; x_1 + \cdots + x_N = N^2\}.$

Equivalently: break a stick at N-1 randomly chosen points; what is the length of the longest of the N pieces? This is a well-studied problem; for a stick of total length N^2, the mean length of the longest piece is about N log N. So I guess I think maybe that’s the expected value of the net worth of the richest player?
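A quick stick-breaking simulation (my own sketch): the exact mean of the longest piece works out to N times the N-th harmonic number, which is N log N up to lower-order terms, and the simulated mean matches.

```python
import random

# Stick-breaking sketch (my own simulation): break a stick of length N^2
# at N-1 uniform points and record the longest of the N pieces.
random.seed(1)
N, TRIALS = 100, 2000
total_len = N * N

longest = []
for _ in range(TRIALS):
    cuts = sorted(random.uniform(0, total_len) for _ in range(N - 1))
    pts = [0.0] + cuts + [float(total_len)]
    longest.append(max(b - a for a, b in zip(pts, pts[1:])))

mean_longest = sum(longest) / TRIALS
# for N = 100 this lands near N * H_N ~ 519, versus N log N ~ 461
```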

But it’s not obvious to me whether you can safely approximate the finite problem by its continuous limit (which corresponds to keeping the number of players at N but shrinking the step size, so that each player gives each other player a cent, or a picocent, or whatever).

What happens if you give each of the N players just one dollar? Now the uniformity really breaks down, because it’s incredibly unlikely that nobody’s broke. The probability distribution on the set of (m_1, …, m_N) summing to N assigns each vector a probability proportional to the size of its support (i.e. the number of m_i that are nonzero). That must be a well-known distribution, right? What does the corresponding distribution on partitions of N look like?
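For a toy value of N, the support-weighted distribution can be tabulated exactly; here is a sketch with N = 4 (my own choice of size, small enough to enumerate):

```python
from itertools import product
from fractions import Fraction
from collections import defaultdict

# The $1-per-player case at toy size N = 4 (my own example): weight each
# state (m_1, ..., m_N) with sum N by its support size, then push the
# distribution down to partitions of N.
N = 4
weights = defaultdict(Fraction)
total = Fraction(0)
for m in product(range(N + 1), repeat=N):
    if sum(m) == N:
        w = Fraction(sum(1 for x in m if x > 0))   # size of the support
        weights[tuple(sorted(m, reverse=True))] += w
        total += w

dist = {part: w / total for part, w in weights.items()}
# e.g. the partition 2+1+1 gets 12 compositions * support 3 = 36
# out of a total weight of 80, i.e. probability 9/20
```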

**Update**: Kenny Easwaran points out that this is basically the same computation physicists do when they compute the Boltzmann distribution, which was new to me.

If this has some sort of relevance to what happens when people actually go shopping, I’m disturbed.

But why do you call this a fair process? If you have $2 to your name, why is it fair for the Powers That Be to still force you to give a dollar away on the next tick? Maybe the question is “Why would this utterly unfair process be considered fair?”.

In the large-N limit, in each round an individual person gives up a dollar if they have one, and then gains a number of dollars described by a Poisson distribution with expectation equal to the probability that any given person has at least one dollar. If p is the probability that a given person has no money, then the giving step represents a loss of p+p^2 units of variance (it’s equivalent to work with gaining 1 dollar if we have no dollars, where p units are lost relative to the old mean of 1, and p^2 more are lost when we shift to the new mean of 1+p), and the receiving step represents a gain of 1-p units of variance (the variance of the Poisson). Balancing the two in the steady state gives p+p^2 = 1-p, or p^2+2p-1 = 0, so p = sqrt(2)-1. So the long-run proportion of players who are out of money is not 1/e = .368…, but rather sqrt(2)-1 = .414…, slightly larger.
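This prediction is easy to test by simulation; here is a sketch of the mean-$1 game with my own parameter choices:

```python
import random

# Simulation sketch of the mean-$1 case (my own parameters): everyone
# starts with $1; each round, everyone with money simultaneously gives
# $1 to a uniformly chosen other player.
random.seed(2)
N, ROUNDS, BURN_IN = 2000, 600, 300
wealth = [1] * N

broke_fracs = []
for t in range(ROUNDS):
    delta = [0] * N
    for i in range(N):
        if wealth[i] > 0:
            j = random.randrange(N - 1)   # recipient other than i
            if j >= i:
                j += 1
            delta[i] -= 1
            delta[j] += 1
    for i in range(N):
        wealth[i] += delta[i]
    if t >= BURN_IN:                      # measure after equilibration
        broke_fracs.append(sum(1 for w in wealth if w == 0) / N)

avg_broke = sum(broke_fracs) / len(broke_fracs)
# avg_broke lands near sqrt(2) - 1 = 0.414...
```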

(Sorry, I should have mentioned that this is the case where everyone starts with one dollar, which you mention at the end.)

If p_n is the probability of having n dollars, then in the steady state p_0 = p = sqrt(2)-1 and, for each n,

p_n = sum_{i=0}^n p_{n+1-i} e^{-(1-p)} (1-p)^i / i! + p_0 e^{-(1-p)} (1-p)^n / n!

(the sum is over people who had n+1-i dollars, gave one away, and received i; the last term is over people who were broke, gave nothing, and received n). Using this we can inductively calculate p_{n+1} from p_0, …, p_n to find the steady-state distribution.
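The recursion is straightforward to iterate numerically; in this sketch (my transcription of the comment’s recursion, with p_0 = sqrt(2)-1) the computed distribution sums to 1 and has mean 1, a dollar per person, as it should:

```python
import math

# Iterating the steady-state recursion from the comment (my transcription).
p0 = math.sqrt(2) - 1            # probability of holding $0
lam = 1 - p0                     # mean of the Poisson number of dollars received

def pois(i):                     # P(receive exactly i dollars)
    return math.exp(-lam) * lam ** i / math.factorial(i)

p = [p0]
for n in range(80):
    # solve the balance equation for p_{n+1}, the i = 0 term of the sum
    acc = sum(p[n + 1 - i] * pois(i) for i in range(1, n + 1))
    p.append((p[n] - acc - p0 * pois(n)) / pois(0))

total = sum(p)                               # should be 1: it's a distribution
mean = sum(n * q for n, q in enumerate(p))   # should be 1: a dollar per person
```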

I don’t think this is any kind of well-known distribution, but I could be wrong – there could be a more direct way to extract this from a Poisson distribution.

[…] will keep increasing until one person has everything. Jordan Ellenberg shares the insight over on his blog that this process is a random walk on a network where the nodes are distributions and two nodes are […]

[…] When random people give money to random other people | Quomodocumque https://quomodocumque.wordpress.com/2017/06/27/when-random-people-give-money-to-random-other-people/ […]



@Rob – it’s fair in the sense that everyone is treated and acting equally. Fair in the sense of a “fair coin”. It’s a mathematical thought experiment; there’s no government or law enforcement making this happen. It says nothing about society and this isn’t a libertarian’s nightmare.

@Jordan – thanks for the nice write-up.


Shouldn’t two vertices of G be joined by an edge if you can get from one state to the next by all N people giving a dollar, instead of by just one random person giving a dollar?

I can imagine some sort of theorem saying that both problems have the same stationary distribution though.




Related to the Pareto distribution?

[…] I wanted to try it on my own, because I did not understand most of the posts. My first thought is that the problem is ill-posed. First of all, what is this “giving dollars”? Is it a fixed amount or a random one? Let us start by assuming that it is fixed. Now, if you know a little bit about gambling and ruin, you guess that it’s very likely that someone will go bankrupt (at least over a very, very long run)… what should we do with that person? Actually, those points were clarified in Jordan’s post […]