Malcolm Gladwell wrote:
It doesn’t take much imagination to see how risk homeostasis applies to NASA and the space shuttle. In one frequently quoted phrase, Richard Feynman, the Nobel Prize-winning physicist who served on the Challenger commission, said that at NASA decision-making was “a kind of Russian roulette.” When the O-rings began to have problems and nothing happened, the agency began to believe that “the risk is no longer so high for the next flights,” Feynman said, and that “we can lower our standards a little bit because we got away with it last time.” But fixing the O-rings doesn’t mean that this kind of risk-taking stops. There are six whole volumes of shuttle components that are deemed by NASA to be as risky as O-rings. It is entirely possible that better O-rings just give NASA the confidence to play Russian roulette with something else.
If this is really what Feynman said, wasn’t he wrong? In Russian roulette, you know there’s one bullet in the gun. The chance of a catastrophe is just one in six the first time you put the gun to your head; but if you survive the first try, you know the round is in one of the remaining five chambers and the chance of death next time you pull the trigger climbs to 20%. The longer you play, the more likely disaster becomes.
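In conditional-probability terms (a quick Python sketch of my own, nothing deeper than the sentence above):

```python
from fractions import Fraction

# Classic Russian roulette: one bullet, six chambers, no re-spin.
p_die_first = Fraction(1, 6)
p_survive_first = 1 - p_die_first

# Given that you survived the first pull, the bullet is in one of the
# 5 remaining chambers: P(die on 2nd | survived 1st) = (1/6) / (5/6).
p_die_second = p_die_first / p_survive_first
print(p_die_second)  # 1/5, i.e. 20%
```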
But what if you don’t know how many chambers are loaded? Suppose you play “Bayes Roulette,” in which the number of bullets is equally likely to be anywhere from 1 to 6. Then the chance of survival on the first try is (5/6) if there’s 1 bullet in the cylinder, (4/6) if 2 bullets, and so on, for a total of
(1/6)(5/6) + (1/6)(4/6) + … + (1/6)(0/6) = 5/12
which is about 41%. Pretty bad. But let’s say you pull the trigger once and live. Now by Bayes’ theorem, the chance that there’s 1 bullet in the cylinder is
Pr(1 bullet in cylinder and I survived the first try) / Pr(I survived the first try)
= (5/36)/(5/12) = 1/3.
Similarly, the chance that there are 5 bullets in the cylinder is
(1/36)/(5/12) = 1/15.
And the chance that there were 6 bullets in the cylinder is 0, because if there had been, well, you would be a former Bayesian.
All in all, assuming you don’t re-spin the cylinder (so with 1 bullet it now sits in one of the 5 remaining chambers and you survive with probability 4/5, with 2 bullets 3/5, and so on), your chance of surviving the next shot is
(5/15)*(4/5) + (4/15)*(3/5) + (3/15)*(2/5) + (2/15)*(1/5) + (1/15)*0 = 8/15.
In other words, once you survive the first try, you’re more likely, not less, to survive the next one, because you’ve increased the odds that the gun is mostly empty.
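Here’s a minimal Python sketch of the whole computation above, using exact fractions, in case you want to check the arithmetic:

```python
from fractions import Fraction

# Uniform prior: the number of bullets k is equally likely to be 1 through 6.
prior = {k: Fraction(1, 6) for k in range(1, 7)}

# Chance of surviving the first pull: sum over k of P(k bullets) * (6 - k)/6.
p_survive_first = sum(p * Fraction(6 - k, 6) for k, p in prior.items())
print(p_survive_first)  # 5/12, about 41%

# Bayes' theorem: P(k bullets | survived) = P(k) * P(survive | k) / P(survived).
posterior = {k: p * Fraction(6 - k, 6) / p_survive_first for k, p in prior.items()}
print(posterior[1], posterior[5], posterior[6])  # 1/3 1/15 0

# No re-spin: the k bullets sit among the 5 remaining chambers, so the
# chance of surviving the next pull given k bullets is (5 - k)/5.
p_survive_next = sum(p * Fraction(max(5 - k, 0), 5) for k, p in posterior.items())
print(p_survive_next)  # 8/15, up from 5/12
```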
Or suppose the gun is either fully loaded or empty, but you don’t know which. The first time you pull the trigger, you have no idea what your odds of death are. But if you get a second pull at all, the gun must be empty, and you know you’re completely safe.
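Under, say, a 50/50 prior between the two possibilities (my choice of prior; the point about ambiguity stands regardless), the update is one line:

```python
from fractions import Fraction

# Gun is either empty (0 bullets) or full (6 bullets), 50/50 prior.
posterior = {0: Fraction(1, 2), 6: Fraction(1, 2)}

evidence = sum(p * Fraction(6 - k, 6) for k, p in posterior.items())
print(evidence)  # 1/2: under this prior the first pull is a coin flip

# Survive one pull and the full-gun hypothesis is dead (so to speak).
posterior = {k: p * Fraction(6 - k, 6) / evidence for k, p in posterior.items()}
print(posterior)  # {0: Fraction(1, 1), 6: Fraction(0, 1)}
```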
I think the space shuttle is a lot more like Bayes Roulette than Russian Roulette. You don’t know how likely an O-ring failure is to cause a crash, just as you don’t know how many bullets are in the gun. And if the O-rings fail now and then, with no adverse consequences, you are in principle perfectly justified in worrying less about O-rings. If you pull the trigger four times and no bullet comes out, you ought to be getting more confident the gun is mostly empty.
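To put numbers on that last intuition, here’s a sketch of a slightly different setup than my example above: the prior is uniform on 0 through 6 bullets, so an empty gun is allowed, and the cylinder is re-spun before every pull. The posterior probability that the gun is empty climbs with every pull you survive:

```python
from fractions import Fraction

# Prior now includes an empty gun: k bullets, uniform on 0..6,
# and the cylinder is re-spun before each pull, so every pull
# survives with probability (6 - k)/6 given k bullets.
posterior = {k: Fraction(1, 7) for k in range(7)}

for pull in range(1, 5):  # survive four pulls in a row
    evidence = sum(p * Fraction(6 - k, 6) for k, p in posterior.items())
    posterior = {k: p * Fraction(6 - k, 6) / evidence
                 for k, p in posterior.items()}
    print(pull, float(posterior[0]))  # P(gun is empty) after each survival
```

After four survived pulls, the posterior probability of an empty gun is already about 57%, up from 1/7 at the start.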
Taleb makes a similar point in his “suspicious coin” story.
Despite the presence of subjective factors in Bayesian reasoning, I find that this type of logic is less prone to biases caused by a flawed model than most other types of reasoning; it really should be promoted more often than it actually is.
One place Feynman made this comment is in his appendix to the Challenger disaster report. The full quote is below, but his point is that one should be cautious in applying Bayesian roulette when there are clear signs that the equipment is not working as designed and you don’t understand why, or why the equipment continues to work despite these deviations. In your context, perhaps it’s like you’re playing Russian roulette with a gun that seems like it might be heavier than it would be if it were unloaded…
The phenomenon of accepting for flight, seals that had shown erosion and blow-by in previous flights, is very clear. The Challenger flight is an excellent example. There are several references to flights that had gone before. The acceptance and success of these flights is taken as evidence of safety. But erosion and blow-by are not what the design expected. They are warnings that something is wrong. The equipment is not operating as expected, and therefore there is a danger that it can operate with even wider deviations in this unexpected and not thoroughly understood way. The fact that this danger did not lead to a catastrophe before is no guarantee that it will not the next time, unless it is completely understood. When playing Russian roulette the fact that the first shot got off safely is little comfort for the next. The origin and consequences of the erosion and blow-by were not understood. They did not occur equally on all flights and all joints; sometimes more, and sometimes less. Why not sometime, when whatever conditions determined it were right, still more leading to catastrophe?
Now I’m not exactly an expert on this, but my understanding of Russian roulette is that after each shot, the next player opens the cylinder, verifies that there is exactly one bullet in it, spins the cylinder, and closes it without looking. So the events are independent…
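A quick simulation of that variant (my own sketch; I’ve put the single bullet in chamber 0, arbitrarily) bears this out:

```python
import random

random.seed(0)
trials = 1_000_000
survived_first = survived_both = 0
for _ in range(trials):
    # Re-spin before each pull, so each pull lands on a uniform chamber.
    first, second = random.randrange(6), random.randrange(6)
    if first != 0:            # chamber 0 holds the bullet
        survived_first += 1
        if second != 0:
            survived_both += 1

# Survival on the second pull, conditioned on surviving the first:
print(survived_both / survived_first)  # ~0.833, i.e. 5/6
```

The conditional survival rate on the second pull matches the unconditional 5/6, which is the independence in action.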
Your way sounds crazy — are you trying to get someone killed?!?
If Wikipedia can be trusted, both variants are played.
Re Nathan’s comment: if you can estimate the number of loaded chambers by weight, that affects your prior distribution. Note that in my example, the gun was _definitely_ loaded — the question is “how loaded is it?”
The article Russian Roulette and Risk-Taking Behavior: A Medical Examiner Study (http://www.scribd.com/doc/15829912/Russian-Roulette-and-RiskTaking-Behavior-A-Medical-Examiner-Study) is fascinating reading and has some data. It indicates that participants are often high and suicidal, and sometimes load multiple bullets (as many as 5) into the cylinder.