Nuclear Mangos

This blog is intended to provide reliable technical analysis of nuclear issues involving non-state actors and nascent nuclear states. Some technical issues have important policy implications that citizens in a democracy should be able to make informed decisions about. The motivation for the blog has been the incredible amount of lies & hyperbole surrounding the Iran situation of early 2006. The blog title is there to remind you constantly of the quality of the minds in charge of our nuclear security today.

Location: MA

Until recently I was a physics professor at Harvard, where I taught the nuclear and particle physics course, among others. I've recently left that position to work as an R&D physicist in security applications. I have never done classified weapons work.

Sunday, April 23, 2006

Nattering Nabobs of Negativity (RD Part V)

I get a fairly small number of different responses when I talk about my concerns over Iran. The most common is, “This is insane.” The next most common is, “They won’t actually attack Iran”; and the third is, “Even if they attack Iran, not much will happen afterward.” These latter two are examples of using the most likely outcome to judge the expected value of a payoff. Even the people saying them concede that they could attack Iran, with some small probability, or that something bad could happen after an attack, again with some small probability. However, they discount the outcome because they judge its probability to be relatively small.

Again, in most everyday situations this is a perfectly acceptable way to make decisions. However, the expected value of a decision is not a robust quantity. That is, the expected value is very sensitive to outliers: possible outcomes with extremely high or low payoffs. These can change the rational expectation dramatically.

When an action has some outcomes with extremely large payoffs or extremely large losses, these outcomes can strongly influence the rational expectation of the action, even when their probability is very low.
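As a toy illustration (the numbers below are invented, and not a model of any real scenario), a single low-probability, very negative outcome can flip the sign of an expected value even though it barely changes the most likely outcome:

```python
# Toy numbers only: how one rare, catastrophic outcome shifts an expected value.

def expected_value(outcomes):
    """Sum of payoff * probability over (payoff, probability) pairs."""
    return sum(payoff * prob for payoff, prob in outcomes)

# Without the tail: modest, mostly positive payoffs.
common_cases = [(10, 0.60), (5, 0.30), (-5, 0.10)]
print(expected_value(common_cases))  # 7.0

# With the tail: a 1% chance of a huge loss; the common cases rescaled to 99%.
with_tail = [(p, q * 0.99) for p, q in common_cases] + [(-2000, 0.01)]
print(round(expected_value(with_tail), 2))  # -13.07
```

The most likely outcome is still a payoff of 10 in both cases; only the rational expectation moves, and it moves a lot.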

It hardly needs explication that nuclear strikes on Iran have the potential to lead to gigantic losses. Even if their probability is judged low (say, 1%), they are still likely the most important outcomes to consider in the rational-expectation framework.

The main graphic shows two decisions to be made, one on the left and one on the right. Each decision has two possible options (top and bottom). The yellow curve shows the probability of each outcome under that option; the blue dotted line shows the rational expectation.

The decision on the left has a fairly standard menu of options. If you manufacture widgets, you’ll make a fairly predictable profit, with almost no possibility of loss. If you manufacture super-widgets, you will probably make a larger profit, but there is some probability of a loss. Super-widgets are the high-risk, high-gain scenario. In a pure rational decision framework (for those who care: this assumes risk neutrality), the super-widgets are the better option, as the rational expectation (blue line) is further to the gain side. As noted earlier, in these cases the rational expectation coincides with the most likely outcome.
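With made-up numbers (not taken from the graphic), the risk-neutral choice is just a comparison of expected values:

```python
# Hypothetical payoffs, chosen so the arithmetic is exact.
widgets = [(5, 0.75), (0, 0.25)]           # predictable, almost no loss
super_widgets = [(20, 0.75), (-10, 0.25)]  # high risk, high gain

def ev(option):
    """Expected payoff: sum of payoff * probability."""
    return sum(payoff * prob for payoff, prob in option)

print(ev(widgets))        # 3.75
print(ev(super_widgets))  # 12.5 -> the risk-neutral choice is super-widgets
```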

The decision on the right has a fairly nightmarish menu of options. (I do not mean to imply they are the only ones in Iran, or even that the payoff curves reflect reality in any important way. They are merely illustrative of the problem.) On top, there is an option with essentially no probability of gain, with a fairly predictable loss.

On the bottom, there is an option with “long tails”: some probability of apocalyptically bad outcomes. However, it does have some non-zero probability of positive outcomes. The rational expectation for the upper option is shown by the blue line, and it is negative. It coincides, again, with the most likely outcome. The most likely outcome for the bottom option is shown by the blue dashed line; the rational expectation, however, is shown by the red line. (It is shown in more detail in the second graphic.) It sits at a much more negative payoff than the most likely outcome, which is itself still pretty negative. Even though the most likely outcome is just a “cost of doing business”, the rational expectation is “disaster”. And exactly where the rational expectation lies depends very strongly on just how negative the negative outcomes are.
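A numerical sketch of such a bottom option, again with invented numbers: the most likely outcome is a tolerable loss, but the long tail drags the expectation far below it, and the expectation swings wildly with how bad the tail is assumed to be:

```python
# Invented payoff distribution with a long negative tail.
outcomes = [
    (-10, 0.70),    # most likely: a "cost of doing business" loss
    (20, 0.10),     # some non-zero probability of gain
    (-100, 0.15),   # bad outcome
    (-5000, 0.05),  # apocalyptically bad tail
]

most_likely = max(outcomes, key=lambda o: o[1])[0]
expectation = sum(payoff * prob for payoff, prob in outcomes)
print(most_likely)            # -10
print(round(expectation, 1))  # -270.0

# The expectation depends strongly on just how bad the tail is:
worse = [(-10000 if payoff == -5000 else payoff, prob) for payoff, prob in outcomes]
print(round(sum(p * q for p, q in worse), 1))  # -520.0
```

Doubling the severity of the 5%-probability tail nearly doubles the magnitude of the expectation, while leaving the most likely outcome untouched.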

It is worth noting that from this nightmarish menu, you might choose a different option if you work in a framework other than the rational expectation. If you think it important to maximize the probability of a gain, perhaps the second option is better. (I am not arguing you should think that! Just that if you did, you might choose the second option.)

When an option has a nonzero probability of large losses, the rational expectation can be much lower than the most likely outcome. You had better, in this case, be ready to return fire when William Safire turns his rhetorical guns on you. Because rationally, the most important scenarios to consider when making the decision are precisely those negative outcomes that are least likely to occur.

