20/20 Bias

By Jane Galt

Mark Thoma is telling the hawks they’ve squandered their credibility for a mess of pottage:

Sorry Jonathan, maybe we don’t drum you out of the profession — there aren’t simply two extremes where we listen fully or don’t listen at all — but we are going to pay less attention to what you have to say. That’s how it goes when you are wrong about important things. And unlike the parade of polar extremes presented to us in your argument, there are people who have been generally correct all along and I prefer to give more weight to their views than to those who have been so spectacularly wrong.

Now, of course, I supported the war, so I can be expected to say something like what I am about to say. My only excuse is that I have been thinking hard about this, trying to pick out what went wrong, and I think that I am willing to admit where I was wrong. I was wrong to impute too much confidence to my ability to interpret Saddam Hussein’s actions; I was wrong not to foresee how humiliating Iraqis would find being liberated by Westerners–the same Westerners who have spent several hundred years tramping around their country, breaking things for their own reasons and with little regard for the Iraqi people. I was wrong to impute excessive competence to the government–and not just the Bush administration, but any government occupation.

This has not convinced me of the brilliance of the doves, because precisely none of the ones that I argued with predicted that things would go wrong in the way they did. If you get the right result, with the wrong mechanism, do you get credit for being right, or being lucky? In some way, they got it just as wrong as I did: nothing that they predicted came to pass. It’s just that, independently, things they didn’t predict made the invasion not work. If I say we shouldn’t go to dinner downtown because we’re going to be robbed, and we don’t get robbed but we do get food poisoning, was I “right”? Only in some trivial sense. Food poisoning and robbery are completely unrelated, so my belief that we would regret going to dinner was validated only by random chance. Yet, the incident will probably increase my confidence in my prediction abilities, even though my prediction was 100% wrong.

I’m trying to assess my decisionmaking process without developing a massive case of hindsight bias. Hindsight bias is a familiar phenomenon to most of us–that’s why it has its own proverb–but most people don’t realise just how bad it is. The CIA explains:

Analysts interested in improving their own performance need to evaluate their past estimates in the light of subsequent developments. To do this, analysts must either remember (or be able to refer to) their past estimates or must reconstruct their past estimates on the basis of what they remember having known about the situation at the time the estimates were made. The effectiveness of the evaluation process, and of the learning process to which it gives impetus, depends in part upon the accuracy of these remembered or reconstructed estimates.

Experimental evidence suggests a systematic tendency toward faulty memory of past estimates. That is, when events occur, people tend to overestimate the extent to which they had previously expected them to occur. And conversely, when events do not occur, people tend to underestimate the probability they had previously assigned to their occurrence. In short, events generally seem less surprising than they should on the basis of past estimates. This experimental evidence accords with analysts’ intuitive experience. Analysts rarely appear–or allow themselves to appear–very surprised by the course of events they are following.

In experiments to test the bias in memory of past estimates, 119 subjects were asked to estimate the probability that a number of events would or would not occur during President Nixon’s trips to Peking and Moscow in 1972. Fifteen possible outcomes were identified for each trip, and each subject assigned a probability to each of these outcomes. The outcomes were selected to cover the range of possible developments and to elicit a wide range of probability values.

At varying time periods after the trips, the same subjects were asked to remember or reconstruct their own predictions as accurately as possible. (No mention was made of the memory task at the time of the original prediction.) Then the subjects were asked to indicate whether they thought each event had or had not occurred during these trips.

When three to six months were allowed to elapse between the subjects’ estimates and their recollection of these estimates, 84 percent of the subjects exhibited the bias when dealing with events they believed actually did happen. That is, the probabilities they remembered having estimated were higher than their actual estimates of events they believed actually did occur. Similarly, for events they believed did not occur, the probabilities they remembered having estimated were lower than their actual estimates, although here the bias was not as great. For both kinds of events, the bias was more pronounced after three to six months had elapsed than when subjects were asked to recall estimates they had given only two weeks earlier.

In summary, knowledge of the outcomes somehow affected most test subjects’ memory of their previous estimates of these outcomes, and the more time that was allowed for memories to fade, the greater the effect of the bias. The developments during the President’s trips were perceived as less surprising than they would have been if actual estimates were compared with actual outcomes. For the 84 percent of subjects who showed the anticipated bias, their retrospective evaluation of their estimative performance was clearly more favorable than warranted by the facts.
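The mechanics of that measurement can be sketched in a few lines of Python: simulate subjects whose recalled probability estimates drift toward the outcomes they later learned, then count how many show the bias the study describes. The drift size and noise level here are illustrative assumptions, not parameters reported in the experiment.

```python
import random

random.seed(0)

def remembered(actual, occurred, shift=0.15):
    """Hypothetical recall model: memory drifts toward the known outcome.

    For events that occurred, the remembered probability is inflated;
    for events that did not, it is deflated. The shift and noise sizes
    are illustrative assumptions, not figures from the study.
    """
    drift = shift if occurred else -shift
    return min(1.0, max(0.0, actual + drift + random.gauss(0, 0.05)))

# Simulate 119 subjects each estimating 15 events, matching the setup
# described in the CIA excerpt.
subjects = 119
biased = 0
for _ in range(subjects):
    # (original probability estimate, whether the event later occurred)
    events = [(random.random(), random.random() < 0.5) for _ in range(15)]
    # A subject "exhibits the bias" if, on average, recalled estimates for
    # occurred events exceed the estimates the subject originally gave.
    occurred = [(p, remembered(p, True)) for p, happened in events if happened]
    if occurred and sum(r - p for p, r in occurred) / len(occurred) > 0:
        biased += 1

print(f"{100 * biased / subjects:.0f}% of simulated subjects show hindsight bias")
```

Under these assumptions nearly every simulated subject exhibits the bias; the point of the sketch is only that a modest, systematic drift in recall is enough to reproduce the 84-percent figure the study reports.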

Many of the doves seem to be reconstructing their memory of why they objected to the war, crediting themselves with having predicted that the invasion would fail in this way. Many hawks are also reconstructing their memories to make themselves less hawkish. Fortunately or unfortunately for me, I wrote my predictions down, so I know that I was an unabashed hawk, 100% convinced that Saddam had WMD.

The lesson that I can unequivocally take out of this is: do not be so confident in your ability to read other people and situations. Saddam was behaving exactly as I would have behaved if I had WMD, so I concluded that he had them. I will never be so confident again.

At the same time, though, in a similar situation this shouldn’t necessarily make me listen to the hawks next time. North Korea was behaving exactly like a country that had WMD, and it turned out that this was because they had them. What the doves would like to see the hawks do–“I was wrong, wrong, wrong, wrong, wrong about everything, I am a stupid idiot, you are a brilliant figure with god-like omniscience”–is no better a guide to future decisionmaking than ignoring the fact that you were seriously wrong about the Iraq invasion. They are both ways of being completely stupid, not that this has stopped anyone.

When I look back at the decision I made, and try to imagine making it without what I know now–that Saddam didn’t have WMD–would I decide differently? I’m not sure. I don’t see any way that I could have known, without actually checking, that he didn’t have at least an advanced programme. And even with the chaos now, had we found an advanced nuclear programme, most of the doves would be finding it much harder to argue that the invasion was a disastrous mistake. Perhaps even if he had had them we should have left him alone, but that’s a difficult argument. And given the number of Democrats, including President Clinton, who clearly believed that we would find an advanced weapons programme, I have to conclude that without benefit of hindsight, the information pointed to at least a 50% chance that he had them.

As I see it, doves have, in effect, benefitted from winning a random game. Not that the result was random–obviously, there was only one true state of the world. But at the time of making the decision, the game was random to the observer, with no way to know the true state until you open the box and poke the cat. Having won a random game, they are now crediting themselves with brilliant foresight. And yet, if the hawks had won the game, they would be preening themselves on their analytical ability, and demanding that the doves prostrate themselves in an extensive grovel.

That doesn’t mean that my decisionmaking wasn’t faulty. It was, in all sorts of ways, and I am trying to learn from them with proper humility. But I think the doves are crediting themselves with way too much analytical brilliance, which is fine to a point, but not so very fine that I am willing to turn over my decisionmaking to their allegedly more capable hands. World War II, after all, came in part out of learning lessons from World War I that weren’t actually there. And the sight of doves saying, in effect, “I don’t have to listen to you any more” does not make me sanguine that they are doing much better.

This content was used with the permission of Asymmetrical Information.
