Mistakes Were Made (But Not By Me) - Book Review

One of the best things I came across this past week was this terrific review by Morgan Housel, in which he shared insights from the book “Mistakes Were Made (But Not By Me)” by Elliot Aronson and Carol Tavris. Several members have recommended this book to me, so I was very interested to read his review.
According to Mr. Housel, these are the six most important things all of us should learn from this book, many of which are especially relevant to investors and traders alike:

1. Everyone wants to be right and hates admitting the possibility of being wrong.
As fallible human beings, all of us share the impulse to justify ourselves and avoid taking responsibility for any actions that turn out to be harmful, immoral, or stupid. Most of us will never be in a position to make decisions affecting the lives and deaths of millions of people, but whether the consequences of our mistakes are trivial or tragic, on a small scale or a national canvas, most of us find it difficult, if not impossible, to say, “I was wrong; I made a terrible mistake.”
The higher the stakes — emotional, financial, moral — the greater the difficulty. It goes further than that: Most people, when directly confronted by evidence that they are wrong, do not change their point of view or course of action but justify it even more tenaciously. Even irrefutable evidence is rarely enough to pierce the mental armor of self-justification.

2. Your brain is designed to shut out conflicting information.
In a study of people who were being monitored by magnetic resonance imaging (MRI) while they were trying to process dissonant or consonant information about George Bush or John Kerry, Drew Westen and his colleagues found that the reasoning areas of the brain virtually shut down when participants were confronted with dissonant information, and the emotion circuits of the brain lit up happily when consonance was restored. These mechanisms provide a neurological basis for the observation that once our minds are made up, it is hard to change them.

3. The higher the stakes, the harder it is to think clearly.
The racetrack is an ideal place to study irrevocability, because once you’ve placed your bet, you can’t go back and tell the nice man behind the window you’ve changed your mind. In this study, the researchers simply intercepted people who were standing in line to place two-dollar bets and other people who had just left the window. The investigators asked everyone how certain they were that their horses would win. The bettors who had placed their bets were far more certain about their choice than were the folks waiting in line. But, of course, nothing had changed except the finality of placing the bet. People become more certain they are right about something they just did if they can’t undo it.

4. Experience doesn’t help you think better. It might actually hurt.
Hundreds of studies have shown that predictions based on an expert’s “personal experience” or “years of training” are rarely better than chance, in contrast to predictions based on actuarial data. But when experts are wrong, the centerpiece of their professional identity is threatened. Therefore, as dissonance theory would predict, the more self-confident and famous they are, the less likely they will be to admit mistakes. And that is just what Philip Tetlock found. Experts reduce the dissonance caused by their failed forecasts by coming up with explanations of why they would have been right “if only” — if only that improbable calamity had not intervened; if only the timing of events had been different; if only blah blah blah.

5. You are twice as biased as you think you are.
The brain is designed with blind spots, optical and psychological, and one of its cleverest tricks is to confer on us the comforting delusion that we, personally, do not have any. In a sense, dissonance theory is a theory of blind spots — of how and why people unintentionally blind themselves so that they fail to notice vital events and information that might make them question their behavior or their convictions. Along with the confirmation bias, the brain comes packaged with other self-serving habits that allow us to justify our own perceptions and beliefs as being accurate, realistic, and unbiased.
Social psychologist Lee Ross calls this phenomenon “naive realism,” the inescapable conviction that we perceive objects and events clearly, “as they really are.” We assume that other reasonable people see things the same way we do. If they disagree with us, they obviously aren’t seeing clearly. Naive realism creates a logical labyrinth because it presupposes two things: One, people who are open-minded and fair ought to agree with a reasonable opinion. And two, any opinion I hold must be reasonable; if it weren’t, I wouldn’t hold it. Therefore, if I can just get my opponents to sit down here and listen to me, so I can tell them how things really are, they will agree with me. And if they don’t, it must be because they are biased.

6. How we view others’ opinions is totally skewed.
Ross and his colleagues have found that we believe our own judgments are less biased and more independent than those of others partly because we rely on introspection to tell us what we are thinking and feeling, but we have no way of knowing what others are really thinking. And when we introspect, looking into our souls and hearts, the need to avoid dissonance assures us that we have only the best and most honorable of motives. We take our own involvement in an issue as a source of accuracy and enlightenment — “I’ve felt strongly about gun control for years; therefore, I know what I’m talking about” — but we regard such personal feelings on the part of others who hold different views as a source of bias — “She can’t possibly be impartial about gun control because she’s felt strongly about it for years.”
