
Friday 12 November 2010

Justify this! A review of "Mistakes were Made (but not by me)"

This is the second book I have read that looks at the nature of human error. The first, "Being Wrong: Adventures in the Margin of Error" by Kathryn Schulz, was a half-philosophical work that examined the whole scope of wrongness and our relationship with it. Although scientifically sound, Schulz's excellent work was far more concerned with making peace with mistakes. Similar aspects are also present in Carol Tavris and Elliot Aronson's joint work, "Mistakes Were Made (but not by me)", but this is a far less sympathetic book, driven by the passion of two straightforward and eminent psychologists who focus specifically on the human reaction to personal error. Tavris was given a special award for her contribution to the empirical scientific sceptical movement, and "Mistakes Were Made (but not by me)" seems destined to be the sceptic's go-to work on the phenomenon of confirmation bias.
The book argues that self-justification through cognitive dissonance (the uncomfortable feeling of holding two conflicting views at the same time) is responsible for an enormous number of social problems. This all hinges on our inability to properly reconcile ourselves with our mistakes. It would seem that the expression "learn from your mistakes" is not just consciously ignored; it actually goes against our evolutionary hardwiring. Fortunately, Tavris and Aronson explain, we can override our impulse for self-justification and abdication of responsibility, and research clearly demonstrates that certain cultures have done a far better job of this than others. However, it would seem that no professional field has escaped self-justification, and the results have proven disastrous.
The introductory chapter, "Knaves, Fools, Villains, and Hypocrites: How Do They Live With Themselves?", begins with George W. Bush as the perfect example of how powerful and unwavering self-justification can be. Even when his closest advisors produced concrete evidence that contradicted his beliefs, he would not be swayed. It then goes on to show other examples of blatant hypocrites who did not seem to show a shred of remorse for the obvious harm their mistakes had caused. The authors go on to explain how this seeming lack of conscience, or inability to face the "error of their ways", not only prevents people from putting matters right, it often encourages the individual to do worse at an accelerated rate.
Chapter one defines the main problem beneath all the topics and examples to be discussed: cognitive dissonance. Tavris and Aronson describe it in the chapter title as "The Engine of Self-Justification". This introduces something they refer to as "The Pyramid of Choice". The pyramid is a metaphor used throughout the book to explain how cognitive dissonance operates. If two people with similar moderate views on a certain subject stand at the apex of the pyramid together and then make different choices relating to that same subject, their views will separate dramatically. Self-justification for the different decisions they made sends each down a slope of the pyramid until they arrive at the bottom as polar opposites.
Chapter two is an examination of blind spots. The theory of blind spots is pretty much the instinctive base level of confirmation bias. We self-justify our own beliefs and believe others to be unreasonable and closed-minded because we cannot read minds, only introspect. Our minds then steer away from the discomfort of cognitive dissonance, creating blind spots that prevent us from seeing certain fallibilities in our own argument. It is, as the authors explain in a sub-chapter, "The Gift that Keeps on Giving". The more we look for self-confirming evidence the more we find, and the less we see of any opposing argument.
The incredible fallibility of memory is a theme of one of Schulz's chapters, and it is the part of our error-making process that always seems most confusing. It is a crucial area that needs regular exploring and explaining, and Tavris and Aronson do an equally good job of handling it in their book. It is hardly surprising that we get so defensive over our memories. Both Schulz and the authors here discuss the peculiar language we use to describe them, as if they were tangible recording devices. The truth is much different, as lengthy, repeated experiments involving huge numbers of people have proven time and again. Our memories get worse over time, and biases play an enormous role in justifying what has happened. However, this chapter goes further than just explaining how an event is remembered favourably. With the top scientific sceptic Michael Shermer, no less, as a guest star, we are treated to the memories of those who feel they were abducted by aliens. Shermer, of course, being armed with critical thinking and the knowledge of how the brain works under certain conditions, quickly came to terms with the false memory he had created regarding an extraterrestrial close encounter. But others have really bought into these memories, and this chapter helps explain exactly why such memories appear so real.
This nicely leads into the next chapter, where we see false memories being prompted by others with a confirmation bias. The most infamous recent example of this occurred with the Satanic Ritual Abuse and Multiple Personality Disorder mass hysteria of the 1980s and '90s. Therapists inspired by outmoded Freudian concepts regarding the repression of memories began reporting case after case of patients who had suffered abuse at the hands of a parent or parents. The idea spread like wildfire, and numerous innocent people were convicted of paedophilia and other crimes against their children. It was a very dark period for the developed world, a return to the type of witch-hunting hysteria not seen since the days of McCarthyism and the Red Scare. Tavris and Aronson delve deep not so much into the minds of the deluded patients as into the psychoanalysts who refused to admit they were wrong despite the overwhelming scientific evidence. They trace the whole repressed-memory theory back to the attitude of Freud, who gave plenty of quotable examples of his own confirmation bias. Freud, like his disciples in the 1980s and '90s, saw whatever reaction a patient gave as confirmation of his original diagnosis. So if the therapist suspected a patient had suffered abuse, this would be proven whether the patient agreed or disagreed: the former was simple confirmation and the latter an example of repressing painful memories. Freud refused to acknowledge any scientific scrutiny or criticism of his work.
If the last chapter wasn't scary enough in showing how self-justification can lead to wrongful prosecution, then the next will chill you to the bone as the book looks at cognitive dissonance in the justice system. This chapter also has a fascinating parallel in Schulz's work, as both discuss the way DNA evidence has been used to overturn erroneous convictions. It links up well with the previous chapter in the way it examines a deeply flawed interviewing manual, considered an interrogator's bible, that is used by many police departments. Like the therapists, the interviewing officer makes up his mind at the beginning of the interview, and everything the person says confirms his decision. The chapter explains how a system of lying and wearing down suspects was used. From the very beginning the manual asserts that such procedures will not work on innocent people. However, far from proving that only the guilty would admit their guilt under these conditions, Tavris and Aronson show how these methods actually help create false memories in suspects. The chapter doesn't stop with the police either; it looks at prosecutors - perhaps among the most susceptible victims of self-justification, given their job - and at judges who pass hasty sentences or steer juries in the wrong direction.
The next chapter moves on to love and relationships, looking at the way self-justification prevents us from accepting each other's flaws and our own mistakes. Many of the problems addressed in the previous chapters come into play, including the reshaping of memories and blind spots. Self-justification, the authors argue, murders marriages because it blinds each partner to the other's good qualities. Having covered love, the next chapter somewhat appropriately moves on to war, feuds and rifts. Considering the serious consequences of these events, this is perhaps the book's most dramatic and distilled case for self-justification. Longstanding disputes, like troubled marriages, thrive on both parties justifying their own actions and vilifying their opponent's position. This chapter goes deeper into the nature of war, especially the treatment of prisoners, and the justification that leads soldiers to humiliate and torture the captives in their care.
The final chapter offers psychological advice on overcoming cognitive dissonance and explains the benefits of acknowledging personal mistakes. It is here that we see how certain cultures are ahead of others in the way they handle mistakes. The Japanese education ethos does not attach anything like the amount of embarrassment and emotional trauma the western world does to making errors. Their worst mathematics students do better, on average, than the best American ones. The authors believe this is because errors are seen there as a natural part of the growing process.
This chapter also puts forward the counterintuitive argument that people who convincingly own up to their mistakes and take full responsibility for them by making amends win considerable support. It would be interesting to set this argument alongside Schulz's point about the attractiveness of certainty. Schulz argued that bad and erroneous leaders can be more popular than their opposition, provided they come across as more decisive and certain. "Being Wrong" says that people lose faith in those who exhibit uncertainty, despite its far higher intellectual credibility. However, we can see some optimism in "Mistakes Were Made" in that an uncertain leader can win favour back by taking strong positive steps: reconciling themselves with the nature of their error and demonstrating what they have learnt from the mistake.
Another book I found that linked well with "Mistakes were Made" - and I have to say this is what is so appealing about the scientific approach - was "59 Seconds: Think a Little, Change a Lot". Like the authors of "Mistakes were Made", Richard Wiseman cited extensive psychological research to support his conclusion that a child's education benefits from praise for effort as opposed to praise for the child. How does this link with error? Well, it helps us separate the mistake from the person. This is the book's greatest message.
As is to be expected with a book of this nature, it cites peer-reviewed scientific research and papers that can be easily traced. The style doesn't lack humour and is accessible to the lay reader. I am not sure who did the bulk of the writing or the research, and it is a bit difficult to put a voice to the writer. It's a small point and didn't really take anything away for me, but I could imagine it might put off the more casual reader. In its defence, this is a book where the work should speak for itself, and it does! "Mistakes were Made (but not by me)" is a well-documented and meticulously researched piece of psychological non-fiction, one of the best I have read in a long time.
If you enjoyed my review please vote for it on Ciao and Dooyoo

Other books mentioned in this review I have reviewed on here and I highly recommend:

Don't forget to check out Jamie Clubb's main blog