Archive for April 3rd, 2010
McClatchy has published a video and a written summary of conservatives’ recent efforts to rewrite history.
This evidence-free approach to history is surreal. How can this possibly be happening? These rewrites of history are apparently confirmation bias running at full throttle. I recently came across a vivid description of the phenomenon in a book called A Mind of Its Own: How Your Brain Distorts and Deceives (2006), by Cordelia Fine:
Reasoning is the vain brain’s . . . powerful protectorate. This might seem a little odd. Isn’t reasoning supposed to be the compass that guides us toward the truth, not saves us from it? It seems not–particularly when our ego is under attack. In fact, the best we can say for our gift of thinking in these circumstances is that we do at least recognize that conclusions cannot be drawn out of thin air: we need a bit of evidence to support our case. The problem is that we behave like a smart lawyer searching for evidence to bolster his client’s case, rather than a jury searching for the truth. As we’ve seen, memory is often the overzealous secretary who assists in this process by hiding or destroying files that harbor unwanted information. Only when enough of the objectionable stuff has been shredded dare we take a look. Evidence that supports your case is quickly accepted, and the legal assistants are sent out to find more of the same. However, evidence that threatens reason’s most important client–you–is subjected to grueling cross-examination. Accuracy, validity, and plausibility all come under attack on the witness stand. The case is soon won. A victory for justice and truth, you think, conveniently ignoring the fact that yours was the only lawyer in the courtroom.
Fine adds this further description toward the end of her book:
Evidence that fits with our beliefs is quickly waved through the mental border control. Counter-evidence, on the other hand, must submit to close interrogation and even then will probably not be allowed in. As a result, people can wind up holding their beliefs even more strongly after seeing counter-evidence. It’s as if we think, “Well, if that’s the best that the other side can come up with then I really must be right.” This phenomenon, called “belief polarization,” may help to explain why attempting to disillusion people of their perverse misconceptions is so often futile.