In an earlier post, I summarized an Elliot Aronson interview on the topic of cognitive dissonance. Because I find this such a compelling topic, I took the time to listen to a recent Point of Inquiry podcast featuring Dr. Carol Tavris, Aronson’s co-author of Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts (Harcourt, 2007). This book applies cognitive dissonance theory to a wide variety of important real-world topics, including politics, conflicts of interest, memory, the criminal justice system, police interrogation, the daycare sex-abuse panic, family quarrels, international conflicts, and business.
In her interview, Tavris explains herself clearly and convincingly. She does not shy away from applying cognitive dissonance theory to such real-world topics as religious beliefs and the political decisions made by George W. Bush. The theory also applies to alleged medical cures, approaches to education, and child-rearing techniques; indeed, to any topic where the evidence might clash with a person’s need to maintain a flattering self-image.
“Cognitive dissonance” is the state of discomfort humans experience when one of their beliefs is contradicted by evidence or when two of their beliefs conflict with each other. It is instructive to see what usually happens when someone’s deep belief is challenged by new evidence. “They almost never say thank you.” Instead, they’re likely to say “Piss off.” This rejection is likely to occur even if the new evidence is carefully gathered scientific data. Conflicting evidence forces people into a state of dissonance, prompting them to defend their beliefs rather than test them. New evidence that challenges existing beliefs and actions is thus up against a formidable foe: the mind is hardwired to look for confirming evidence.
When we are armed with new evidence that others reject on religious or political grounds, for example, we sometimes wonder how these “hypocrites” can live with themselves. It is true that some scam artists exist; there is such a thing as hypocrisy. For the most part, though, challenging evidence is rejected not out of intentional ignorance or malice, but by well-meaning people who feel compelled to defend their beliefs because of cognitive dissonance. The usual choice is between a) “I’m a good person” and b) “I’ve just done something criminal/stupid/harmful.” Our usual decision as humans is to minimize b), preserving our self-image of integrity.
This topic is so important because the phenomenon is ubiquitous. Whenever any of us makes a decision, we face the need to reduce dissonance. As Tavris indicates, when people “get” the concept of cognitive dissonance, they really get it. “You’ll see it everywhere. It’s like an optical illusion.” It becomes impossible not to see it. We should take this topic to heart, because all of us repeatedly face this challenge. All of us need to stop and question ourselves, doing our best to climb out of our own “self-justifying spirals.”
What is the best way to avoid the ill effects of cognitive dissonance? The best way is to widen our circle of friends to include people who think differently. Indeed, we should go further than that: we need the people in our trusted circle to actively disagree with us. John F. Kennedy and Abraham Lincoln both (eventually) recognized the dangers of groupthink; dissent embedded among our trusted advisors helps to keep us honest. The truth will indeed set us free, but the truth can often be quite painful. Only when we work hard to shatter the “shell of self-justification” can we keep ourselves from making the same mistakes again and again.
Most people don’t understand enough about how to resolve such conflicts, and they slip easily into cognitive dissonance. Rather than process challenging new facts with an open mind, good people become virtual lawyers, arguing for the legitimacy of their past actions and their consistently virtuous character. Again, it’s not that people intentionally try to do destructive things or think illogical thoughts. Rather, we steer ourselves away from evidence that challenges our pristine notions of ourselves as consistently good and intelligent people. After we buy a car, for instance, we avoid reading new information that might lead us to conclude that we made a bad purchase.
How, then, do outsiders armed with good new information push through, when cognitive dissonance keeps people from reconsidering their destructive ways? How do we puncture other people’s protective cocoons? Tavris warns that we won’t be able to do this directly. The more illogical, ignorant, or harmful a person’s past thoughts or actions, the harder cognitive dissonance pushes back. People are not so much rational as self-rationalizing.
Tavris considered various conflicts between religion and science. A fact that repeatedly surprises skeptics and scientists is that religious faith is often strengthened by evidence that disconfirms religious belief. When the evidence shows that a “good God” has done ill to someone in the world, believers often conclude that God is testing them or that it was the devil’s fault. A lone survivor of a plane crash will conclude that God was looking out for him or her, not that God killed everyone else. Tavris warns that pointing out irrational beliefs with forceful arguments usually doesn’t help. It puts the other person into a state of dissonance, making them feel stupid. They’ll say “To hell with you.”
Therefore, we need to be careful to avoid a “tone of certitude” when making arguments to people who are emotionally invested in protecting their past actions or beliefs. We need to consider what those beliefs mean to them. We might need to use some patient diplomacy. Rather than framing the argument as one of science versus religion, present the issue as one that divides people of varying religious beliefs. That way, it will be less likely that a religious person will feel attacked by evidence conflicting with his or her personal religious beliefs.
I have not yet read the book by Tavris and Aronson, but neither of their interviews touches on the topic of social pressures to conform to certain beliefs. Rather, both interviews hammered on the effect that new evidence has on personal beliefs. I would be interested to know more about the relationship between cognitive dissonance and the effect that Solomon Asch demonstrated in many of his experiments: the “pressure” people often feel to conform their beliefs to the stated beliefs or preferences of third parties.
Some people can juggle three tennis balls for minutes on end without dropping them. Most people can’t. Some people can whistle a happy tune beautifully, but most people can’t. It is obvious, is it not, that whether you can juggle or whistle has nothing at all to do with whether you are a good, honest, loving person. If only it were equally obvious that those who can manage the intellectual gymnastics required to keep alive a conviction that God exists in the face of all the grounds for doubting it have no moral superiority at all over those who find this proposition frankly incredible! In fact, there is good reason to believe that the varieties of self-admonition and self-blinding that people have to indulge in to gird their creedal loins may actually cost them something substantial in the moral agency department: a debilitating willingness to profess solemnly in the utter absence of conviction, a well-entrenched habit of deflecting their attention from evidence that is crying out for consideration, and plenty of experience biting their tongues and saying nothing when others around them make assumptions that they know in their hearts to be false.
Daniel Dennett – writing in Newsweek.
Hi Erich,
Thanks for this post. I found it very interesting. I would like to dispute it. Not every new idea is a good one (whole language, the ritalin panacea), and not all evidence is evaluated correctly. Since cognitive dissonance is a theory and since science is "hardwired to look for confirming evidence" of the truth of theories, it looks like the scientific method itself suffers from its own cognitive dissonance. Not to mention scientists themselves. (remember Einstein's refusal to believe, in the face of the evidence, that god plays dice? He may be right.) Pursuing "scientific truth" can actually be immoral and inhumane. i.e. nuclear weapon development, eugenics, animal testing. On the other hand, in psychology, the Stanford prison experiment has been banned for being too inhumane, and I expect the Milgram experiment also. By banning research we are dooming ourselves to the ongoing scourge of sadism and destructive conformity, aren't we? So we have a conflict here. A dissonance. It has to be struggled with. If life were chess, the endgame would prove the correct choice. But in life what is a correct move? If da Vinci hadn't been "illegitimate" he would have gone into accounting. The distasteful notion of illegitimacy had at least some utility. Resolving dissonance by rejecting the evidence is valid, even if on the basis of (you guessed it) general scepticism. i.e. the jury's still out.
Recently released poll data on religion has dueling atheist camps (militants vs appeasers) each claiming victory on how best to promote science. Of course, the militant atheists are correct as usual. 😛
http://scienceblogs.com/pharyngula/2007/09/someti…
(btw, I can juggle 4 balls… )