According to Truthout, Bill Moyers recently gave a talk at History Makers in which he shared the disturbing finding that well-documented facts often backfire:
As Joe Keohane reported last year in The Boston Globe, political scientists have begun to discover a human tendency “deeply discouraging to anyone with faith in the power of information.” He was reporting on research at the University of Michigan, which found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts were not curing misinformation. “Like an underpowered antibiotic, facts could actually make misinformation even stronger.” You can read the entire article online.
I won’t spoil it for you with a lengthy summary here. Suffice it to say that, while “most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence,” the research found that actually “we often base our opinions on our beliefs … and rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions.”
In Haidt’s model, moral behaviors are typically the product of multiple levels of moral functioning, and are usually energized by the “hotter” levels of intuition, emotion, and behavioral virtue/vice. The “cooler” levels of values, reasoning, and willpower, while still important, are proposed to be secondary to the more affect-intensive processes.
Haidt has used the metaphor of an intellectually nimble lawyer riding on top of a huge, emotion-permeated elephant to illustrate his counterintuitive approach, suggesting that the small, articulate lawyer on top often lacks meaningful control over the elephant. Moral judgments are usually dominated by emotions such as empathy and disgust (their strength is represented by the sheer size of the elephant). In short, Haidt is quite sympathetic to David Hume’s suggestion that moral reasoning is essentially “the slave of the passions.”
In the March 25, 2010 edition of Nature (available here), Paul Bloom expressed concern that something important had been left out of Haidt’s model. In reaction, Haidt defended himself against Bloom’s attack (see below), indicating that Bloom (whose work Haidt admires, for the most part) had misconstrued his Social Intuitionist Model. I believe that summarizing this exchange between Haidt and Bloom sharpens the focus on the meaning of Haidt’s Social Intuitionist Model.
Most religious adherents would be aghast if one suggested that they, or their religion, were fundamentally and consistently dishonest. However, I believe that is indeed the case.
I recently read a comment on a blog post by Ed Brayton (honesty vs. intellectual honesty).
Ed’s post drew a distinction between honesty and intellectual honesty, noting that intellectual honesty requires recognizing not only the arguments in support of a position, but also any evidence or arguments against it.
One of the commenters (Sastra) then made the following case that faith was fundamentally intellectually dishonest:
[…] An intellectually dishonest person blurs the distinction [between being intellectually honest, and being emotionally honest], and seems to confuse fact claims with meaning or value claims. To a person who places emphasis on emotional honesty, strength of conviction is evidence. An attack on an idea, then, is an attack on the person who holds it. The idea is true because it’s emotionally fulfilling: intentions and sincerity matter the most. Therefore, you don’t question, search, or respect dissent. A person who is trying to change your mind is trying to change you.
For example, I consider religious faith […] to be intellectually dishonest. It is, however, sincerely emotionally honest.
[…] “Faith is the substance of things hoped for; the evidence of what is not seen.” There’s a huge emotional component to it, so that one chooses to keep faith in X, the way one might remain loyal to a friend. You defend him with ingenuity and love, finding reasons to explain or excuse evidence against him. He cannot fail: you, however, can fail him, by allowing yourself to be led into doubt.
Being able to spin any result into support, then, is a sign of good will, loyalty, reliability, and the ability to stand fast. The focus isn’t on establishing what’s true, but on establishing that you can be “true.” This emotional honesty may or may not be rewarded: the real point, I think, is to value it for its own sake, as a fulfillment of a perceived duty.
This is exactly the case with religion, and religious adherents.
Their faith in their god is entirely emotional, and no amount of material evidence will alter their belief.
They may be entirely honest in their belief, and may be entirely honest in their objection to evidence (cf. Karl, Rabel, Walter, et al.), but in doing so they are being intellectually dishonest: they refuse to recognize valid and entirely relevant evidence, they conflate fact claims with value claims with great consistency and verve, and they deny any difference between the two, insisting that it’s all ‘interpretation.’
No, it isn’t all interpretation.
Another inquiry has determined that the “Climategate” scientists’ “rigor and honesty as scientists are not in doubt.” Not that this will slow down attacks on inconvenient science.
Perhaps the biggest lesson here is that showing know-nothings that they are wrong has no effect on their opinions. For an equally good example, read about the “Lenski Affair,” in which scientists had conducted twenty years of rigorous experiments clearly demonstrating the evolution of E. coli in the lab. Evidence just isn’t good enough for zealots.
The next time someone mentions that humans are “rational,” you might want to refer them to Wikipedia’s list of dozens of cognitive biases. How handy to have all of these biases listed in one place. The list includes each of the following, each linked to a specific Wikipedia article.
Decision-making and behavioral biases
Base rate fallacy (a worked example appears below)
Bias blind spot
Experimenter’s or expectation bias
Illusion of control
Mere exposure effect
Moral credential effect
Need for closure
Neglect of probability
Status quo bias
Von Restorff effect
The list continues with further categories, including biases in probability and belief.
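To make one of these concrete, here is a minimal sketch of the base rate fallacy in Python. The numbers are hypothetical, chosen only for illustration (nothing here is drawn from the Wikipedia article): a screening test that sounds highly accurate still produces mostly false alarms when the condition it detects is rare.

```python
# Base rate fallacy: a worked example with hypothetical numbers.
# The test is "99% accurate," yet a positive result is weak evidence,
# because the condition's low base rate dominates the test's accuracy.

prevalence = 0.01            # P(condition): 1% of the population
sensitivity = 0.99           # P(positive | condition)
false_positive_rate = 0.05   # P(positive | no condition)

# Total probability of testing positive (law of total probability).
p_positive = (sensitivity * prevalence
              + false_positive_rate * (1 - prevalence))

# Bayes' theorem: P(condition | positive).
p_condition_given_positive = sensitivity * prevalence / p_positive

print(f"P(condition | positive) = {p_condition_given_positive:.1%}")
# Prints: P(condition | positive) = 16.7%
```

Most people intuit an answer near 99 percent; Bayes’ theorem puts it at about one in six. That gap between the intuitive answer and the computed one is the fallacy.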