The function of reason

Chris Mooney reports on the work of Hugo Mercier and Dan Sperber, who have argued that (in Mooney's words): "the human capacity for reasoning evolved not so much to get at truth, as to facilitate argumentation." I haven't yet heard Mooney's interview with Mercier, which will soon be posted at Point of Inquiry. I do look forward to it, because the conclusions of Mercier and Sperber (which I scanned in their recent journal article, "Why Do Humans Reason? Arguments for an Argumentative Theory") make a great deal of sense in light of the ubiquitous failings of human reason-in-action. Here is an excerpt from the abstract of their article:

Reasoning is generally seen as a means to improve knowledge and make better decisions. However, much evidence shows that reasoning often leads to epistemic distortions and poor decisions. This suggests that the function of reasoning should be rethought. Our hypothesis is that the function of reasoning is argumentative. It is to devise and evaluate arguments intended to persuade. Reasoning so conceived is adaptive given the exceptional dependence of humans on communication and their vulnerability to misinformation. A wide range of evidence in the psychology of reasoning and decision making can be reinterpreted and better explained in the light of this hypothesis. Poor performance in standard reasoning tasks is explained by the lack of argumentative context. When the same problems are placed in a proper argumentative setting, people turn out to be skilled arguers. Skilled arguers, however, are not after the truth but after arguments supporting their views. This explains the notorious confirmation bias. This bias is apparent not only when people are actually arguing, but also when they are reasoning proactively from the perspective of having to defend their opinions. Reasoning so motivated can distort evaluations and attitudes and allow erroneous beliefs to persist. Proactively used reasoning also favors decisions that are easy to justify but not necessarily better. In all these instances traditionally described as failures or flaws, reasoning does exactly what can be expected of an argumentative device: Look for arguments that support a given conclusion, and, ceteris paribus, favor conclusions for which arguments can be found.
These ideas resonate strongly with me.


How Fox “news” manipulates its viewers

Here is a comprehensive list of the techniques Fox News uses to manipulate its viewers, courtesy of Dr. Cynthia Boaz. It seems to me that responsible thinkers would anticipate these techniques, recognize them, and turn this drivel off. Here is the list, though I would highly recommend visiting the main article for a clear explanation of each:

1. Panic Mongering
2. Character Assassination/Ad Hominem
3. Projection/Flipping
4. Rewriting History
5. Scapegoating/Othering
6. Conflating Violence With Power and Opposition to Violence With Weakness
7. Bullying
8. Confusion
9. Populism
10. Invoking the Christian God
11. Saturation
12. Disparaging Education
13. Guilt by Association
14. Diversion

I freely admit that Fox News is not the only "news" channel that employs these techniques--I've seen most of them used on other networks--though Fox is famous for using them proudly.


Dotted lines on paper

Wray Herbert, in a Scientific American article titled "Border Bias and Our Perception of Risk," describes a study by the husband-and-wife team of Arul and Himanshu Mishra at the University of Utah on how arbitrary political borders bias our perception of events. Asked to imagine a vacation home in either "North Mountain Resort, Washington, or in West Mountain Resort, Oregon," participants were given details about a hypothetical seismic event striking at some distance from the vacation home, with the details differing as to where the event occurred:

Some heard that the earthquake had hit Wells, Wash., 200 miles from both vacation home sites. Others heard that the earthquake had struck Wells, Ore., also 200 miles from both home locations. They were warned of continuing seismic activity, and they were also given maps showing the locations of both home sites and the earthquake, to help them make their choice of vacation homes.
The results revealed a bias: people felt greater risk when the event was in-state than when it was out of state. A second study involved a not-in-my-backyard look at a radioactive waste storage site; there, the Mishras used maps with thick solid lines and thin dotted lines for the state borders to help people visualize the distances. It isn't hard to guess which lines conveyed the greater feeling of risk.

I recall a story my brother told me about 17 years ago, in which he was helping an old friend change the oil in his farm tractor. My brother asked, "Hey, Jack, where do you want me to put this [the used oil]?" Jack said, "Pour it over there on the stone wall." (We lived in Connecticut, where they grow those things everywhere.) Brother Marshall said, "Jack, you can't do that anymore." Jack thought for a second or two, and said, "Yeah, you're right. Better pour it on the other side."


Affirmative action for conservatives?

I have written several posts holding that we are all blinded by our sacred cows--not simply those of us who are religious. This blindness afflicts almost all of us, at least some of the time. Two of my more recent posts making this argument are titled "Mending Fences" and "Religion: It's almost like falling in love." In arriving at these conclusions, I've relied heavily upon the writings of other thinkers, including the moral psychologist Jonathan Haidt. Several years ago, Haidt posited four principles summing up the state of the art in moral psychology:

1. Intuitive primacy (but not dictatorship).
2. Moral thinking is for social doing.
3. Morality is about more than harm and fairness.
4. Morality binds and blinds.

In a recent article at Edge.org, Haidt argued that the fourth principle has proven particularly helpful, and that it can "reveal a rut we've gotten ourselves into and it will show us a way out." You can read Haidt's talk at the annual convention of the Society for Personality and Social Psychology, or listen to his reconstruction of that talk (including slides) here. The talk has been making waves lately, exemplified by John Tierney's New York Times article.

Haidt begins his talk by recognizing that human animals are not simply social, but ultrasocial. How social are we? Imagine someone offered you a brand-new laptop with the fastest commercially available processor, but assume that this computer was broken in such a way that it could never be connected to the Internet. In this age of connectivity, that computer would get very little use, if any. According to Haidt, human ultrasociality means that we "live together in very large groups of hundreds or thousands, with a massive division of labor and a willingness to sacrifice for the group."
Very few species are ultrasocial, and most that are achieve it through a breeding trick: all members of the group are first-degree relatives, and they all concentrate their breeding efforts on a common queen. Human beings are the only animals that don't use this breeding trick to maintain their ultrasociality.


Bill Moyers: Facts threaten us.

According to Truthout, Bill Moyers recently gave a talk at History Makers that included this disturbing information: well-documented facts often backfire:

As Joe Keohane reported last year in The Boston Globe, political scientists have begun to discover a human tendency "deeply discouraging to anyone with faith in the power of information." He was reporting on research at the University of Michigan, which found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts were not curing misinformation. "Like an underpowered antibiotic, facts could actually make misinformation even stronger."

You can read the entire article online. I won't spoil it for you with a lengthy summary here. Suffice it to say that, while "most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence," the research found that actually "we often base our opinions on our beliefs ... and rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions."
