Category: cognitive biases

Cascades of terror

January 1, 2012

A few years ago, I wrote a post pointing out that early, innocuous-seeming intellectual moves can have huge consequences further down the road. I illustrated the point by noting that, for many people, the uncritical acceptance of the claim that cognition can occur in the absence of a neural network capable of supporting it has led to belief in souls (as well as ghosts and gods). We need to be careful about our early assumptions.

I have recently finished reading Thinking, Fast and Slow, an excellent new book by Daniel Kahneman. In it, Kahneman writes that the availability heuristic, “like other heuristics of judgment, substitutes one question for another: you wish to estimate the size of the category or the frequency of an event, but you report an impression of the ease with which instances come to mind. Substitution of questions inevitably produces systematic errors.” [Page 130] I have often considered the great power of the availability heuristic. It is a phenomenon “in which people predict the frequency of an event, or a proportion within a population, based on how easily an example can be brought to mind.” We tend to recall information based on whether an event is salient, dramatic or personal, and it is difficult to set those factors aside when weighing the relevant evidence.
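Kahneman’s substitution idea lends itself to a toy demonstration. Here is a minimal Python sketch (every number is invented purely for illustration, not drawn from Kahneman or from any real data) of a judge who, asked how common two kinds of events are, instead reports how easily examples come to mind from a memory stocked by media coverage that over-reports the dramatic event:

import random

# Toy model of "question substitution": the judge is asked for frequencies
# but answers with ease of recall. All numbers below are invented.
random.seed(0)

true_share = {"shark attack": 0.001, "bicycle accident": 0.999}    # actual mix of events
coverage_weight = {"shark attack": 50.0, "bicycle accident": 1.0}  # hypothetical media skew

# Memory holds 10,000 remembered stories, sampled in proportion to
# (how often the event happens) x (how heavily the media cover it).
remembered = random.choices(
    list(true_share),
    weights=[true_share[e] * coverage_weight[e] for e in true_share],
    k=10_000,
)

for event in true_share:
    ease_of_recall = remembered.count(event) / len(remembered)
    print(f"{event:17s} true share = {true_share[event]:.3f}, "
          f"ease-of-recall estimate = {ease_of_recall:.3f}")

Run as written, the dramatic event’s estimated share comes out dozens of times larger than its true share: the systematic error Kahneman describes, produced by nothing more than answering the easier question.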

In chapter 13, “Availability, Emotion, and Risk,” Kahneman reminds us that availability provides a heuristic for a wide variety of judgments, not just judgments of frequency. In particular, “the importance of an idea is often judged by the fluency (and emotional charge) with which that idea comes to mind.” He describes the availability heuristic as perhaps the most dominant heuristic in social contexts, and he describes how it can cause immense social damage when it operates in cascaded fashion.

An availability cascade is a self-sustaining chain of events, which may start from media reports of a relatively minor event and lead up to public panic and large-scale government action. On some occasions, a media story about a risk catches the attention of a segment of the public, which becomes aroused and worried. This emotional reaction becomes a story in itself, prompting additional coverage in the media, which in turn produces greater concern and involvement. The cycle is sometimes sped along deliberately by “availability entrepreneurs,” individuals or organizations who work to ensure a continuous flow of worrying news. The danger is increasingly exaggerated as the media compete for attention-grabbing headlines. Scientists and others who try to dampen the increasing fear and revulsion attract little attention, most of it hostile: anyone who claims that the danger is overstated is suspected of association with a “heinous cover-up.” The issue becomes politically important because it is on everyone’s mind, and the response of the political system is guided by the intensity of public sentiment. The availability cascade has now reset priorities. Other risks, and other ways that resources could be applied for the public good, all have faded into the background.

.   .   .

[W]e either ignore [small risks] altogether or give them far too much weight – nothing in between.… The amount of concern is not adequately sensitive to the probability of harm; you are imagining the numerator – the tragic story you saw on the news – and not thinking about the denominator. [Cass] Sunstein has coined the phrase “probability neglect” to describe the pattern. The combination of probability neglect with the social mechanisms of availability cascades inevitably leads to gross exaggeration of minor threats, sometimes with important consequences.

In today’s world, terrorists are the most significant practitioners of the art of inducing availability cascades. With a few horrible exceptions such as 9/11, the number of casualties from terror attacks is very small relative to other causes of death. Even in countries that have been targets of intensive terror campaigns, such as Israel, the weekly number of casualties almost never came close to the number of traffic deaths. The difference is in the availability of the two risks, the ease and the frequency with which they come to mind. Gruesome images, endlessly repeated in the media, cause everyone to be on edge. As I know from experience, it is difficult to reason oneself into a state of complete calm.

[Page 142-144]
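The numerator-and-denominator point in the excerpt above can be made concrete with a bit of arithmetic. The following sketch uses hypothetical figures (invented for illustration; these are not Kahneman’s numbers and not real statistics) to show how the actual probability of harm and the media salience of a threat can point in opposite directions:

# Illustrative arithmetic for "probability neglect". Every number below is
# hypothetical, chosen only to show the shape of the comparison.

POPULATION = 8_000_000   # hypothetical population at risk

weekly_deaths = {        # the numerators (made up)
    "terror attacks": 3,
    "traffic accidents": 70,
}

media_salience = {       # hypothetical count of lead news stories per week
    "terror attacks": 100,
    "traffic accidents": 5,
}

for cause, deaths in weekly_deaths.items():
    weekly_risk = deaths / POPULATION   # the denominator we tend to ignore
    print(f"{cause:17s} weekly risk per person = {weekly_risk:.2e}, "
          f"lead stories per week = {media_salience[cause]}")

With these made-up inputs, the cause carrying more than twenty times the actual risk gets a twentieth of the coverage, which is exactly the mismatch that feeds an availability cascade.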

Anyone who has bothered to watch what passes as “the news” understands the ways in which the “news media” work the availability cascade. I saw this firsthand after TWA Flight 800 exploded and crashed. I was approached by a local TV news station while I was walking in downtown St. Louis. An extremely intense reporter wanted my opinion (this was within an hour after Flight 800 had exploded and crashed into the ocean). She asked me something like this: “What is your reaction to the fact that it appears as though terrorists have shot down TWA Flight 800, killing hundreds of people?” My response was, “Do we actually know that Flight 800 was shot down by terrorists?” She was flabbergasted, and not interested in anything else I had to say. I watched the news that night to see what they did put on the air, and I saw several people reacting in horror that terrorists would dare shoot down an American commercial flight. The station was interested in stirring up anger and hysteria, not in asking or answering the simple question I had raised. It turned out, of course, that there was no evidence that terrorists had anything to do with the crash of TWA Flight 800.

Kahneman’s book tells a detailed tale about the availability heuristic, as well as numerous other cognitive tricks and traps, warning that these mental shortcuts often have significant (and even devastating) real-world effects, and offering lots of good advice on how to anticipate and avoid falling into them.

I will be writing about this book for many months and years to come. It is a real gem.


What Most Sets of Commandments Get Wrong

December 11, 2011

I recently read Penn Jillette’s 10 Commandments for atheists, written as a response to a challenge by Glenn Beck. Most of Penn’s rules made good sense. But one went off the rails, I opine.

He included one found in most mistranslations of the Christian Ten: “Don’t Lie.” Penn explicitly adds the caveat: “(You know, unless you’re doing magic tricks and it’s part of your job. Does that make it OK for politicians, too?)”

But the premise is fundamentally flawed. The original line in Exodus 20:16 (KJV) is “Thou shalt not bear false witness against thy neighbour.” This forbids a very specific form of lie, arguably too specific a form. It is not merely an injunction against perjury; it is an injunction only against perjury committed against your landholding neighbor, as opposed to people from other places, or to property such as women and slaves.

Of course we all must lie on occasion. How else can we answer, “Isn’t she the most beautiful baby ever?” or “Honey, do I look puffy?” Would it be false testimony to confirm a harmless bias one on one?

Yet I suggest that the proper commandment should be, “Don’t bear false witness.” Period. Don’t testify to things of which you are not absolutely sure, or that you have not personally experienced. Not in a public forum. Don’t repeat “what everybody knows” unless you preface it with an appropriate waffle, such as “I heard that someone else heard that…”

But this might make it difficult to testify to the all-embracing love of a demonstrably genocidal God. A Google image search of “Testify” gives mostly Christian imagery.


Religiosity and the “just world” hypothesis

October 3, 2011

Over at Daylight Atheism, Ebonmuse discusses the “just world” hypothesis: [P]eople are uncomfortable believing that suffering is random, that sometimes bad things happen for no reason at all. Instead, we prefer to believe that people must have done something to deserve what they get. This is obviously a reassuring and comforting belief, which explains its […]


Money talks

October 3, 2011

During a recent conversation with a friend, I found myself wondering whether I had sufficient evidence for my claim: that most corporate newspapers and electronic media are reluctant to publish stories that make big corporations look bad, the motivation being that big corporations buy expensive ads. Don’t bite the hand that feeds you. My friend reminded me that her husband works for a newspaper, and he has never seen the “smoking gun memo” that would substantiate the claim that corporations are telling the news media which stories to avoid covering. She says the problem is that the media are understaffed and lazy, not that they are biased.

I responded that I don’t think there actually NEEDS to be a memo. As long as the media pick on little targets and celebrity news, there isn’t much blow-back. But if they were to take on a big target in a big way, the reporters and editors already KNOW that the switchboard would light up and email would come pouring in from big shots affiliated with corporations, making them wish they had just stuck with the tried and true (e.g., celebrity news, sports, shootings and accidents). There is a substitute for a smoking-gun memo: the overall scarcity of reporting critical of corporations, beyond coverage of an ongoing legal dispute or of one corporation criticizing another. Many people think that circumstantial cases are necessarily weak, but this is not true. Criminals are sent to prison based on circumstantial evidence.

I’ll be on the lookout for a good study that demonstrates the problem, and I’m certainly open to evidence to the contrary. In the meantime, I’ve just noticed two recent stories that exemplify the political power of money.

Example 1: The New Yorker has just published a detailed article explaining how concentrated money is buying elections in North Carolina.

Example 2: Despite strong studies to the contrary, the Susan G. Komen for the Cure organization is claiming that the common chemical bisphenol A (BPA) presents no risk of cancer. Here’s an excerpt from a recent Mother Jones article, “Is Susan G. Komen Denying the BPA-Breast Cancer Link?”:

In April 2010 Komen posted an online statement saying that BPA had been “deemed safe.” And a more recent statement on Komen’s website about BPA, from February 2011, begins, “Links between plastics and cancer are often reported by the media and in email hoaxes.” Komen acknowledges in its older statement that the Food and Drug Administration is doing more studies on BPA, but also says that there is currently “no evidence to suggest a link between BPA and risk of breast cancer.”


Jonathan Haidt: What the moral sciences should look like

September 29, 2011

At Edge Video, psychologist Jonathan Haidt has given a briskly presented 30-minute lecture on what the moral sciences should look like in the 21st century. He opened his talk by indicating that we are now in a period of a new synthesis in ethics, meaning that in order to do meaningful work in the field of moral psychology, one has to draw from numerous other fields, including biology, computer science, mathematics, neuroscience, primatology and many others. The bottom line is that one needs to be careful not to reduce moral psychology to a single principle, as is often done by those who advocate that morality is a code word for a single test, such as welfare-maximization or justice-fairness.

I have followed Jonathan Haidt’s work for several years now, and I am highly impressed with his breadth of knowledge, his many original ideas, and the way he (in keeping with his idea of what moral psychology should be like) synthesizes the work of numerous disparate fields of study. In this post, I am sharing my own notes from my viewing of Haidt’s two-part video lecture.

In Haidt’s approach, the sense of taste serves as a good metaphor for morality. There are only a few dominant bases for moral taste (akin to the four types of taste receptors); taste can generally be categorized as “good” or “bad”; and despite the limited number of foundations for moral and sensory taste, there is plenty of room for cultural variation: every culture has its own approach to making good moral decisions (and good-tasting food).

Haidt warns that those studying moral psychology should be careful to avoid two common errors that are well illustrated by two recent journal articles. The first article, titled “The Weirdest People in the World,” indicates that most of the psychology research done in the entire world is done in the United States, and the subjects tend to be Western, educated, industrialized, rich and democratic (“WEIRD”). Not that one cannot do psychology with this homogeneous group of subjects (typically college students), but one needs to be careful to avoid generalizing to the entire world based upon a WEIRD set of subjects. In fact, WEIRD people tend to see the world much differently than people in many other cultures. They tend to see separate objects (versus relationships), and they tend to rely on analytical thinking (categories and laws, reason and logic) versus holistic thinking (patterns and context). Does this make us WEIRD people more accurate, since we think in these analytical terms? Not necessarily, but before generalizing, we need to take it to heart that we live in an unusual culture. Haidt warns that this problem is exacerbated because psychologists tend to surround themselves with like-minded others, and when this happens the confirmation bias kicks in and they will inevitably find lots of evidence to condemn those who think differently.

[More . . . ]


The function of reason

August 15, 2011

Chris Mooney reports on the work of Hugo Mercier and Dan Sperber, who have argued that (in Mooney’s words): “the human capacity for reasoning evolved not so much to get at truth, as to facilitate argumentation.”

I haven’t yet heard Mooney’s interview of Mercier, which will soon be posted at Point of Inquiry. I do look forward to this interview, because the conclusions of Mercier and Sperber (which I scanned in their recent journal article, “Why do Humans Reason? Arguments for an Argumentative Theory”) make much sense in light of the ubiquitous failings of human reason-in-action. Here is an excerpt from the abstract of their article:

Reasoning is generally seen as a means to improve knowledge and make better decisions. However, much evidence shows that reasoning often leads to epistemic distortions and poor decisions. This suggests that the function of reasoning should be rethought. Our hypothesis is that the function of reasoning is argumentative. It is to devise and evaluate arguments intended to persuade. Reasoning so conceived is adaptive given the exceptional dependence of humans on communication and their vulnerability to misinformation. A wide range of evidence in the psychology of reasoning and decision making can be reinterpreted and better explained in the light of this hypothesis. Poor performance in standard reasoning tasks is explained by the lack of argumentative context. When the same problems are placed in a proper argumentative setting, people turn out to be skilled arguers. Skilled arguers, however, are not after the truth but after arguments supporting their views. This explains the notorious confirmation bias. This bias is apparent not only when people are actually arguing, but also when they are reasoning proactively from the perspective of having to defend their opinions. Reasoning so motivated can distort evaluations and attitudes and allow erroneous beliefs to persist. Proactively used reasoning also favors decisions that are easy to justify but not necessarily better. In all these instances traditionally described as failures or flaws, reasoning does exactly what can be expected of an argumentative device: Look for arguments that support a given conclusion, and, ceteris paribus, favor conclusions for which arguments can be found.

These ideas resonate strongly with me.

[More . . . ]


How Fox “news” manipulates its viewers

July 2, 2011

Here is a comprehensive list of the techniques Fox News uses to manipulate its viewers, compliments of Dr. Cynthia Boaz. It seems to me that responsible thinkers would anticipate these techniques, recognize them and turn this drivel off.

Here is the list of techniques, but I would highly recommend visiting the main article for clear explanations of each.

1. Panic Mongering.
2. Character Assassination/Ad Hominem.
3. Projection/Flipping.
4. Rewriting History.
5. Scapegoating/Othering.
6. Conflating Violence With Power and Opposition to Violence With Weakness.
7. Bullying.
8. Confusion.
9. Populism.
10. Invoking the Christian God.
11. Saturation.
12. Disparaging Education.
13. Guilt by Association.
14. Diversion.

I freely admit that Fox News is not the only “news” channel that employs these techniques; I’ve seen most of them used on other networks, though Fox is famous for using them proudly.


Dotted lines on paper

February 21, 2011

Wray Herbert writes, in a Scientific American article titled “Border Bias and Our Perception of Risk,” about a study by the husband-and-wife team of Arul and Himanshu Mishra at the University of Utah on how arbitrary political borders bias our perception of risky events.

Asked to imagine a vacation home in either “North Mountain Resort, Washington, or in West Mountain Resort, Oregon,” the study group was given details about a hypothetical seismic event striking at a distance from the vacation home sites, with the details differing as to where the event occurred:

Some heard that the earthquake had hit Wells, Wash., 200 miles from both vacation home sites. Others heard that the earthquake had struck Wells, Ore., also 200 miles from both home locations. They were warned of continuing seismic activity, and they were also given maps showing the locations of both home sites and the earthquake, to help them make their choice of vacation homes.

The results revealed a bias: people felt greater risk when the event was in-state than when it was out of state. A second study involved a not-in-my-backyard look at a radioactive waste storage site; the Mishras used maps with thick lines and thin dotted lines to help people visualize the distances and state borders. It isn’t hard to guess which lines conveyed a greater feeling of risk.

I recall a story my brother told me about 17 years ago in which he was helping an old friend change the oil in his farm tractor. My brother asked, “Hey, Jack, where do you want me to put this [the used oil]?” Jack said, “Pour it over there on the stone wall.” (We lived in Connecticut, where they grow those things everywhere). Brother Marshall said, “Jack, you can’t do that anymore.”

Jack thought a short second or two, and said, “Yeah, you’re right. Better pour it on the other side.”


Affirmative action for conservatives?

February 20, 2011

I have written several posts holding that we are all blinded by our sacred cows. Not simply those of us who are religious. This blindness afflicts almost all of us, at least some of the time. Two of my more recent posts making this argument are titled “Mending Fences” and “Religion: It’s almost like falling in love.” In arriving at these conclusions, I’ve relied heavily upon the writings of other thinkers, including those of moral psychologist Jonathan Haidt. Several years ago, Haidt posited four principles summing up the state of the art in moral psychology:

1. Intuitive primacy (but not dictatorship)
2. Moral thinking is for social doing.
3. Morality is about more than harm and fairness.
4. Morality binds and blinds.

In a recent article at Edge.org, Haidt argued that this fourth principle has proven to be particularly helpful, and it can “reveal a rut we’ve gotten ourselves into and it will show us a way out.” You can read Haidt’s talk at the annual convention for the Society of Personality and Social Psychology, or listen to his reconstruction of that talk (including slides) here. This talk has been making waves lately, exemplified by John Tierney’s New York Times article.

Haidt begins his talk by recognizing that human animals are not simply social, but ultrasocial. How social are we? Imagine that someone offered you a brand-new laptop computer with the fastest commercially available processor, but that this computer was broken in such a way that it could never be connected to the Internet. In this day and age of connectivity, that computer would get very little use, if any. According to Haidt, human ultrasociality means that we “live together in very large groups of hundreds or thousands, with a massive division of labor and a willingness to sacrifice for the group.” Very few species are ultrasocial, and most of them manage it through a breeding trick by which all members of the group are first-degree relatives who concentrate their breeding efforts in a common queen. Human beings are the only ultrasocial animals that don’t use this breeding trick to maintain their ultrasociality.

[Image by Jeremy Richards at Dreamstime.com (with permission)]

[More . . . ]
