A few years ago, I wrote a post pointing out that early, innocuous-seeming intellectual moves can have huge consequences further down the road. I illustrated the point by noting that, for many people, the uncritical acceptance of the claim that cognition occurs in the absence of a neural network capable of supporting that cognition has led to belief in souls (as well as ghosts and gods). We need to be careful about our early assumptions.
I have recently finished reading Thinking, Fast and Slow, an excellent new book by Daniel Kahneman. In it, Kahneman writes that the availability heuristic, "like other heuristics of judgment, substitutes one question for another: you wish to estimate the size of the category or the frequency of an event, but you report an impression of the ease with which instances come to mind. Substitution of questions inevitably produces systematic errors." [Page 130] I have often considered the great power of the availability heuristic. It is a phenomenon "in which people predict the frequency of an event, or a proportion within a population, based on how easily an example can be brought to mind." We tend to recall information based on whether an event is salient, dramatic or personal, and it is difficult to set those impressions aside when weighing the relevant evidence.
In chapter 13, "Availability, Emotion, and Risk," Kahneman reminds us that availability serves as a heuristic for a wide variety of judgments, not just judgments of frequency. In particular, "the importance of an idea is often judged by the fluency (and emotional charge) with which that idea comes to mind." He describes the availability heuristic as perhaps the most dominant heuristic in social contexts, and explains how it can cause immense social damage when it is applied in cascaded fashion.
The availability cascade is a self-sustaining chain of events, which may start from media reports of a relatively minor event and lead up to public panic and large-scale government action. On some occasions, a media story about a risk catches the attention of a segment of the public, which becomes aroused and worried. This emotional reaction becomes a story in itself, prompting additional coverage in the media, which in turn produces greater concern and involvement. The cycle is sometimes sped along deliberately by "availability entrepreneurs," individuals or organizations who work to ensure a continuous flow of worrying news. The danger is increasingly exaggerated as the media compete for attention-grabbing headlines. Scientists and others who try to dampen the increasing fear and revulsion attract little attention, most of it hostile: anyone who claims that the danger is overstated is suspected of association with a "heinous cover-up." The issue becomes politically important because it is on everyone's mind, and the response of the political system is guided by the intensity of public sentiment. The availability cascade has now reset priorities. Other risks, and other ways that resources could be applied for the public good, all have faded into the background.
. . .
[W]e either ignore [small risks] altogether or give them far too much weight - nothing in between. . . . The amount of concern is not adequately sensitive to the probability of harm; you are imagining the numerator - the tragic story you saw on the news - and not thinking about the denominator. [Cass] Sunstein has coined the phrase "probability neglect" to describe the pattern. The combination of probability neglect with the social mechanisms of availability cascades inevitably leads to gross exaggeration of minor threats, sometimes with important consequences.
In today's world, terrorists are the most significant practitioners of the art of inducing availability cascades. With a few horrible exceptions such as 9/11, the number of casualties from terror attacks is very small relative to other causes of death. Even in countries that have been targets of intensive terror campaigns, such as Israel, the weekly number of casualties almost never came close to the number of traffic deaths. The difference is in the availability of the two risks, the ease and the frequency with which they come to mind. Gruesome images, endlessly repeated in the media, cause everyone to be on edge. As I know from experience, it is difficult to reason oneself into a state of complete calm.
[Pages 142-144]
Anyone who has bothered to watch what passes for "the news" understands the ways in which the news media work the availability cascade. I saw this firsthand after TWA Flight 800 exploded and crashed. A reporter from a local TV news station approached me while I was walking in downtown St. Louis, within an hour after Flight 800 had exploded and crashed into the ocean. The extremely intense reporter wanted my reaction, asking me something like this: "What is your reaction to the fact that it appears as though terrorists have shot down TWA Flight 800, killing hundreds of people?" My response was, "Do we actually know that Flight 800 was shot down by terrorists?" She was flabbergasted, and not interested in anything else I had to say. I watched the news that night to see what they put on the air, and I saw several people reacting in horror that terrorists would dare shoot down an American commercial flight. The station was interested in stirring up anger and hysteria, not in asking or answering the simple question I had raised. It turned out, of course, that there was no evidence that terrorists had anything to do with the crash of TWA Flight 800.
Kahneman's book talks in detail about the availability heuristic, as well as numerous other cognitive tricks and traps. He warns that these mental shortcuts often have significant (and even devastating) real-world effects, and he offers plenty of good advice on how to anticipate and avoid falling into these traps.
I will be writing about this book for many months and years to come. It is a real gem.