Jason’s post about conspiracies reminded me of several books that support his argument.
The first book is How We Know What Isn’t So: The Fallibility of Human Reason in Everyday Life, by Thomas Gilovich (1991). Gilovich points to a number of experiments demonstrating that people strive to find order in the world where there is none. We don’t find random distributions easy to process. Rather, we allow our imaginations to run wild on randomness:
With hindsight it is always possible to spot the most anomalous features of the data and build a favorable statistical analysis around them. However, a properly trained scientist (or simply a wise person) avoids doing so because he or she recognizes that constructing a statistical analysis retrospectively capitalizes too much on chance and renders the analysis meaningless. . . . Unfortunately, the intuitive assessments of the average person are not bound by these constraints.
Here’s another good example of people finding order where there is none, on Mars.
People are also “extraordinarily good at ad hoc explanations.” Our motives and fears ignite our imaginations:
Once a person has misidentified a random pattern as a “real” phenomenon, it will not exist as a puzzling, isolated fact about the world. Rather, it is quickly explained and readily integrated into the person’s pre-existing theories and beliefs. These theories, furthermore, then serve to bias the person’s evaluation of new information in such a way that the initial belief becomes solidly entrenched. . . . people cling tenaciously to their beliefs in the face of hostile evidence. (Page 21-24)
This last point is a recognition of what is known as the “confirmation bias.” In Inevitable Illusions: How Mistakes of Reason Rule Our Minds (1996), Massimo Piattelli-Palmarini writes that:
There is a psychological law that has been endlessly confirmed, even among professionals and experts, among doctors, psychiatrists, judges, teachers . . .: When someone is convinced of a positive correlation, however illusory that correlation can objectively be shown to be, that person will always find new confirmations and justify why it should be so . . . [W]e are naturally and spontaneously verifiers rather than falsifiers.
We constantly seek confirmation of our own hypotheses, because we become anchored by them and overconfident in them. The confirmation bias consists of the tendency to “seek out evidence which confirms rather than contradicts current beliefs . . .”
Gilovich writes that our susceptibility to misinterpreting random data stems from a human willingness to base conclusions on incomplete or unrepresentative information, which is “a common cause of people’s questionable and erroneous beliefs.”
Another relevant book is one I discussed before: Innumeracy: Mathematical Illiteracy and Its Consequences, by John Paulos (1988). Paulos dedicates an entire chapter (Chapter 2) to probability and coincidence. He opens Chapter 2 by pointing out how often people are over-impressed with coincidences (or “synchronicities”). Coincidences are ubiquitous.
Paulos writes that people have “a tendency to drastically underestimate the frequency of coincidences.” This tendency is a prime characteristic of those who are “innumerate” (those who are unable “to deal comfortably with the fundamental notions of number and chance”). A classic example of how people become over-enchanted with coincidences is the set of “spooky” similarities between John F. Kennedy and Abraham Lincoln. Paulos presents another example: how many people would there have to be in a room for there to be a 50% probability that at least two of them share a birthday? The surprising answer is 23 (page 36). If you were innumerate and you kept finding that two or more people in small gatherings had the same birthday, you might start getting paranoid, right?
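The answer of 23 is easy to check for yourself. The probability that all n birthdays are distinct is the product (365/365) × (364/365) × … × ((365 − n + 1)/365), so the chance of at least one shared birthday is one minus that product. A minimal sketch (assuming 365 equally likely birthdays and ignoring leap years, as Paulos does):

```python
def shared_birthday_probability(n: int) -> float:
    """Probability that at least two of n people share a birthday,
    assuming 365 equally likely birthdays (leap years ignored)."""
    p_all_distinct = 1.0
    for i in range(n):
        p_all_distinct *= (365 - i) / 365
    return 1 - p_all_distinct

# Find the smallest group size with at least a 50% chance of a match.
n = 1
while shared_birthday_probability(n) < 0.5:
    n += 1
print(n, round(shared_birthday_probability(n), 4))  # prints: 23 0.5073
```

The intuition-defying part is that the number of *pairs* grows quadratically: 23 people form 253 pairs, each a fresh chance at a coincidence, which is why the threshold is so much lower than most people guess.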
According to Paulos, predictions by those involved in medical quackery and television evangelism also take advantage of the numerous coincidences of life by being highly vague. Gilovich makes the same point (59) when he writes of the common failure to specify criteria for predictions carefully before the event:
Our expectations can often be confirmed by any of a set of multiple endpoints after the fact, some of which we would not be willing to accept as criteria for success beforehand. When a psychic predicts that “a famous politician will die this year,” it is important to specify then and there the range of events that will constitute a success. Otherwise, we are likely to be overly impressed by various tenuous connections between the prediction and any of a number of subsequent events.
The point I’m trying to make is that cognitive science has shown that humans are over-eager to use flimsy coincidences to buttress good stories. Once they’ve got a story running in their heads, the confirmation bias compels them to ignore conflicting evidence and to place weak supporting evidence on a pedestal.
As Jason writes, “So when are they going to begin those courses in critical thinking for grade schoolers?”