Of course I knew how things would turn out back then: The Illusion of Inevitability

May 20, 2012

In his new book, Thinking, Fast and Slow, Daniel Kahneman notes that human beings constantly claim to have understood the past much better than they actually did at the time. Referring to Nassim Taleb’s concept of the “narrative fallacy,” Kahneman details how we employ flawed stories from the past to shape our current views of the world. This is not a good thing (though it often feels good while we engage in over-confident reasoning, as pointed out by Robert Burton); the narrative fallacy is a pernicious problem, and often a dangerous one.

Narrative fallacies arise inevitably from our continuous attempt to make sense of the world. The explanatory stories that people find compelling are simple; are concrete rather than abstract; assign a larger role to talent, stupidity, and intentions than to luck; and focus on a few striking events that happened rather than on countless events that failed to happen. Any recent salient event is a candidate to become the kernel of a causal narrative. Taleb suggests that we humans constantly fool ourselves by constructing flimsy accounts of the past and believing that they are true.

[Page 199]. Human beings strive to create and embrace simple stories that give simple causal accounts based upon general propensities and personality traits. The “halo effect” contributes to this coherence–we tend to assign a generalized valence to other humans, and to assume that those people always act in accordance with our generalized positive or negative characterization of them. In this world, handsome people are also smart, moral and athletic. The halo effect keeps our narratives simple and it leaves little room for true statements such as the following shocker: “Hitler loved dogs and little children.” Our simplistic stories don’t leave room for outlier qualities. We resist the fact that obtuse people are sometimes correct and that the people we admire sometimes act foolishly.

All of this is prelude to what Kahneman terms the “illusion of inevitability.” He uses the example of Google, a monstrously successful company. Many people insist that it was inevitable that Google would rise to prominence because that is what, indeed, happened, and from our present vantage point, we already know that it happened. Kahneman outlines many details from the history of Google, making an airtight case that Google’s rise to greatness depended heavily on luck and on the haplessness of Google’s many competitors (whom Kahneman characterizes as “blind, slow and altogether inadequate in dealing with the threat that eventually overwhelmed them”).

Kahneman insists that the ultimate test for an explanation is whether it would have made the event predictable in advance. He points out, however, that no story of Google’s “unlikely success” will meet that test, because no story can include the myriad of events that would have caused a different outcome. Kahneman repeatedly insists that humans overlook the importance of luck when telling good stories. “[T]here was a great deal of skill in the Google story, but luck played a more important role in the actual event than it does in the telling of it.”

What we see, once again, is the WYSIATI rule. “You cannot help dealing with the limited information you have as if it were all that there is to know. You build the best story from the information available to you, and if it is a good story, you believe it.” Paradoxically, it is easier to construct a coherent story when you know little, when there are fewer pieces to fit into the puzzle. Our comforting conviction that the world makes sense rests on a secure foundation: “our almost unlimited ability to ignore our ignorance.”

[Page 200]. Kahneman also refers to the 2008 financial crisis. Many people insist that the crisis was “inevitable.” Kahneman insists that the word “knew” should be removed from our vocabulary when discussing “major events.” Virtually every person who now says they “knew” there would be a crisis did not actually know this. “They now say they knew it because the crisis did in fact happen.… The people who thought there would be a crisis… could not conclusively show it at the time.… What is perverse about the use of know in this context is not that some individuals get credit for prescience that they do not deserve. It is that the language implies that the world is more knowable than it is. It helps perpetuate a pernicious illusion. The core of the illusion is that we believe we understand the past, which implies that the future also should be knowable, but in fact we understand the past less than we believe we do.”

[Page 201]. This problem is, indeed, pernicious because we tend to blame those in positions of power “for good decisions that work out badly and to give them too little credit for successful moves that appear obvious only after the fact. There is a clear outcome bias.” Another reason this hindsight bias is pernicious is that it drives those in positions of authority toward bureaucratic solutions and an “extreme reluctance to take risks.” It’s pernicious in one other way: those in positions of authority who take crazy gambles and get lucky are “never punished for having taken too much risk.” [Page 204].

Image by Darrenw at Dreamstime (with permission)

How knowable was the future back in the past? About as knowable as it is now. Go ahead and write down some predictions for the next ten years, if you dare. You’ll be wildly inaccurate if you have the courage to be detailed. So maybe you’ll be tempted to make vague predictions like the laughable “predictions” of astrologers. Perhaps you’d rather assume the role of one of those pompous stock analysts (the ones you see on TV) who wait to see how the market fared before pinning a simplistic explanation onto the situation (e.g., “Investors were nervous about the European situation,” even though the “European situation” is not raised as an excuse when the market goes up). Everything is so incredibly obvious in hindsight. We are all such excellent Monday Morning Quarterbacks.

Kahneman’s point about the perniciousness of the cognitive fallacy of “the Illusion of Inevitability” is critically important. If only we had a bit more humility about what we once knew, we would be much more likely to be self-critical about what we now know, and that would be a very good thing. Perhaps, then, we would be less likely to assume the role of history know-it-alls, especially the modern-day politicians who are so ready and willing to recklessly plunge us into wars and other dangerous circumstances based upon their fallacious claims that they had a firm grip on reality and the future back in the past.



Category: cognitive biases, Psychology Cognition

About the Author

Erich Vieth is an attorney focusing on consumer law litigation and appellate practice. He is also a working musician and a writer, having founded Dangerous Intersection in 2006. Erich lives in the Shaw Neighborhood of St. Louis, Missouri, half-time with his two extraordinary daughters.
