What You See is All There Is: Not-much information usually seems like enough.

Back in 1996, I wrote a paper I called “Decision Making, the Failure of Principles, and the Seduction of Attention,” in which I claimed that most of our dramatic “moral lapses” are not the result of people intentionally trying to hurt others. Rather, most of the harm humans inflict on other humans results from the manner in which we deploy attention. We are able to make any moral issue vanish simply by not paying attention to it. Quite often we develop habits of not paying attention to certain aspects of the world—a classic habit for Americans is not considering that on planet Earth, a child starves to death every 5 seconds. If you have habituated yourself to not-think about this horrible and undeniable fact, it is quite easy to blow large sums of money on things like poodle haircuts, vacation homes, and even a steady stream of fancy meals.

Near the beginning of my paper, I argued that human animals are more than happy to act out of ignorance because it never actually seems that we are acting out of ignorance. Instead, humans readily assume that they have sufficient information for making important decisions even when a smidgen of self-critical conscious thought would instantly reveal that they are woefully under-informed.  When it comes to making decisions, we are fearless in our ignorance.

In the paper I mentioned above, I described various ways that cognitive science has demonstrated that human attention is severely limited. Thanks to cognitive science (but not thanks to common sense) we know that we can only take in about eighteen characters of text per saccade while we read, which allows computer-assisted experimenters to continually fill extra-foveal regions with garbage, in real time and unbeknownst to readers. See P. Churchland, V. Ramachandran & T. Sejnowski, “A Critique of Pure Vision,” pp. 37-38. Using shadowing experiments, Broadbent and Treisman demonstrated that one’s ability to absorb multiple simultaneous conversations is severely limited. Attention is bottlenecked at the site of working memory, as well as during perception. As George Miller pointed out long ago, “[T]he span of absolute judgment and the span of immediate memory impose severe limitations on the amount of information that we are able to receive, process and remember.” George A. Miller, “The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information,” The Psychological Review, Vol. 63, No. 2 (March 1956). Given that humans have such tiny attentional windows, it is surprising the extent to which we take it for granted that we share the same world. The world is laughably beyond our capacity to fathom without rampant simplification.

We perceive only a tiny portion of our environment, yet perception seems so very full. This illusion of fullness invites us to make wild cognitive leaps in blissful ignorance, and we repeatedly oblige.

[Image: eye with duck. Photo by Erich Vieth]

Despite the overwhelming world we inhabit, and despite our ability to attend to only narrow slices of it, we always seem to be dealing with a full perceptual platter. Our world does not seem at all gappy. We are willing to make important decisions based on whatever evidence (or lack thereof) is before us. The spotlight of attention fills in our huge knowledge gaps so incredibly smoothly that we don’t notice our own severe limitations. I think of this “fullness” of attention as an illusion of omniscience. We rarely get panicky, even though we perceive the world through such small windows. “Our representation of the visual world, Daniel Dennett speculates, is probably more gappy than our introspective access would ever lead us to suspect.” [“Ships in the Night: Churchland and Ramachandran on Dennett’s Theory of Consciousness,” in Perception, ed. K. Akins, p. 179.]

Given that none of us has a God’s-eye view, perceptual and memory gaps are not normally apparent.  At any given moment, it’s difficult to think of what might be “missing” from our perception of the world.  The answer, of course, is shitloads of stuff.

Even with the death of a close friend, one’s loss is quickly obscured by the “fullness” of one’s perceptions. Despite such a loss, an unbroken stream of thing-after-thing completely fills our perceptual field with people, places, things and ideas, flying in the face of the undeniable absence of the person who just died. It just seems that there should be some visual, auditory or tactile “gap,” some obvious and incessant break in sensory “fullness,” given that the deceased person no longer physically walks and talks on this planet. It would seem that we should be seeing something like a big puzzle missing one critical piece. After the death of a loved one, however, the rest of the world quickly crowds in like thick fluid.

We almost always seem to be looking at a full picture. While making decisions, then, we don’t often question the things we don’t see, and maybe this is for our own good. We would harass ourselves to the point of insanity were we to seek out each of our perceptual and memory gaps.

I thought about this illusion of “fullness” and our happy willingness to make decisions even when we are woefully under-informed as I read Daniel Kahneman’s description of a similar concept, What-you-see-is-all-there-is (“WYSIATI”), in his extraordinary book, Thinking, Fast and Slow. Kahneman starts his book by pointing out that human animals have two often-conflicting systems for making decisions. System 1 is quick, dirty and automatic. System 2 must be engaged consciously, and with some effort:

System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control. System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration … When we think of ourselves, we identify with System 2, the conscious, reasoning self that has beliefs, makes choices, and decides what to think about and what to do. Although System 2 believes itself to be where the action is, the automatic System 1 is the hero of the book. I describe System 1 as effortlessly originating impressions and feelings that are the main sources of the explicit beliefs and deliberate choices of System 2.… You will be invited to think of the two systems as agents with their individual abilities, limitations and functions.… The highly diverse operations of System 2 have one feature in common: they require attention and are disrupted when attention is drawn away.… System 2 has some ability to change the way System 1 works, by programming the normally automatic functions of attention and memory.

Pages 20-24.

WYSIATI is a function of “System 1,” and it encourages us to jump to conclusions based on whatever is in front of us.

Jumping to conclusions on the basis of limited evidence is so important to an understanding of intuitive thinking, and comes up so often in this book, that I will use a cumbersome abbreviation for it: WYSIATI, which stands for what you see is all there is. System 1 is radically insensitive to both the quality and the quantity of the information that gives rise to impressions and intuitions.

Page 86. We run and gun based on the evidence we do have, rarely questioning the things that we don’t yet know. In fact, the less we know, the better, because in the absence of detailed knowledge, we are better able to construct a story that supports our beliefs:

Participants [in a psychological study] who saw one-sided evidence were more confident of their judgments than those who saw both sides. This is just what you would expect if the confidence that people experience is determined by the coherence of the story they managed to construct from available information. It is the consistency of the information that matters for a good story, not its completeness.

(Page 87). WYSIATI occurs because “associative memory quickly and automatically constructs the best possible story from the information available.” We are not good at weighing the quality and quantity of evidence; rather, we tend to look for a coherent story. (Page 186). The combination of WYSIATI and associative coherence [we seek out facts that together tend to evoke an explanation in the form of a coherent story] tends to make us believe in the stories we spin for ourselves. (Pages 75, 154).

You build the best possible story from the information available to you, and if it is a good story, you believe it. Paradoxically, it is easier to construct a coherent story when you know little, when there are fewer pieces to fit into the puzzle. Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.

Page 201.

WYSIATI gives us the freedom to easily make friends, enemies and gods:

Our minds are always eager to identify agents, assigning them personality traits and specific intentions.

Page 77.

WYSIATI also gives rise to the “focusing illusion”: “Nothing In Life Is As Important As You Think It Is, While You Are Thinking About It.”

All we need to be confident that we are correct are a few facts and a story. For example, WYSIATI explains many cognitive biases, including overconfidence (“we often fail to allow for the possibility that evidence that should be critical to our judgment is missing–what we see is all there is”), framing effects (“different ways of presenting the same information often evoke different emotions”), and “base rate neglect” (our judgments regarding probability are often warped by vivid exemplars). (Page 87).

Gappy information is no problem at all. Unless a message is rejected as a lie, “it will have the same effect on the associative system regardless of its reliability. The gist of the message is the story, which is based on whatever information is available, even if the quantity of the information is slight and its quality is poor. WYSIATI.” (Page 127). To come full circle, Kahneman also concurs that, thanks to WYSIATI, we are fearless in our profound ignorance:

A mind that follows WYSIATI will achieve high confidence much too easily by ignoring what it does not know. It is therefore not surprising that many of us are prone to have high confidence in unfounded intuitions.

Page 239.

I’ve written about high confidence before, citing the work of Robert Burton, who indicates that a feeling of certitude fluidly substitutes for careful self-critical inquiry. Also relevant here is the Dunning-Kruger effect, on which I’ve commented repeatedly:

[T]he unskilled suffer from illusory superiority, mistakenly rating their ability much higher than average. This bias is attributed to a metacognitive inability of the unskilled to recognize their mistakes.

It appears that WYSIATI makes us all victims of the Dunning-Kruger effect. Perhaps this gives rise to the best argument for why “no man is an island”: we need each other to knock each other off-course, to remind us that the world is always bigger and more complex than any of our momentary views of it.


Erich Vieth

Erich Vieth is an attorney focusing on civil rights (including First Amendment), consumer law litigation and appellate practice. At this website he often writes about censorship, corporate news media corruption and cognitive science. He is also a working musician, artist and writer, having founded Dangerous Intersection in 2006. Erich lives in St. Louis, Missouri with his two daughters.
