Excuse me . . . my mortality is showing: meditations on life and death

Have you ever wondered why so many Americans wear clothing when it's warm outside? Are they really covering up for sexual propriety--because of shame? Or could it be that they are wearing clothes to cover up their animal-ness--their mortality? I'm intrigued by this issue, as you can tell from my previous writings, including my posts about "terror management theory" and nipples. This issue came to mind again recently when I found a website that allows you to completely undress people. The site has nothing to do with sex, I can assure you, but it has a powerful set of images that raise interesting questions about human nakedness.

To get the full experience, go to the website and select an image of a fully clothed person. These are absolutely ordinary-looking people, as you will see. Then click on the image of any of these men or women and watch their clothes disappear. If you are like me, the disappearing clothing will not cause you to think any sexual thoughts. Instead, you will find yourself thinking that these people looked more "attractive" with their clothes on. For me, the effect is dramatic and immediate, and it reminded me of a comment by Sigmund Freud (I wasn't able to dig out the quote), something to the effect that we are constantly and intensely attracted to the idea of sex (duh!), but that the sex organs themselves often look rather strange to our eyes--sex organs are not necessarily sexy.

I think the same thing can be said for our entire bodies. Nakedness isn't the same thing as sexuality, or else nudist colonies would tend to be orgies (which, from what I've read, they are not). Rather, sexual feelings are triggered by the way we use our bodies. We do many things that are sexual, and most of these things take some effort. Simply being naked is not an effective way to be sexy. In America, however, people constantly confound nudity with sexuality.
I admit that the media presents us with many ravishing images of sexy naked people, but the sexiness of such images is not due to mere nakedness. There's always a lot more going on than mere nakedness. Consider also that when people actually mate, they often bring the lights down low, further hiding their bodies. Why, then, do Westerners cover up with clothing to be "proper"? I suspect that anxiety about death (not so much anxiety about sex) contributes to our widespread practice of hiding those naturally furry parts of our bodies--those parts associated with critically "animal" functions relating to reproduction and the excretion of body wastes.


Depression as an adaptation

According to the latest edition of Scientific American Mind, new research suggests that depression is not necessarily a disease or aberration. In many cases, being depressed might increase your chances of survival.

[D]epression should not be thought of as a disorder at all. In an article recently published in Psychological Review, we argue that depression is in fact an adaptation, a state of mind which brings real costs, but also brings real benefits.
The researchers go out of their way to acknowledge that depression is a terrible problem for many people who should seek out help. Nonetheless, they also suggest that the mode of thinking characteristic of many bouts of depression is focused, highly analytical and systematic:

Analysis requires a lot of uninterrupted thought, and depression coordinates many changes in the body to help people analyze their problems without getting distracted . . . [D]epression is nature’s way of telling you that you’ve got complex social problems that the mind is intent on solving. Therapies should try to encourage depressive rumination rather than try to stop it, and they should focus on trying to help people solve the problems that trigger their bouts of depression . . . [D]epression . . . seems . . . like the vertebrate eye—an intricate, highly organized piece of machinery that performs a specific function.

For further reference: I first ran into this idea that many bouts of depression might be useful in a book called Why We Get Sick, by Randolph Nesse.


Creative denial of mortality as an evolutionary adaptation?

The August 6, 2009 edition of Nature (available online only to subscribers) includes a fascinating letter by Ajit Varki, a Professor of Medicine and Cellular & Molecular Medicine at the University of California San Diego, La Jolla. Varki begins his letter by recognizing some of the unique features of human animals, such as theory of mind, "which enables inter-subjectivity." These impressive human cognitive abilities might have been positively selected by evolution "because of their benefits to interpersonal communication, cooperative breeding, language and other critical human activities." Varki then describes his conversations with a geneticist named Danny Brower (now deceased), who was fascinated with the question of why theory of mind emerged only recently, despite millions of years of apparent opportunity. Brower offered Varki a tantalizing explanation for this delay:

[Brower] explained that with the full self-awareness and inter-subjectivity would also come awareness of death and mortality. Thus, far from being useful, the resulting overwhelming fear would be a dead end evolutionary barrier, curbing activities and cognitive functions necessary for survival and reproductive fitness. . . . in his view, the only way these properties could become positively selected was if they emerged simultaneously with neural mechanisms for denying mortality.
In other words, self-awareness is a double-edged sword that tends to kill off (through terror-induced paralysis) those who become too readily self-aware. Therefore, self-awareness evolved together with denial of death--Brower was suggesting that those who became too clearly self-aware would become incapacitated by something of which chimpanzees, dolphins and elephants remain blissfully ignorant: the fact that they will inevitably die. Varki suggests that Brower's idea would not only add to ongoing discussions of the origins of human uniqueness, but it could shed light on many puzzling aspects of human psychology and culture:
[I]t could also steer discussions of other uniquely human "universals," such as the ability to hold false beliefs, existential context, theories of afterlife, religiosity, severity of grieving, importance of death rituals, risk-taking behavior, panic attacks, suicide and martyrdom.
Perhaps we are simply incapable of viewing life "objectively," in that evolution has rigged us up with equipment that protects us by deluding us. It seems, then, that the co-evolution of delusion and awareness (if this is the case) dovetails quite well with Terror Management Theory (TMT), which I summarized in a post entitled "We are gods with anuses: another look at 'terror management theory'":

The problem is that the evolution of our powerful ability to be conscious made us aware that we are mortal beings and that all of us are heading toward inevitable death. The “solution” is also offered by our highly developed cognitive abilities: we have developed the ability to wall off our cognitively toxic fear of death by “objectifying” our existences and living idealized lives free from fear of death.

Brower and Varki thus suggest that the ability of humans to be extraordinarily aware and curious is too dangerous to be dispensed by evolution in its pure form. Too much knowledge might be too dangerous. To safely allow the continuation of the species, human awareness might need to be deluded and distorted in ways that account for some of the most baffling "cultural" aspects of what it means to be human. This approach sounds promising to me, though it also raises many other questions, such as this one: Why are some of us apparently immune from these delusions? Why are some of us much more able to disbelieve claims of gods and afterlives?


Inferred justification: We invaded Iraq, therefore Saddam Hussein caused 9/11

According to Sharon Begley's Newsweek article, "Lies of Mass Destruction," people are susceptible to upside-down reasoning. She cites a large team of researchers who studied people who believe the lie that Saddam Hussein caused 9/11. The researchers concluded that these people believed the lie because the U.S. invaded Iraq. They refer to this upside-down process as "inferred justification." Begley sums it up:

Inferred justification is a sort of backward chain of reasoning. You start with something you believe strongly (the invasion of Iraq was the right move) and work backward to find support for it (Saddam was behind 9/11). "For these voters," says Hoffman, "the sheer fact that we were engaged in war led to a post-hoc search for a justification for that war."

The researchers published their findings in a paper entitled "'There Must Be a Reason': Osama, Saddam, and Inferred Justification." Here's an excerpt from Sociological Inquiry.

The primary causal agent for misperception is not the presence or absence of correct information . . . Our explanation draws on a psychological model of information processing that scholars have labeled motivated reasoning. This model envisions respondents as processing and responding to information defensively, accepting and seeking out confirming information, while ignoring, discrediting the source of, or arguing against the substance of contrary information. Motivated reasoning is a descendant of the social psychological theory of cognitive dissonance, which posits an unconscious impulse to relieve cognitive tension when a respondent is presented with information that contradicts preexisting beliefs or preferences. Recent literature on motivated reasoning builds on cognitive dissonance theory to explain how citizens relieve cognitive dissonance: they avoid inconsistency, ignore challenging information altogether, discredit the information source, or argue substantively against the challenge. The process of substantive counterarguing is especially consequential, as the cognitive exercise of generating counterarguments often has the ironic effect of solidifying and strengthening the original opinion, leading to entrenched, polarized attitudes. This confirmation bias means that people value evidence that confirms their previously held beliefs more highly than evidence that contradicts them, regardless of the source.

In her article, Begley suggests that the current health care debate stems from the same cognitive vulnerabilities.

There are legitimate, fact-based reasons to oppose health-care reform. But some of the loudest opposition is the result of confirmatory bias, cognitive dissonance, and other examples of mental processes that have gone off the rails.


Glia: That other tiny engine of thought

Quick! Name a small and numerous component in the brain that allows us to think. If you said "neuron," you would be only partially correct. According to Carl Zimmer's blog at Discover, "The Loom," evidence is accumulating that thinking is also accomplished by astrocytes:

[Astrocytes]—named for their starlike rays, which reach out in all directions—are the most abundant of all glial cells and therefore the most abundant of all the cells in the brain. They are also the most mysterious. A single astrocyte can wrap its rays around more than a million synapses. Astrocytes also fuse to each other, building channels through which molecules can shuttle from cell to cell.
To put glia into a broader perspective, consider Zimmer's introduction to his post on glia:

I’ve asked around for a good estimate of how many neurons are in the human brain. Ten billion–100 billion–something like that, is the typical answer I get. But there are actually a trillion other cells in the brain. They’re known as glia, which is Latin for glue–which gives you an idea of how little scientists have thought of them.

It has now been shown that astrocytes can sense incoming signals, respond with calcium waves, and produce outputs:

In other words, they have at least some of the requirements for processing information the way neurons do. Alfonso Araque, a neuroscientist at the Cajal Institute in Spain . . . [found] that two different stimulus signals can produce two different patterns of calcium waves (that is, two different responses) in an astrocyte. When they gave astrocytes both signals at once, the waves they produced in the cells [were] not just the sum of the two patterns. Instead, the astrocytes produced an entirely new pattern in response. That’s what neurons—and computers, for that matter—do. If astrocytes really do process information, that would be a major addition to the brain’s computing power . . . neuroscientist Andrew Koob suggests that conversations among astrocytes may be responsible for “our creative and imaginative existence as human beings.”
