Many articles purporting to examine morality bore me. They tend to be laundry lists of personal preferences–the writer's catalog of things that personally annoy and delight him or her–completely unanchored by the scientific method or, for that matter, by any sort of disciplined thinking. Such articles have been around for a long time. Many of them were written prior to 1785, when Immanuel Kant wrote his Foundations of the Metaphysics of Morals, where he urged that we get serious about morality's underpinnings. Though Kant's categorical imperative leaves much to be desired as a full description of the phenomenon of morality, it should be noted that Kant did not have access to the modern findings of cognitive science.
At edge.com, Marc D. Hauser, a Harvard Professor of Psychology, Organismic & Evolutionary Biology and Biological Anthropology, has published an article entitled “It Seems Biology (Not Religion) Equals Morality.” Hauser’s article, based on many of his prior writings, is a rigorous, insightful and succinct account of the roots of human morality.
Hauser starts his article with an attack on the commonly heard claim that religion is a major source of our moral insights. There is not a drop of evidence suggesting this, as should be obvious. After all, morally deficient believers and morally enlightened nonbelievers are ubiquitous (and vice versa). Hauser does acknowledge that religions do endow their members with a sense of meaning and community. His sharp attack, however, is on the narrow claim that religions provide “the only– or perhaps even the ultimate– source of moral reasoning.” This raises an obvious question: If our sense of morality is not based on religion, on what is it based?
Hauser argues that science has demonstrated that each of us is endowed with a gift from nature: "a biological code for living a moral life." Our biologically endowed "cold calculus" takes the form of rules such as these: Actions are seen as worse than omissions; and forcing someone to do something for the greater good is worse if you make that person worse off in the process. Hauser describes this set of rules as a "moral grammar . . . an impartial, rational and unemotional capacity . . . an abstract set of rules for how to intuitively understand when helping another is obligatory and when harming another is forbidden."
This impartial grammar has been revealed through experiments in which people were presented with unfamiliar moral dilemmas (he avoided such well-worn topics as abortion and euthanasia). For instance, is it permissible for a hospital to involuntarily take various internal organs from a healthy person walking by the hospital in order to save the lives of five patients needing transplants? When these sorts of dilemmas are presented to people of wildly divergent cultural backgrounds, the surprising finding is that their particular backgrounds are virtually irrelevant to determining how they will resolve such dilemmas.
The work of Frans de Waal dovetails nicely with Hauser's writings. In particular, de Waal has argued that humans have evolved to be predominantly groupish and peace-loving beings who are well-tuned to look out for each other. Therefore, the question arises: what has gone wrong when we see moral atrocities? Hauser's answer is that these atrocities arise due to culturally constructed emotions that fuel "in-group favoritism, outgroup hatred and ultimately dehumanization." Essentially, we become just like psychopaths with regard to those we perceive to be in outgroups. Psychopaths are generally this way toward all others–they know the "rules" but they don't care. The rest of us are psychopaths toward everyone whom we characterize as belonging to our outgroup. We see these people in outgroups as "disposable." We allow children overseas to die, even when we have the money to prevent these deaths, and even when we would not allow the child of a sibling or a neighbor's child (whom we perceive to be in our ingroup) to suffer.
Here lies the answer to understanding the dangers of nurture, of education and partiality. When we fuel in-group biases by elevating and praising members of the group, we often unconsciously, and sometimes consciously, denigrate the "other" by feeding the most nefarious of all emotions, the dragon of disgust. We label "the other" (the members of the outgroup) with descriptions that mark them as subhuman–even as animals, often parasitic and vile, and thus disgusting. When disgust is recruited, those in the ingroup have only one way out: purge the other.
Hauser's work also dovetails well with the research of Jonathan Haidt, who has argued that disgust is one of the five pillars of morality. Haidt considers in-group/outgroup tension to be another of those five pillars (a separate pillar), whereas Hauser appears to be consolidating these two factors (people in outgroups disgust us). This consolidation seems plausible, at least on an intuitive and anecdotal basis. Xenophobia and disgust do seem to go hand in hand. Mistreatment of members of outgroups is not only allowed but is sometimes encouraged by those who preach universal love. Consider, for instance, the way that the members of many religions characterize gays–they are usually relegated to the outgroup. Hauser's argument also comports with the basic findings of those who have studied human reactions to ingroups and outgroups.
If left unexamined and unchecked, our evolved system of simplistically categorizing people into ingroups and outgroups leads to moral catastrophe. This simplistic and intuitive system evolved while we lived in small groups of highly familiar people (many of them family members), and during times when there were no formal laws that coordinated large numbers of widely diverse individuals.
According to Hauser, this genesis of the problem also presents a potential solution. Although all animals have evolved the capacity to distinguish between members of ingroups and outgroups, these features are not calibrated in the genome. They are "abstract and content free," much like our biologically endowed rules of moral grammar. We learn how to define our ingroup (and consequently, our outgroups). Even seemingly compelling distinctions among humans, such as "racial" differences, can be diminished or even eliminated by spending time with different-seeming others.
Moral education requires introducing all children, early in life, to a wide variety of religions, political systems, languages, social organizations and races. Research shows that those who have dated or married people of other "races" don't so readily assign those of other "races" to their outgroup. Exposure to diversity is perhaps our best option for reducing, if not eradicating, strong outgroup biases.
Hauser urges that we take our moral intuitions to task. We need to consciously push ourselves beyond our local family and community and train ourselves to "listen to the universal voice of [our] species." We need to become "champions of plurality."
At bottom, we need to recognize that diversity is not simply a buzzword. It is a critical part of the moral curriculum. We need to make ourselves spend time with different others, so that we are more likely to see one race, not many. We need to learn to see only fellow humans, rather than "our people" versus subhumans. Only when we have trained ourselves this way can our universal instinct toward empathy and our biologically endowed abstract moral grammar work together to pragmatically resolve differences peacefully. This would be a much better alternative to cracking heads and going to war based upon our ancient impulses toward unexamined, unenlightened and unjustified disgust.
About the Author: Erich Vieth is an attorney focusing on consumer law litigation and appellate practice. He is also a working musician and a writer, having founded Dangerous Intersection in 2006. Erich lives in the Shaw Neighborhood of St. Louis, Missouri, half-time with his two extraordinary daughters.