Category: cognitive biases

Dotted lines on paper

February 21, 2011 | 3 Replies

Wray Herbert writes in a Scientific American article titled “Border Bias and Our Perception of Risk” about a study by the husband-and-wife team of Arul and Himanshu Mishra at the University of Utah on how arbitrary political borders bias our perception of risk.

Asked to imagine a vacation home in either “North Mountain Resort, Washington, or in West Mountain Resort, Oregon,” the study group was given details about a hypothetical seismic event striking some distance from that vacation home, with the details differing as to where the event had occurred:

Some heard that the earthquake had hit Wells, Wash., 200 miles from both vacation home sites. Others heard that the earthquake had struck Wells, Ore., also 200 miles from both home locations. They were warned of continuing seismic activity, and they were also given maps showing the locations of both home sites and the earthquake, to help them make their choice of vacation homes.

The results revealed a bias: people perceived greater risk when the event occurred in-state than when it occurred out of state. A second study took a not-in-my-backyard look at a radioactive waste storage site; there, the Mishras used maps with thick solid lines and thin dotted lines for the state borders to help people visualize the distances. It isn’t hard to guess which lines conveyed a greater feeling of risk.

I recall a story my brother told me about 17 years ago in which he was helping an old friend change the oil in his farm tractor. My brother asked, “Hey, Jack, where do you want me to put this [the used oil]?” Jack said, “Pour it over there on the stone wall.” (We lived in Connecticut, where they grow those things everywhere). Brother Marshall said, “Jack, you can’t do that anymore.”

Jack thought a short second or two, and said, “Yeah, you’re right. Better pour it on the other side.”

Affirmative action for conservatives?

February 20, 2011 | 13 Replies

I have written several posts holding that we are all blinded by our sacred cows, not simply those of us who are religious. This blindness afflicts almost all of us, at least some of the time. Two of my more recent posts making this argument are titled “Mending Fences” and “Religion: It’s almost like falling in love.” In arriving at these conclusions, I’ve relied heavily upon the writings of other thinkers, including those of moral psychologist Jonathan Haidt. Several years ago, Haidt posited four principles summing up the state of the art in moral psychology:

1. Intuitive primacy (but not dictatorship)
2. Moral thinking is for social doing.
3. Morality is about more than harm and fairness.
4. Morality binds and blinds.

In a recent article at Edge.org, Haidt argued that this fourth principle has proven to be particularly helpful, and it can “reveal a rut we’ve gotten ourselves into and it will show us a way out.” You can read Haidt’s talk at the annual convention for the Society of Personality and Social Psychology, or listen to his reconstruction of that talk (including slides) here. This talk has been making waves lately, exemplified by John Tierney’s New York Times article.

Haidt begins his talk by recognizing that human animals are not simply social, but ultrasocial. How social are we? Imagine that someone offered you a brand-new laptop computer with the fastest commercially available processor, but that the computer was broken in such a way that it could never be connected to the Internet. In this day and age of connectivity, that computer would get very little use, if any. According to Haidt, human ultrasociality means that we “live together in very large groups of hundreds or thousands, with a massive division of labor and a willingness to sacrifice for the group.” Very few species are ultrasocial, and most of those that are achieve it through a breeding trick: all members of the group are first-degree relatives who concentrate their breeding efforts on a common queen. Human beings are the only animals that maintain ultrasociality without this breeding trick.

[Image by Jeremy Richards at Dreamstime.com (with permission)]

[More . . . ]

Bill Moyers: Facts threaten us.

February 15, 2011 | 2 Replies

According to Truthout, Bill Moyers recently gave a talk at History Makers and offered this disturbing information: well-documented facts often backfire:

As Joe Keohane reported last year in The Boston Globe, political scientists have begun to discover a human tendency “deeply discouraging to anyone with faith in the power of information.” He was reporting on research at the University of Michigan, which found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts were not curing misinformation. “Like an underpowered antibiotic, facts could actually make misinformation even stronger.” You can read the entire article online.

I won’t spoil it for you with a lengthy summary here. Suffice it to say that, while “most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence,” the research found that actually “we often base our opinions on our beliefs … and rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions.”

Lessons Learned?

November 7, 2010 | 2 Replies

What can be drawn from this recent election that speaks to America?

To listen to the bombast, this election is all about money. Who has it, where it comes from, what it’s to be spent on, when to cut it off. An angry electorate looking at massive job loss and all that that implies tossed out the previous majority in Congress over money. This is not difficult to understand. People are frightened that they will no longer be able to pay their bills, keep their homes, send their children to college. Basic stuff. Two years into the current regime and foreclosures are still high, unemployment still high, fear level still high, and the only bright spot concerns people who are seemingly so far removed from such worries as to be on another plane of existence. The stock market has been steadily recovering over the last two years. Which means the economy is growing.

Slowly. Economic forecasters talking on the radio go on and on about the speed of the recovery and what it means for jobs.

Out of the other end of the media machine, concerns over illegal immigration and outsourcing are two halves of the same worry. Jobs are going overseas, and those that are left are being filled by people who don’t even belong here. The government has done nothing about either—except in Arizona, where a law just short of a kind of fascism has been passed, and everyone else has been ganging up on that state, telling them how awful they are. And of course they seemingly offer nothing in place of a law that, for all its monumental flaws, still is something.

[More . . . ]

Alleged problems with small attorneys riding big elephants

October 1, 2010

I’ve previously written about Jonathan Haidt’s approach to human moral psychology. His approach is termed the “Social Intuitionist Model” of moral motivation and it suggests that

moral behaviors are typically the product of multiple levels of moral functioning, and are usually energized by the “hotter” levels of intuition, emotion, and behavioral virtue/vice. The “cooler” levels of values, reasoning, and willpower, while still important, are proposed to be secondary to the more affect-intensive processes.

Haidt has used the metaphor of an intellectually nimble lawyer riding on top of a huge emotion-permeated elephant to illustrate his counterintuitive approach, suggesting that the small, articulate lawyer on top often lacks meaningful control over the elephant. Moral judgments are usually dominated by emotions such as empathy and disgust (the strength of these emotions is represented by the sheer size of the elephant). In short, Haidt is quite sympathetic to David Hume’s suggestion that moral reasoning is essentially “the slave of the passions.”

In the March 25, 2010 edition of Nature (available here), Paul Bloom expressed concern that something important has been left out of Haidt’s model. In reaction, Haidt defended himself against Bloom’s attack (see below), indicating that Bloom (whose work Haidt admires, for the most part) has misconstrued Haidt’s Social Intuitionist Model. I believe that summarizing this exchange between Haidt and Bloom sharpens the focus on the meaning of Haidt’s Social Intuitionist Model.

[More . . . ]

Is religion honest?

September 17, 2010 | 37 Replies

Most religious adherents would be aghast at the suggestion that they, or their religion, were fundamentally and consistently dishonest. However, I believe that is indeed the case.

I read a comment on a recent blog post by Ed Brayton on honesty versus intellectual honesty.

Ed’s post drew a distinction between honesty and intellectual honesty, noting that intellectual honesty requires recognizing not only the arguments in support of a position, but also any evidence or arguments against that position.

One of the commenters (Sastra) then made the following case that faith was fundamentally intellectually dishonest:

[…] An intellectually dishonest person blurs the distinction [between being intellectually honest, and being emotionally honest], and seems to confuse fact claims with meaning or value claims. To a person who places emphasis on emotional honesty, strength of conviction is evidence. An attack on an idea, then, is an attack on the person who holds it. The idea is true because it’s emotionally fulfilling: intentions and sincerity matter the most. Therefore, you don’t question, search, or respect dissent. A person who is trying to change your mind, is trying to change you.

For example, I consider religious faith […] to be intellectually dishonest. It is, however, sincerely emotionally honest.

[…] “Faith is the substance of things hoped for; the evidence of what is not seen.” There’s a huge emotional component to it, so that one chooses to keep faith in X, the way one might remain loyal to a friend. You defend him with ingenuity and love, finding reasons to explain or excuse evidence against him. He cannot fail: you, however, can fail him, by allowing yourself to be led into doubt.

Being able to spin any result into support then is a sign of good will, loyalty, reliability, and the ability to stand fast. The focus isn’t on establishing what’s true, but on establishing that you can be “true.” This emotional honesty may or may not be rewarded: the real point, I think, is to value it for its own sake, as a fulfillment of a perceived duty.

This is exactly the case with religion, and religious adherents.

Their faith in their god is entirely emotional, and no amount of material evidence will alter their belief.

They may be entirely honest in their belief, and may be entirely honest in their objection to evidence (cf. Karl, Rabel, Walter, et al.), but in doing so they are being intellectually dishonest, because they refuse to recognize valid and entirely relevant evidence – they conflate fact claims with value claims with great consistency and verve, and deny any difference between the two, insisting that it’s all ‘interpretation’.

No, it isn’t all interpretation.

It’s dishonesty.

Climategate scientists vindicated

July 7, 2010 | 2 Replies

Another inquiry has determined that the “Climategate” scientists’ “rigor and honesty as scientists are not in doubt.” Not that this will slow down attacks on inconvenient science.

Perhaps the biggest lesson illustrated is that showing know-nothings that they are wrong has no effect on their opinions. For an equally good example, read about the “Lenski Affair,” in which scientists had conducted 20 years of rigorous experiments that clearly demonstrated the evolution of E. coli in the lab. Evidence just isn’t good enough for zealots.

Comprehensive list of cognitive biases

July 3, 2010 | 5 Replies

The next time someone mentions that humans are “rational,” you might want to refer them to Wikipedia’s list of dozens of cognitive biases. How handy to have all of these biases in one place. The list includes each of the following, each linked to a specific Wikipedia article.

Decision-making and behavioral biases

Bandwagon effect
Base rate fallacy
Bias blind spot
Choice-supportive bias
Confirmation bias
Congruence bias
Contrast effect
Denomination effect
Distinction bias
Endowment effect
Experimenter’s or Expectation bias
Extraordinarity bias
Focusing effect
Framing
Hyperbolic discounting
Illusion of control
Impact bias
Information bias
Interloper effect
Irrational escalation
Just-world phenomenon
Loss aversion
Mere exposure effect
Money illusion
Moral credential effect
Need for Closure
Negativity bias
Neglect of probability
Normalcy bias
Omission bias
Outcome bias
Planning fallacy
Pseudocertainty effect
Reactance
Restraint bias
Selective perception
Semmelweis reflex
Status quo bias
Von Restorff effect
Wishful thinking
Zero-risk bias

Biases in probability and belief

Ambiguity effect
Anchoring effect
Attentional bias
Authority bias

[More . . . ]
