Infographic on Cognitive Biases That Are Distorting Our Political Conversations

Check out this cool infographic at the Visual Capitalist.  Here is what is at stake, according to the website:  "[T]hird parties can also take advantage of these biases to influence our thinking." These should be memorized and sent as reminders to all of your half-crazed friends on social media.


Kahneman’s Inevitable Heuristics and Powerful Optical Illusions

Today, I was thinking about Daniel Kahneman's Thinking, Fast and Slow. I think of this book at least several times each week. It's got to be one of the most important books I have ever read, in that it identifies numerous ways in which people are unwittingly misled by mental heuristics, i.e., by their intuitions and shortcuts. You might be a bigger threat to yourself than any outsider.

The seeming solution to Kahneman's heuristics is that we should be more careful, or that we should train ourselves so that we are not misled by them. Kahneman concludes, unfortunately, that heuristics are too strong to recognize in real time. He compares them to optical illusions: no matter how many times we look, we will be fooled again.

Here is one of my favorite optical illusions: The Ames Window. My mind is intransigent. I can't unsee this illusion even though I know exactly what is going on.

Many people (Sam Harris and Shane Parrish, for example, in interviews I've heard) have suggested to Kahneman that since we know about these systematic ways in which the mind runs off the rails, we can make adjustments. Kahneman will have none of it: "The question that is most often asked about cognitive illusions is whether they can be overcome. The message … is not encouraging." He writes further: "We would all like to have a warning bell that rings loudly whenever we are about to make a serious error, but no such bell is available."

Or consider that Kahneman and Richard Nisbett seem to disagree on whether we can train out our systematic mistakes, yet this article ("The Cognitive Biases Tricking Your Brain: Science suggests we’re hardwired to delude ourselves. Can we do anything about it?") suggests that we are more easily trainable with regard to "easy problems."

This is nonetheless discouraging, right? Maybe that's why it is a good idea to work with groups of people. Maybe someone else will catch your mistakes. And if not, maybe you can catch your mistakes in a post-mortem and add them to your checklist for the next time the situation arises.


I Cannot Read Your Mind. Or Your Face.

I'm not good at reading other people's minds, even when they think that I should have seen the emotions on their faces. Now there is science substantiating that I am not unusual in this regard.

Most of the time, other people can’t correctly guess what we’re thinking or feeling. Our emotions are not written all over our face all the time. The gap between our subjective experience and what other people pick up on is known as the illusion of transparency. It’s a fallacy that leads us to overestimate how easily we convey our emotions and thoughts.

The above excerpt is from an excellent blog, Farnam Street.
Therefore, if we happen to be together and you want to make sure that I understand what you are thinking, please use your words!


More on Political Opinions and Tribal Pressures

At The Atlantic, Jay Van Bavel discusses recent experiments showing that we are not permanently polarized with regard to our political positions. The article is titled, How Political Opinions Change.

In a recent experiment, we showed it is possible to trick people into changing their political views. In fact, we could get some people to adopt opinions that were directly opposite of their original ones. . . . A powerful shaping factor about our social and political worlds is how they are structured by group belonging and identities... We are also far more motivated to reason and argue to protect our own or our group’s views. Indeed, some researchers argue that our reasoning capabilities evolved to serve that very function.
People tend to adopt more extreme versions of their own viewpoint when challenged with information supporting the opposite view. The trick is to suggest to the person, through false feedback, that they actually held the opposite view. The take-away: "people have a pretty high degree of flexibility about their political views once you strip away the things that normally make them defensive."
