Tools for Changing the World

Social psychology for social good

Sweet reason isn’t always sweet enough

December 6th, 2010


You can probably think of one or two issues on which you have very firm opinions – issues that you feel you’ve considered thoroughly and which have very obvious right and wrong sides. (Possibly more than one or two: I had to stop counting when I ran out of fingers and toes.)

But if someone provided evidence that you were wrong, you’d change your mind… wouldn’t you?

Don’t count on it.

“Confirmation bias” is a psychological term for our tendency to look for or interpret information in a way that will support our existing beliefs or expectations. We all do it, and we’re all aware that people in general do it, but it’s easy to underestimate its effects.

One of the early studies demonstrating confirmation bias was published in 1979 by Charles Lord, Lee Ross and Mark Lepper. They showed study participants the results of two fictitious studies, one supporting and one challenging the deterrent effect of the death penalty, and asked them to judge how well each had been conducted and how convincing it was. They also asked participants about their own beliefs both before and after reading about the research.

As you might guess, those participants who already favoured the death penalty found the pro-deterrence study much more convincing, and believed it was much better conducted, than the anti-deterrence study. The reverse was true for those who already opposed the death penalty. Considering that all the participants read about exactly the same research, this is a little unnerving. What’s worse is that after reading both the supportive and the opposing research, participants on both sides said they were more confident that their original beliefs were correct.

Other studies have shown that people trying to discover whether a belief is true almost always look for evidence that supports it rather than evidence that contradicts it (precisely the opposite of the scientific method, which seeks to disprove hypotheses). We also tend to recall information that supports our beliefs better than information that challenges them. We see (and remember) what we’re looking for.

This is a disconcerting view of our own brains, but it’s particularly unsettling for those of us trying to change people’s beliefs through education. It suggests that simply providing credible evidence is not sufficient to change people’s minds – and may actively backfire if those minds are already made up. Fortunately, there are other techniques available to change minds (as this blog frequently points out). Unfortunately, there aren’t always alternatives to providing facts.

If you can think of a way to present information that might circumvent our tendency toward confirmation bias, please let me know in the comments. There’s too much that needs to be said to risk saying it badly.
