Here are some related and powerful insights from Robin Hanson over the past few years in which he identifies a bias that motivates much more of our thinking than we’d care to admit. The general theme is that we are biased in our beliefs because they are more than just beliefs to us:
In my experience “I believe X” suggests that the speaker has chosen to affiliate with X, feeling loyal to it and making it part of his or her identity. The speaker is unlikely to offer much evidence for X, or to respond to criticism of X, and such criticism will likely be seen as a personal attack.
Feel the warm comfort inside you when you say “I believe” – recognize it and be ready to identify it in the future, even without those words. And then – flag that feeling as a dangerous bias. The “I believe” state of mind is quite far from being neutrally ready to adjust its opinions in the light of further evidence. Far better to instead say “I feel,” which directly warns listeners of the speaker’s attachment to an opinion.
We feel a deep pleasure from realizing that we believe something in common with our friends, and different from most people. We feel an even deeper pleasure letting everyone know of this fact. This feeling is EVIL. Learn to see it in yourself, and then learn to be horrified by how thoroughly it can poison your mind. Yes evidence may at times force you to disagree with a majority, and your friends may have correlated exposure to that evidence, but take no pleasure when you and your associates disagree with others; that is the road to rationality ruin.
There is another old post of Robin’s I think about often but cannot seem to find that offers a way to check yourself against biases like these. Since I can’t find it, I’ll try to summarize it.
Think about beliefs that you hold and imagine yourself changing your mind. Literally imagine waking up tomorrow with a changed mind and imagine how you would or wouldn’t discuss changing your mind with people you know. Feelings will be strong for beliefs that are important to our identities or that we value for some signaling purpose, like signaling affiliation with some group. Can you actually imagine yourself with these changed beliefs, or is it unthinkable?
In his post Robin argued that people often convince themselves that they truly reconsider their strongly held beliefs, but what they actually do is false reconsideration, whose real purpose is to reassure themselves and strengthen the belief. Before, it was just a strong belief; after false reconsideration, it’s a strong belief that they’ve really, definitely, seriously reconsidered. But if you can’t imagine yourself going through the day holding a competing set of beliefs, then you never actually reconsidered it.
To provide a concrete example, I think many religious people tell themselves that they truly reconsider some of their deeply held religious beliefs. But can they imagine waking up tomorrow a non-believer and telling their significant others, parents, friends, children, and people in their church that they are now non-believers? If not, can they at the very least picture lying to these people about their beliefs for the rest of their lives? If you can’t seriously tell yourself you could do one of these, you should be skeptical that you’ve ever really reconsidered your beliefs.
Conservatives, could you imagine becoming someone who believes that higher taxes and unemployment insurance don’t hurt economic growth or employment? Liberals, can you imagine becoming someone who believes that minimum wages decrease employment and fiscal stimulus doesn’t work? If the answer is no, you should think about whether it’s because holding such a belief would conflict with your identity or affiliations.