How to Ensure You Are Always Right

Welcome to Part 2 of my mini-series on better reasoning (Lesson 1). In today’s lesson we’re going to learn how to prove that you are right every single time–even when you’re not! You will learn about two important concepts: confirmation bias and counterexampling.

Confirmation bias is our brain’s tendency to see only evidence that confirms our beliefs and to diminish or ignore inconvenient evidence. We have a belief or hypothesis, and anytime we encounter confirming evidence we whisper softly to ourselves, “See! I knew it!” But anytime something doesn’t fit, we ignore it rather than adjust our hypothesis. Our brain is a confirmation-seeking machine, not a truth-seeking machine. This is true of everyone. It’s how human brains work.

Let’s look at a current example. There is a conspiracy theory that 5G cellphone towers either cause the symptoms of coronavirus or weaken our immune system so that we become vulnerable to it when we otherwise wouldn’t be.

This conspiracy theory, like conspiracy theories generally, is a classic case of confirmation bias. Suppose you believe it. People in Wuhan got coronavirus and guess what? They also have 5G cellphone towers. Boom! Conspiracy confirmed.

Know what other city got coronavirus? London. Know what kind of cellphone towers London has?

Five

Freakin’

Geeeeeeeeee.

BOOM! Conspiracy confirmed.

Pick any three cities with 5G cellphone towers and I’ll bet they have the Rona. BOOM! BOOM! BOOM! We’re on to you, Bill Gates!

By now the reasoning error should be obvious. The conspiracy theorist can’t explain why people are getting coronavirus where there are no 5G cellphone towers. If it’s true that 5G somehow causes coronavirus symptoms or susceptibility, then how do we explain why this also occurs where there is no 5G?

The hole in the conspiracy theorist’s case occurs because of the way the human brain works. Our default is to look to confirm our beliefs. Confirming any hypothesis–no matter how wrong–is really easy. All you need to do is focus exclusively on evidence that confirms what you want to be true and ignore evidence that doesn’t. You can always be right–even when you’re not–because you can always find evidence to support whatever you want. This is why there are flat-earthers. The earth looks flat. Hypothesis confirmed! …And ignore disconfirming evidence.

Correcting Confirmation Bias: Part 1

Because seeking confirmation is our default way of thinking, we need some method to correct what our brain does automatically. We need to do the opposite of what our brain wants. We need to look for evidence that disconfirms our hypothesis about the world. This method of thinking is called counterexampling. And it’s extremely difficult to remind ourselves to do it because it feels so good to have our existing beliefs confirmed. And nobody likes to be wrong.

Let’s use a really simple example to illustrate. Suppose someone claims that all philosophers have beards. They show you a picture of Aristotle. Boom! Hypothesis confirmed. They show you a picture of Plato. Boom! Daniel Dennett. Boom! I could do this all day. Hypothesis confirmed!

If you want to disprove their hypothesis you need to come up with an example that disconfirms it. This is called a counterexample. Counterexamples are powerful because one good counterexample can overturn an entire hypothesis! So, to disprove “all philosophers have beards” we just need to find a philosopher without a beard. We won’t go through the list, but this is not a difficult task.
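If it helps to see the logic of counterexampling mechanically, here’s a minimal sketch in Python (the data set and function names are mine, purely for illustration). Checking cases one by one can confirm a universal claim forever without ever proving it, but a single disconfirming case refutes it outright:

```python
# A minimal sketch of counterexampling, using made-up illustrative data.
# Confirmation can pile up forever, but one disconfirming case is
# enough to refute a universal claim like "all philosophers have beards."

philosophers = [
    {"name": "Aristotle", "beard": True},
    {"name": "Plato", "beard": True},
    {"name": "Daniel Dennett", "beard": True},
    {"name": "Philippa Foot", "beard": False},  # the counterexample
]

def find_counterexample(cases, hypothesis):
    """Return the first case that disconfirms the hypothesis, or None."""
    for case in cases:
        if not hypothesis(case):
            return case
    return None

cx = find_counterexample(philosophers, lambda p: p["beard"])
if cx:
    print(f"Hypothesis refuted: {cx['name']} has no beard.")
else:
    print("No counterexample found -- which still doesn't prove the claim.")
```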

Let’s apply this to the 5G conspiracy. They claim that if there were no 5G towers, people would not be getting sick. A counterexample would be a place on earth where people are sick with the Rona but there are no 5G towers. There are lots of places that fit this description. In fact, it describes most of the world.

Correcting Confirmation Bias: Part 2

It’s one thing to know a technique. It’s another to remember to apply it. How are we going to remember to engage in counterexampling instead of falling into our stupid default?

The answer begins with a puzzle:

Confirmation bias is our natural way of thinking–for everyone. It is the product of tens of thousands of years of human evolution. But it also frequently leads us to assent to beliefs that are false. If evolution selects for features (physical and cognitive) that best lead to survival, why would evolution select confirmation bias as our natural cognitive setting? Shouldn’t evolution have selected for a disconfirmation bias? After all, that’s a much better way to discover truth.

This puzzled evolutionary psychologists for a long time. Several theories were proposed, but they all had problems. Perhaps you’ve heard the most popular–that it’s better to falsely assume there’s a tiger behind the bushes and run than to stick around to disconfirm your belief. This explanation runs into some problems. It’s vulnerable to counterexamples!

The most promising new solution to the puzzle comes out of dual inheritance theory. The general idea is that, for social animals, there’s a tension between group-level selection pressure and individual-level selection pressure. Humans cannot survive on their own. We are primarily social animals. Our ancestors’ chances of survival were directly related to the cohesion of the group to which they belonged. Groups that were always disagreeing and bickering did not survive the harsh evolutionary conditions of our ancestors because they could not effectively coordinate their actions.

At the individual level, confirmation bias is a disadvantage. But at the group level it’s an advantage. Our ancestors’ survival depended more heavily on group cohesion than on particular individuals discovering the truth. And so group-level selection pressure led to confirmation bias even though it’s disadvantageous individually.

Think of it this way. Confirmation bias makes us more readily accept our group members’ proposals. We are less likely to come up with reasons to disagree. Think of your experiences in groups where people are constantly saying, “well, actshchually…” A group full of people like this would never survive in a dangerous environment that requires strong group cohesion and cooperation, let alone a group project for an undergrad class. And they didn’t. That’s why we’re stuck with confirmation bias as our default setting.

So, assuming this is the correct explanation, we can draw two lessons to remind ourselves to employ counterexampling. The first is that, counter-intuitively, we should be most skeptical of news stories that fit too perfectly with our own social and political group’s beliefs. These are the situations when we are most vulnerable to confirmation bias because preserving group cohesion is why we think this way in the first place!

The second is to find some “well, actshchually” friends. Find people who belong to groups you usually disagree with. I will tell you firsthand that this isn’t always pleasant, and it can be fucking annoying as all hell. There will always be someone telling us we’re wrong, or at least contesting our claims. It feels waaaaay better to be right all the time–especially if it’s something that supports our team. But if we are genuinely concerned with truth, and not just with feeling like we’re right, it’s critical to build and maintain relationships with people who disagree with us.

The temptation to fall back into our evolutionary defaults is too strong to resist on our own, and our own group members are no help here. I know from personal experience. Even though this is my area of study, I still occasionally find myself falling for articles or making comments that are poorly supported or partly false. It’s just how we’re built. The more an article confirms my group’s worldview, the less likely I am to critically appraise it. Especially when I’m all riled up by the latest thing the other team purportedly did.

Conclusion

In Lesson 1, I said that I don’t like the term critical thinking because it has lost its original meaning. Most people think a good critical thinker is whoever holds the same beliefs as they do. But becoming a good thinker has less to do with what you believe and much more to do with the mental process that led you to your beliefs. Good reasoning is a systematic method of evaluating claims, arguments, and evidence.

So here’s the first step in thinking systematically: anytime you want to evaluate a belief or hypothesis, do the opposite of what your brain naturally wants to do. Rather than look for confirming evidence, look for disconfirming evidence. That is, look for counterexamples. Especially if the claims are congenial to your group’s worldview. Oh, and find yourself some intellectual pests. They will annoy you, but they will keep you honest. And occasionally, you just might discover that you were–gasp!–wrong.
