Five Cognitive Biases That Prove Your Brain Hates Science

Senior Contributor
11.11.11

Earlier this week, we reported on idiot parents deciding they know better than doctors, and giving their kids chicken-pox-infected lollipops instead of a vaccination. Not unreasonably, a lot of us are baffled as to how exactly these people all hopped in the Douche Canoe and paddled it right up Kid-Killing Creek. How can they not pay attention to science? Are they stupid?

This is a question that gets asked a lot. Richard Muller, a prominent climate skeptic, recently released a report that was essentially him eating a lot of crow and stating that he was wrong, climate change was actually happening, and that it turns out that just because he’s a physicist doesn’t mean he understands climatology. But we still have climate change skeptics, and this didn’t slow them down. We still have people screaming that evolution is “just a theory”.

What the heck is going on?

The short answer: your brain doesn’t like being wrong. At all. And it will go to incredible lengths to convince itself it isn’t, even when it is. If science is based on the objective observation of facts, here are five cognitive biases working in our brains, all the time, to keep us from doing exactly that.

#5) Blindspot Bias

Ever looked at a political issue and thought, “Oh, man, the solution to this is so painfully simple. If only these people would stop yelling at each other. It’s too bad they don’t have the free, clear, intelligent view of things that I do”?

Sure you have. We all have. One problem: it's completely full of crap. You're as biased as everybody else; your brain just won't admit it.

We’re all inherently biased to, weirdly enough, see ourselves as far more objective than the clowns that surround us. Every single one of us has a cognitive bias wherein our brain is convinced that we don’t have nearly the number of biases that other people do, so we can observe the situation more clearly and objectively.

In other words, if somebody comes to you with a view contrary to what you believe, you dismiss them because clearly, they must be biased. Of course scientists believe in science! It’s their job! They’re blinded by science! I know better than those biased scientists! Which is why these parents gave their kids a possibly fatal disease.

#4) Reactance

We’ve all heard cheesmo statements like “Forbidden fruit is the sweetest”, but what we may not realize is that psychologists are pretty sure that’s an actual scientific truth. Except they call it “reactance”, because they’re boring.

Reactance is defined as "the urge to do the opposite of what someone asks you to do out of a need to resist perceived constraints on your behavioral freedom"; in other words, if somebody asks you not to do something, your brain really wants you to do it, because it's fun to be a dick. Seriously: some psychologists think that's part of the actual explanation.

And it overwhelms common sense pretty handily: reactance tests usually involve showing messages to subjects asking them to do things like floss or not swill booze like a fish, and, without fail, they stop flossing and chug Night Train like it’s water because the subject is his own person, dammit, and nobody’s gonna tell him what to do!

Except some part of his brain, apparently. So, to bring it back to our Parents of the Year, part of the reason they did it was because everybody was telling them not to. That was exactly the cue they needed to “rebel” and…kill kids. Um, wait.

#3) Anchoring

To give you an idea of what a problem “anchoring” is, it’s something financial types warn each other against. It’s pretty simple: when making decisions, we tend to rely way too much on one piece of information. Once you form that anchor, it actually becomes more like a black hole, sucking every piece of information towards it and twisting it into a shape that’s more pleasing and lets your brain think it can’t be wrong.

To use the example of our douche parents, the information they anchored on is "some studies say vaccinations may cause autism." Once that anchor's locked in, forget it. Nothing they hear afterward carries as much weight. Which actually triggers another bias:

#2) Confirmation bias

You’ve probably heard of this one, and it’s pretty simple: you search for information that supports your opinion, and you interpret information that doesn’t in such a way that you think you’re still right. In other words, your brain sits and sulks in the corner whenever somebody disagrees with it.

This is so pervasive that scientists themselves are deeply scared of it: confirmation bias is why every reputable scientific paper is subject to peer review, and why scientists document everything. But most people don't have to engage in that level of rigor. They don't have to show their friends proof that vaccines put their kids at risk. After all, they're their kids.

But the confirmation bias can’t be that strong, right? The studies were disproven, all of them. Everything about the anti-vaccine movement was shown to be wrong. At that point the brain has to accept that just maybe it screwed up, right?

Wrong.

#1) The Backfire Effect

It’s really basic: if you’re presented with clear logical evidence that disproves your beliefs, you don’t change your beliefs. Instead they get stronger.

If you’ve ever wondered how people with college degrees and no small experience with critical thinking can dismiss well-reported and documented scientific evidence right out of hand, there you have it. Not only does proving them wrong not make them change their mind, it does the opposite.

None of these are all-powerful, of course; people change their minds all the time. It's just not a process done with logic and careful evaluation of the facts. Usually there has to be some sort of emotional appeal. So, yes, if we'd sent them a fuzzy puppy wearing a sign saying "Pwease don't give your childwen a tewwible disease", that probably would have worked.
