How Your Brain Keeps You Believing Crap That Isn’t True

In response to a question about the lack of evidence linking the government of Iraq to the supply of weapons of mass destruction to terrorist groups, former Defense Secretary Donald Rumsfeld famously said:

There are known knowns. There are things we know that we know. There are known unknowns. That is to say, there are things that we know we don’t know. But there are also unknown unknowns. There are things we do not know we don’t know.

Something similar can be said about our beliefs. There are “true truths”—things we believe are true and genuinely are: The world is (roughly) round, not flat. Losing weight requires that we exercise more and eat fewer calories. Smoking cigarettes is bad for your health. There are also true lies—things that we believe to be false and actually are. Santa Claus, the Easter Bunny, and perpetual motion machines all fall into this category. So far, so good.

But many other times, we’re tricked by false truths—things we think are true but aren’t. Drinking eight glasses of water a day seems like a good idea, but there’s no evidence that hitting that particular number does anything for your health. Many people believe that Napoleon was short, but he was probably a bit taller than the average Frenchman of his day. Reducing salt intake has never been conclusively shown to prevent heart attacks or strokes, and no controlled study has ever demonstrated an allergy to MSG.

How do these false truths come to be so widely believed? The answer lies in a powerful shortcut that our brains use every day: Information that’s easier to process is viewed positively in almost every way. Cognitive scientists refer to this ease as “processing fluency,” and it’s why your knowledge base probably contains more flawed ideas than you’d like to believe.

The Mental Shortcut You’re Constantly Making

The effect of processing fluency on how we see the world is very robust—possibly alarmingly so. The greater something’s “fluency,” the more we tend to like it, the less risky we judge it, the more popular and prevalent we believe it is, and the easier we think it is to do. Meals whose recipes are written in hard-to-read fonts are judged as more difficult to make. Money with which we’re unfamiliar is perceived to be less valuable. Stocks of companies with easy-to-pronounce names outperform those with harder-to-pronounce names on the day the companies go public.

Nor is this limited to different types of information; how the same piece of information is presented or stated matters too. We’re more likely to believe statements that are themselves easy to process. And one of the easiest ways to increase the fluency of a statement is to repeat it. Controlled experiments have shown that people are more likely to believe things to which they’ve been exposed repeatedly. What’s more, the simple act of recalling a “fact” increases its fluency and therefore makes it more believable.

In other words, what counts as common knowledge is a mix of things that are true and other things that are false, all of which are believed because they’re widely held, frequently repeated, and routinely recalled. It’s this fluency-as-a-surrogate-for-truth shortcut that makes innovation tricky: We trust in assumptions about the way the world operates that seem so obviously true that we fail to test them. And in failing to check these basic assumptions, we slam the door shut on finding new and better ways to do things.

For example, when I worked at a large health care company, we observed that the vast majority of patients preferred to get their regular medications from local pharmacies rather than through the mail. Common sense told us that patients were voting with their prescriptions, choosing retail pharmacies over mail order, and that despite being able to save money by switching, those savings weren’t a big enough enticement to get them to change.

As in many other cases, though, common sense was wrong. It turned out that between 35% and 50% of those patients preferred mail order to retail. They simply hadn’t gotten around to making the change. What we thought was an intentional choice was just behavioral inertia.

Testing the Ideas We Don’t Know Are Bad

So how can you bypass your brain’s natural processing-fluency shortcut and make sure you aren’t clinging to false assumptions?

The best way is to build explicit experimentation into how you operate. For example, suppose you have a formal process for ranking candidates that you’re considering hiring. You might periodically, and at random, hire the candidate ranked second or third. This approach allows you to test whether your ranking algorithm is actually working; without it, you’ll never really know.
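To make the idea concrete, here is a minimal sketch of what that kind of built-in experiment might look like. Everything in it is an assumption invented for illustration: the choose_hire function, the 10% exploration rate, and the idea that your ranking process can be expressed as a score function.

import random

EXPLORATION_RATE = 0.10  # hypothetical: deviate from the ranking on ~10% of hires

def choose_hire(candidates, score):
    """Usually hire the top-ranked candidate, but occasionally, at random,
    hire the second- or third-ranked one so the ranking itself can later
    be checked against real on-the-job outcomes."""
    ranked = sorted(candidates, key=score, reverse=True)
    if len(ranked) >= 3 and random.random() < EXPLORATION_RATE:
        # Exploration: pick the second- or third-ranked candidate, and
        # label the hire so its outcome can be compared later.
        return random.choice(ranked[1:3]), "explore"
    # Default: trust the ranking and hire the top candidate.
    return ranked[0], "exploit"

Comparing the later job performance of “explore” hires against ordinary “exploit” hires tells you whether the ranking actually predicts success—something an always-hire-number-one policy can never reveal on its own.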

In the meantime, though, we shouldn’t be all that surprised that our brains assume that things that are easier to process are just all around better. After all, in the harsh and dangerous environment in which our brains evolved, things that were familiar—the people in our group, the path to the river, the sun and moon moving across the sky—were likely to be safer and more trustworthy.

But our environment has changed enormously since then. Now more than ever, we need something far more reliable to separate truth from fiction. And for that, there’s always the scientific method.

 
