Scientists have a duty to call out dubious research, says Sian Townson. Photograph: Alamy

Why people fall for pseudoscience (and how academics can fight back)


Ingrained cognitive biases play a role, as does inverted snobbery about educational privilege. But we must battle on, says this scientist

Pseudoscience is everywhere – on the back of your shampoo bottle, on the ads that pop up in your Facebook feed, and most of all in the Daily Mail. Bold statements in multi-syllabic scientific jargon give the false impression that they’re supported by laboratory research and hard facts.

Magnetic wristbands improve your sporting performance, carbs make you fat, and just about everything gives you cancer.

Of course, we scientists accept that sometimes people believe things we don’t agree with. That’s fine. Science is full of people who disagree with one another. If we all thought exactly the same way, we could retire and call the status quo truth.

But when people think snake oil is backed up by science, we have to challenge that. So why is it so hard?

As a nation, the British have always been wary of the “social elite” and academics are included under that label of privilege. Many people bristle at the idea of listening to those with multiple degrees talking down from on high to “correct” the less educated.

Academics have a reputation for being blinkered, arrogant, patronising and intolerant of those whose specialities differ from their own. But for every toffee-nosed academic I’ve met, there have been plenty of humble, engaging, enthusiastic ones who love their subjects and just want to get the word out there.

However, when you set out to counter an existing belief, you’re up against a whole load of cognitive biases – so let’s take a look at what’s going on in the minds of your readers.

The sunk cost fallacy is the reason that people who have already wasted money on tickets to a terrible film also waste their evening watching it. It can be the reason that people chomp their way through terrible food or get married when the relationship has already soured – it’s the urge to justify previous decisions with the next one. And it means that if people have put their weight behind a belief, they are invested in it, and are likely to fight its corner.

Along with our love of being right, we are hooked on patterns, and make sense of the world by seeking them out. This leads us to confirmation and selection bias: we look for evidence to support a theory, and ignore evidence to the contrary. Given the several million individually observable things that happen to you every day, it’s easy to pick one to prove an idea you’ve already become attached to, whether superstition or stereotype.

Everyday life brings a lot of data and sometimes your subconscious summarises it badly, falling victim to the clustering illusion. Any random set of data looks like it has clusters of points in it. If it didn’t have clusters, it wouldn’t be random scatter, it’d be an evenly spaced pattern. But our addiction to order makes those clusters very seductive, and it’s easy to forget that two things that happened at the same time don’t have to be related.
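To see just how seductive those clusters are, here is a minimal sketch (a hypothetical Python demonstration, not part of the original article) that scatters points uniformly at random along a line and compares the gaps between neighbours with those of a perfectly even pattern. The random set reliably throws up tight clumps and wide voids that look meaningful but aren’t.

```python
# Clustering illusion demo: uniformly random points still produce clumps
# and gaps, while a truly even pattern has identical spacing throughout.
import random

random.seed(42)  # fixed seed so the run is repeatable
n = 20

# 20 points placed uniformly at random between 0 and 100
random_points = sorted(random.uniform(0, 100) for _ in range(n))

# 20 points spaced perfectly evenly over the same range
even_points = [i * 100 / (n - 1) for i in range(n)]

def gaps(points):
    """Distances between neighbouring points."""
    return [b - a for a, b in zip(points, points[1:])]

for label, pts in [("random", random_points), ("evenly spaced", even_points)]:
    g = gaps(pts)
    print(f"{label}: smallest gap {min(g):.1f}, largest gap {max(g):.1f}")

# Typically the random set's gaps range from a fraction of a unit to well
# over ten units: on a plot, the short gaps read as "clusters", even though
# nothing links the points that happen to sit close together.
```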

The Dunning–Kruger effect was brutally summarised by Darwin as “ignorance begets confidence”. The less you know, the more likely you are to perceive yourself as an expert. Conversely, the more you know, the more likely you are to doubt your own competence. This means that some people have illusory superiority, and some experts can’t explain how they do things because they assume what they do is easy or obvious to all.

One expert who can explain things is Christian Behrenbruch, Advance global Australian of the year in biotechnology, adjunct professor at RMIT University and a professorial fellow at Monash University.

Behrenbruch dedicates at least three hours a day to dispelling pseudoscience. He says: “Whenever there is money involved, science gets thrown out the window.

“For example, when the average punter decides to invest in a technology company, it takes an awful lot to dislodge the belief that the science may be quackery. All over the planet, there are rafts of small public companies that take money from gullible investors with poor science – but once they are hooked, they are hooked.

“The other area is health. As a health condition degrades and there become fewer and fewer treatment options, the tendency to try anything rises. The confounding part of this equation is the concept of human hope – and that, unfortunately, is what undermines science every time. We hope that something will work, we believe that something will work.”

Margaret Defeyter is director of business and employer engagement at Northumbria University, a role which has given her a lot of hands-on experience with public communication.

She has some practical advice for getting a message across: “I think many academics find it hard to explain research in everyday language. The best way I have found is through events such as the British Science Festival. The Healthy Living team ran a pop-up stand displaying hands-on models and activities based on research findings. This worked extremely well, as it took something quite abstract and made it concrete.”

If we’re going to dispel myths, we need to improve our ability to communicate, with creative approaches such as hands-on activities that encourage self-directed learning. Rather than just trying to stamp out misunderstandings, we need to offer people something else to believe in.

