Over the last few centuries, the scientific method has established itself as a pretty useful tool.
A key facet of scientific methodology is reproducibility.
Essentially, good science should be reproducible: apply the same methodology in a different setting and you should get the same results.
Ten years ago, Nature published an article noting that more than half of psychological studies cannot be replicated.
It seems little has changed since then, with claims that sociology, economics, medicine and other fields are all experiencing reproducibility issues.
Dodgy dudes doing dodgy things
Science is littered with dodgy work that, despite heavy criticism, remains influential.
Take the Stanford Prison Experiment. A team of researchers randomly assigned volunteers as either prisoners or guards to study behaviour in jails.
The study was cut short after ‘prisoners’ were abused by ‘guards’.
Caption: The Stanford Prison Experiment was called off after less than a week.
Credit: Philip Zimbardo
The experiment can’t be replicated for several reasons. First, the ‘prisoners’ were not protected from the ‘guards’, something no ethics board would allow to happen again.
Second, the lead researcher Philip Zimbardo was instructing the guards on how to act, undermining the experiment’s premise.
Scientists have nevertheless attempted to replicate similar setups, with inconsistent results.
Digital disinformation
Associate Professor Chris Kavanagh is a cultural anthropologist and psychologist at Rikkyo University.
He has been vocal about this issue, pointing out how unreplicated findings spread widely through unregulated podcasts like The Huberman Lab and popular science YouTube channels such as TED.
The influential TED Talk Your Body Language May Shape Who You Are is guilty of this.
Social psychologist Dr Amy Cuddy claims that altering your posture can change how you feel and forcing a smile can make you happier.
She reached this conclusion by measuring people’s cortisol levels after they adopted a ‘power’ pose for just a few minutes.
Supposedly, people who posed like a gorilla felt more confident and were less stressed.
The problem is that power poses aren’t so powerful under a rigorous scientific lens.
Scientists have struggled to replicate Cuddy’s research, but the TED Talk has amassed an impressive 27 million views.
Publish or perish
Chris argues there’s fault on both sides of academic publishing: the journals and the researchers who supply them.
“There’s a well-known issue called publication bias,” he says.
Publication bias is the propensity for academic journals to prioritise studies with positive, statistically significant results.
“Journals want to publish interesting results with novel findings, but science relies on negative results and things which aren’t really groundbreaking,” says Chris.
“They deserve a lot of the blame for the situation because a lot of the incentives were allowing questionable research practices.”
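To see how this skews the record, here is a minimal simulation sketch, written in Python with numpy and scipy (my own illustration; none of these tools or numbers come from the article): if only the studies that happen to reach statistical significance get published, the published literature ends up exaggerating the true effect.

```python
# Illustrative sketch only: 'publish' just the studies that reach p < 0.05
# and compare the published average effect with the true effect.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
true_effect = 0.2                # a small but real effect
n_per_group = 20                 # a typical under-powered study
all_effects, published_effects = [], []

for _ in range(2000):
    control = rng.normal(0.0, 1.0, n_per_group)
    treated = rng.normal(true_effect, 1.0, n_per_group)
    observed = treated.mean() - control.mean()
    all_effects.append(observed)
    if stats.ttest_ind(treated, control).pvalue < 0.05:
        published_effects.append(observed)   # only 'interesting' results survive

print(f"True effect:                 {true_effect:.2f}")
print(f"Average across all studies:  {np.mean(all_effects):.2f}")
print(f"Average of 'published' ones: {np.mean(published_effects):.2f}")
# The 'published' average lands well above 0.2, because small studies
# only reach significance when they happen to overshoot the true effect.
```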
That said, researchers contribute to the problem as well.
“Academics are trained to respond to the incentives of the publishing industry,” says Chris.
He says students are told, “You’ve collected all this data and the results are negative, you don’t need to panic.”
“Go back to your data with an open mind … Control for different things and see if you can find a result.”

Credit: Matt Cardy/Getty Images
This kind of retrospective malpractice is known as p-hacking.
The p-value measures how likely it is that a result at least as extreme as the one observed would arise by chance alone.
P-hacking generally occurs when scientists, having deemed their results boring or insignificant, go back after the data have been collected and re-analyse them in different ways until something clears the significance threshold. The more comparisons you run, the more likely one of them is to look ‘significant’ purely by luck.
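To make the mechanism concrete, here is a minimal sketch of p-hacking, again in Python with numpy and scipy (an invented illustration, not a reconstruction of any study mentioned here): even when no effect exists at all, measuring enough different outcomes and reporting the best one will routinely produce a ‘significant’ p-value.

```python
# Illustrative sketch only: test many outcome variables on pure noise
# and keep the smallest p-value, as a p-hacker would.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_participants = 30
n_outcomes = 20      # 'control for different things and see if you can find a result'
n_studies = 1000
false_positives = 0

for _ in range(n_studies):
    # Both groups are drawn from the same distribution: there is no real effect.
    group_a = rng.normal(size=(n_participants, n_outcomes))
    group_b = rng.normal(size=(n_participants, n_outcomes))
    smallest_p = min(stats.ttest_ind(group_a[:, i], group_b[:, i]).pvalue
                     for i in range(n_outcomes))
    if smallest_p < 0.05:
        false_positives += 1

print(f"'Significant' findings in {n_studies} no-effect studies: {false_positives}")
# With 20 outcomes per study, roughly two thirds of these null studies
# produce at least one p < 0.05 by chance alone.
```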
Just pre better
In recent years, things appear to have improved thanks to initiatives such as the Journal of Negative Results, a now-defunct journal dedicated to publishing studies with no significant outcomes.
Or there’s the Many Labs Project, where 51 researchers from 35 institutions tested the outcomes of significant studies using exactly the same methodology, often with results that the original researchers found less than satisfying.
Since the replication alarm bells rang a few years ago, a lot of the social sciences have cleaned up their act.
Chris suggests this is thanks to an increase in the number of pre-registered studies.
Pre-registration is the process of recording your study’s methods and planned analysis in a public registry before the study has been conducted.
That way, you can’t skew the analysis afterwards to suit preconceived notions or the biases of academic publishers.
However, pre-registration isn’t a silver bullet.
“Very few people go and check pre-registrations. Very few people download datasets and analyse them,” says Chris.
“It’s not usually a prerequisite for publication.”
So while we have good reason to trust in the scientific process of criticism and debate, there are still plenty of gaps for crappy science to slip through.