Science has incredible power. In a few centuries, thanks to scientific knowledge, we nearly doubled our lifespan, boosted our quality of life, and learnt a great deal about the nature of the universe. However, it is easy to find people who mistrust science, and over the last few decades the share of people who have lost faith in academia and science in general has increased significantly. In the US, between 2006 and 2014, the share of people with a lot of confidence in academia dropped from 41% to 14%. Similarly, between 2009 and 2015, the number of people who think science has made life more difficult increased by 50%.
Where does mistrust in science come from?
The question is not precise enough: different kinds of skepticism must be distinguished. “I trust vaccines but I do not believe in climate change” and “science is only one of many options” are two different expressions of mistrust in science, and they should not be lumped together. A study published by Bastiaan Rutjens and colleagues in 2018 helps explain why. The authors identify four different predictors of science skepticism: political ideology, religiosity, morality and knowledge of science. In a survey of US respondents, they observed that conservative political ideology correlates with disbelief in climate change, strong religious feelings correlate with mistrust in vaccines, and little knowledge of science correlates with skepticism about genetic modification. Finally, religious conservatives are the group that supports science the least. The authors later ran the same survey in the Netherlands, where only some of the US results were replicated. For example, in the Dutch sample religiosity correlates with mistrust in evolution rather than in vaccines, and spirituality with generally low trust in science. By contrast, the correlations between political conservatism and climate change skepticism, and between low scientific knowledge and skepticism about vaccines and genetic modification, were confirmed in Europe as well. So the take-home message is that there are different kinds of skepticism, and they should be treated differently.
Several studies have shown that vaccines do not cause autism, that global warming is happening, and that keeping a gun at home does not make people safer. Still, many people reject these findings and persist in their erroneous beliefs. The sociologist Gordon Gauchat examined US survey data from 1974 to 2010 and observed an inexorable loss of trust in the scientific community.
Why is trust in science constantly decreasing?
Nowadays, multiple factions feed disbelief: some religious groups challenge evolution, and industry groups promote global warming skepticism. Alongside them, pseudoscience shamelessly contributes to misinformation by competing with the scientific method. This method, introduced in the 17th century, begins with the observation of a phenomenon, from which a hypothesis is derived. The hypothesis is then tested through rigorous experiments, which can either confirm or refute it. Pseudo-scientists act differently: first, they claim that the scientific consensus derives from a conspiracy to suppress alternative opinions; then they produce fake experts, with no reliable scientific track record, to confirm their claims; they cherry-pick the few data points and papers that challenge the mainstream view in order to discredit the entire field; and finally they demand impossible standards of evidence before accepting the mainstream position. Evidently, this strategy is catchier to the public, who often do not understand the scientific method (through no fault of their own: science is hard even for scientists).
Now let’s see why it’s so hard to make people drop their wrong beliefs. When people are confronted with evidence against their views, a brain area that suppresses unwanted representations becomes active, and this phenomenon is easy to observe with a Stroop test. The test consists in naming the ink color of written color words (without reading the words themselves). With a timer, you’ll notice that it takes longer to name the colors when the word and its ink color don’t match. Try it!
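If you want to try the test yourself in a terminal, here is a small illustrative sketch (not part of the original study; function names and trial counts are my own choices) that generates congruent word lists, where the word matches its ink color, and incongruent ones, where it does not, using ANSI color codes:

```python
import random

# ANSI escape codes for the ink colors (works in most terminals).
COLORS = {"red": "\033[31m", "green": "\033[32m",
          "blue": "\033[34m", "yellow": "\033[33m"}
RESET = "\033[0m"

def make_trials(n, congruent):
    """Generate n (word, ink_color) pairs.

    Congruent trials color each word with its own color;
    incongruent trials pick a mismatched ink color.
    """
    names = list(COLORS)
    trials = []
    for _ in range(n):
        word = random.choice(names)
        if congruent:
            ink = word
        else:
            ink = random.choice([c for c in names if c != word])
        trials.append((word, ink))
    return trials

def show(trials):
    """Print each word in its assigned ink color."""
    for word, ink in trials:
        print(COLORS[ink] + word.upper() + RESET)

if __name__ == "__main__":
    print("Name the INK COLOR of each word, not the word itself.\n")
    show(make_trials(8, congruent=True))    # easy list
    print()
    show(make_trials(8, congruent=False))   # harder list
```

Timing yourself on the two lists should reproduce the effect: naming the ink colors of the incongruent list takes noticeably longer.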
This internal conflict aims to suppress information that proves a belief wrong, and could partially explain why fighting people’s skepticism is so difficult. Another interesting feature of our psychology is the Dunning-Kruger effect: a cognitive bias that makes people with low ability or knowledge in a specific field feel over-competent in it. It is conceivable that the internet has amplified this phenomenon, by making people feel that a little online reading gives them the same knowledge as a person who spent an entire career working on a topic. Professor Thomas Nichols stated that the internet’s openness offers a “Google-fueled, Wikipedia-based, blog-sodden” illusion of knowledge, and that the “Internet encourages not only the illusion that we are all equally competent, but that we are all peers. And we’re not.” Before the internet, people got information mainly from experts who were interviewed or wrote encyclopedia entries. The internet opened a Pandora’s box of misinformation, and people are not trained to distinguish it from real knowledge.
How can we help people regain trust in science?
“The Debunking Handbook”, compiled by John Cook and Stephan Lewandowsky, shows that rebutting pseudoscience head-on is not effective, because it backfires: disproving false claims sheds light on the fraudulent study, which in turn becomes more popular and gains stronger support from believers. A more efficient strategy is to focus on spreading good science without referring to the false claims at all. For example, rather than fighting the false myths about vaccines one by one, it is better to highlight why vaccines are beneficial, providing all the evidence that supports them.
Who should do it?
Governments and institutions should put more effort into countering misinformation and pseudoscience, by providing guidelines on how to recognize fake news and by pushing mass media and social networks to limit its spread. Because this is never a priority for politicians, scientists should contribute actively to the cause. Inspiration could be taken from the Pro-Truth Pledge (https://www.protruthpledge.org/), a non-profit, volunteer-based network of scientists who aim to share, honor and encourage the truth: by fact-checking, by acknowledging true information (even when they disagree with it), by asking people to retract information once it has been disproved by reliable sources, and so on.

As a scientist, I feel that the scientific community is not cohesive enough around this goal, because of competition among “big shots” and the oppressive “publish-or-perish” conditions researchers work under. The scientific community can be seen as one of the biggest enterprises in the world, and people expect to see scientists on the same side, as when they “March for Science”. It is of course fair for scientists to disagree and debate, but what do people make of those debates? This leads to another point: science is very poor at communication. Popular science is treated as a niche activity, yet it is the only way to reach people and show them that we are doing good work. What if research groups from the same field made a collaborative effort to publish popular science articles and used social media to reach audiences outside academia? Catching up with the communication skills of pseudoscience is a necessary first step.
Recovering trust in science is definitely possible, but action needs to be taken as soon as possible.
Davide Giorgio Berta