Anti-science thinking: Why it happens and what to do about it
Anti-science thinking can arise as a side effect of otherwise benign strategies people use to process information
Evolution is not the only topic that many Americans think about in an anti-scientific way. What Americans believe about climate change is largely determined by their political party affiliation rather than by their knowledge of science, which is roughly equal among Republicans and Democrats.
In an article just reprinted in a special edition of Scientific American on truth and disinformation, four Arizona State University psychologists explain how anti-scientific thinking can happen, to all of us, and what to do about it.
“As someone who uses evolution as a framework to study human behavior, I am personally troubled by how many Americans do not believe in natural selection,” said Douglas Kenrick, President’s Professor of psychology and lead author on the article. “My co-authors and I decided to share what is known from psychology about how people process information and suggest possible ways to overcome anti-science thinking in general.”
Kenrick wrote the article with Adam Cohen, professor of psychology; Steven Neuberg, Foundation Professor of psychology and chair of the department; and Robert Cialdini, Regents Professor emeritus of psychology and marketing.
The article illustrates how anti-science thinking can arise as a side effect of three otherwise benign strategies people use to process information: relying on rules of thumb to make decisions, reaffirming what they already know, and yielding to social pressure.
Using rules of thumb to make decisions
When people must make decisions based on large amounts of complex information, they often rely on mental shortcuts called heuristics. A common example is a traveler who books flights to avoid an airport where they once missed a connection: the past missed connection comes to mind easily but predicts little about the future trip. This shortcut is called the availability heuristic, and it and other heuristics like it can lead to anti-science thinking. Today, some Americans cite the fact that they do not personally know anyone sickened by COVID-19 to explain why they choose not to wear a face mask, despite evidence that masks help reduce the spread of the virus.
“Scientific recommendations often change over time, like what happened for face masks, because they are based on the best available evidence at the time. This is a good thing, not evidence that scientists just change their minds or don’t know what they are talking about,” Cohen said.
The tendency to reaffirm what we already know
People naturally pay close attention to information that is familiar and supports what they already know. Psychologists and behavioral scientists call this phenomenon the confirmation bias, and a famous study from 1979 shows how difficult it is to overcome. The study presented Stanford University undergraduates with evidence for and against the death penalty. The students saw the same information, but they selectively used it to strengthen what they already thought about the death penalty. No student changed their mind; instead, they all became more resolute in their beliefs.
“When people have a lot of information coming at them, they often connect it to what they already know. This strategy can lead to people not forming a more tempered view after examining evidence for and against something – their view becomes more extreme,” Kenrick said.
But the same study also found a way for people to blunt the confirmation bias. Instead of asking the Stanford students to consider all information objectively, the researchers prompted the students to ask themselves what they would think if the information about capital punishment disagreed with their previous opinion. Playing devil’s advocate for the opposing side erased the confirmation bias.
Social pressure
The same motivations that compel people to get along in a group also affect how they think and act. The social pressure to remain a member of a group is so powerful that disagreeing with the group activates the amygdala, a brain area that tracks negative emotions.
“Everyone has biases in cognition, but no one wants to say or even think things that endanger their status in groups that are important to them, like political parties,” Cohen said.
Overcoming social influences that contribute to anti-science thinking is possible when not everyone in the group agrees, as illustrated by social psychologist Stanley Milgram's 1960s experiments on obedience. When ordered to by the experimenter, all of the participants delivered what they thought were powerful electric shocks to another person, even when doing so made them uncomfortable or upset. But when participants were part of a group whose other members refused to deliver the shock, only 10% went ahead and gave it.
Emotions also affect how people process information, and the combination of social pressure and fear is especially potent.
“Instead of scientists repeating the same facts, which does not work and can exacerbate the issue by increasing the amount of information people have to process, we need a more positively framed approach to disseminating information, one that avoids automatically producing an avoidance reaction in people,” Kenrick said.