Research shows the appeal of untestable beliefs and how they lead to a polarized society
“There was a scientific study that showed vaccines cause autism.”
“Actually, the researcher in that study lost his medical license, and overwhelming research since then has shown no link between vaccines and autism.”
“Well, regardless, it’s still my personal right as a parent to make decisions for my child.”
Does that exchange sound familiar? It is a debate that starts with testable factual statements but then, when the truth becomes inconvenient, ends with the person taking a flight from facts.
Our new research, recently published in the Journal of Personality and Social Psychology, examined a slippery way by which people get away from facts that contradict their beliefs. Of course, sometimes people simply dispute the validity of specific facts. But we find that people sometimes go one step further and, as in the opening example, reframe an issue in untestable ways. This makes potentially important facts and science ultimately irrelevant to the issue.
Let’s consider the issue of same-sex marriage. Facts could be relevant to whether it should be legal—for example, if data showed that children raised by same-sex parents are worse off than—or just as well off as—children raised by opposite-sex parents. But what if those facts contradict one’s views?
We presented 174 American participants who supported or opposed same-sex marriage with (supposed) scientific facts that supported or disputed their position. When the facts opposed their views, our participants—on both sides of the issue—were more likely to state that same-sex marriage isn’t actually about facts; it’s more a question of moral opinion. But when the facts were on their side, they more often stated that their opinions were fact-based and much less about morals. In other words, we observed something beyond the denial of particular facts: we observed a denial of the relevance of facts.
In a similar study with 117 religious participants, we had some read an article critical of religion. Believers who were especially high (but not low) in religiosity were then more likely to cite untestable “blind faith” arguments, rather than arguments based on factual evidence, as reasons for their beliefs, compared with those who read a neutral article.
These experiments show that when people’s beliefs are threatened, they often take flight to a land where facts do not matter. In scientific terms, their beliefs become less “falsifiable” because they can no longer be tested scientifically for verification or refutation.
For instance, people sometimes dispute government policies on the grounds that they don’t work. Yet if facts suggest that the policies do work, the same person might remain resolutely opposed on principle. We can see this on both sides of the political spectrum, whether it’s conservatives and Obamacare or liberals and the Iraq surge of 2007.
One would hope that objective facts could allow people to reach consensus more easily, but American politics are more polarized than ever. Could this polarization be a consequence of feeling free of facts?
While it is difficult to test that idea objectively, we can experimentally assess a fundamental question: When people are made to see their important beliefs as relatively less rather than more testable, does that increase polarization and commitment to desired beliefs? Two experiments we conducted suggest that it does.
In an experiment with 179 Americans, we reminded roughly half of the participants that much of President Obama’s policy performance was empirically testable and did not remind the other half. Then participants rated President Obama’s performance in five domains (e.g., job creation). Comparing opponents and supporters of Obama, we found that the reminder of testability reduced the average polarization in assessments of President Obama’s performance by about 40%.
To further test the hypothesis that people strengthen their desired beliefs when those beliefs are free of facts, we looked at a sample of 103 participants who varied from moderately to highly religious. We found that when highly (but not moderately) religious participants were told that God’s existence will always be untestable, they afterward reported stronger desirable religious beliefs (e.g., the belief that God was looking out for them), relative to when they were told that science might one day be able to investigate God’s existence.
Together these findings show that, at least in some cases, when testable facts are less a part of the discussion, people dig deeper into the beliefs they wish to have—such as viewing a politician in a certain way or believing that God is constantly there to provide support. These results echo the many studies finding that when facts are fuzzier, people tend to exaggerate desired beliefs.
So, after examining the power of untestable beliefs, what have we learned about dealing with human psychology? We’ve learned that bias is a disease, and that to fight it we need a healthy treatment of facts and education. We find that when facts are injected into the conversation, the symptoms of bias become less severe. But, unfortunately, we’ve also learned that facts can do only so much. To avoid coming to undesirable conclusions, people can fly from the facts and reach for the other tools in their belief-protecting toolbox.
With the disease of bias, then, societal immunity is better achieved when we encourage people to accept ambiguity, engage in critical thinking, and reject strict ideology. This is a mindset that the new Common Core education standards and, at times, The Daily Show are, at least in theory, attempting to help create. We will never eradicate bias—not from others, not from ourselves, and not from society. But we can become a people more free of ideology and less free of facts.
Courtesy: Scientific American