One of the most frustrating things about conspiracy theorists is how resistant they are to changing their minds, even when presented with incontrovertible evidence.
Look, for example, at the whole "Stop the Steal" thing. There are a significant number of Republicans who still won't acknowledge that Biden won the election fair and square, despite the fact that the opposite claim -- that there was widespread voter fraud that favored the Democrats, and an organized effort by the Left to make it seem like Trump lost an election he actually "won in a landslide" -- has gone to court in one form or another over sixty times, and in all but one case the lawsuit was thrown out because of a complete lack of evidence. The judges who made these decisions include both Republicans and Democrats; the legal response to "Stop the Steal" has been remarkably bipartisan.
Which, you'd think, would be enough, but apparently it isn't. An astonishingly small number of Republicans have said publicly that they were wrong -- that there was little to no fraud, certainly not enough to sway the election, and that Biden clearly was the victor. Mostly, the lack of evidence and the losses in court have caused the True Believers to double down, making them even surer that a vast conspiracy robbed Trump of his win, and that the absence of any factual support just proves there's an even vaster conspiracy covering it all up.
Essentially, people have gone from "believe this because there's evidence" to "believe this despite the fact there's no evidence" to "believe this because there's no evidence."
Once you've landed in that last category, it's hard to see what possible way there'd be to reach you. But there may be hope, to judge by a study that came out last week in the Journal of Personality and Social Psychology.
In "Jumping to Conclusions: Implications for Reasoning Errors, False Belief, Knowledge Corruption, and Impeded Learning," by Carmen Sanchez of the University of Illinois Urbana-Champaign and David Dunning of the University of Michigan (of Dunning-Kruger fame), we find that there is a strong (and fascinating) correlation among four features of the human psyche:
- Jumping to conclusions -- participants were given a task in which a computerized character was fishing in a lake. The lake had mostly red fish and a few gray fish, and the researchers looked at how quickly the test subject was confident about predicting the color of the next fish pulled from the lake.
- Certainty about false beliefs -- volunteers were given a test of their knowledge of American history, and for each four-answer multiple choice question they were asked how confident they were in their answer. The researchers looked at people who got things wrong -- while simultaneously being certain they were right.
- Understanding of basic logic -- participants were given a variety of logic puzzles, such as simple syllogisms (all fish can swim; sharks are fish; therefore sharks can swim), and asked to pick out which ones were logically sound and which were faulty.
- Belief in conspiracy theories -- test subjects were given a variety of common conspiracy theories, such as the belief that cellphones cause cancer but it's being covered up by big corporations, and asked to rank how likely they thought the beliefs were to be true.
They found that the faster you are to jump to conclusions on the fish test, the worse you are at logic, and the more certain you are about your beliefs even when they are wrong -- and, most critically, the more likely you are to believe spurious, zero-evidence claims.
So far, nothing too earth-shattering, and I think most of us could have predicted the outcome. But what makes this study fascinating is that Sanchez and Dunning looked at interventions that could slow people down and make them less likely to jump to false conclusions -- and therefore, less likely to feel certain about their own false or counterfactual beliefs.
The intervention had four parts:
- An explanation of the "jumping to conclusions" phenomenon, including an explanation of why it happens in the brain and the fact that we are all prone to this kind of thing.
- An acknowledgement of the difficulty of making a correct decision based on incomplete information. Test subjects were shown a zoomed-in photo that was then zoomed out a little bit at a time, and they had to decide at what point they were sure of what they were looking at.
- An exercise in studying optical illusions. Here, the point was to illustrate the inherent flaws of our own sensory-integrative mechanisms, and how focusing on one thing can make you miss details elsewhere that might give you more useful information.
- A short video of a male jogger who compliments a female street artist, and gets no response. He repeats himself, finally becoming agitated and shouting at her, but when she reacts with alarm he turns and runs away. Later, he finds she has left him a picture she drew, along with a note explaining that she's deaf -- leaving the guy feeling pretty idiotic and ashamed of himself. This was followed up by asking participants to write down snap judgments they'd made that later proved incorrect, and what additional information they'd have needed in order to get it right.
This is where I got a surprise, because I've always thought of believers in the counterfactual as being essentially unreachable. And the intervention seems like pretty rudimentary stuff, something that wouldn't affect you unless you were already primed to question your own beliefs. But what Sanchez and Dunning found is that the individuals who received the intervention did much better on subsequent tasks than the control group did -- they were more accurate in assessing their own knowledge, slower to make snap judgments, and less likely to give credence to conspiracy theories.
I don't know about you, but I find this pretty hopeful. It once again reinforces my contention that one of the most important things we can do in public schools is to teach basic critical thinking. (And in case you didn't know -- I have an online critical thinking course through Udemy that is available for purchase, and which has gotten pretty good reviews.)
So taking the time to reason with people who believe in conspiracies can actually be productive, and not the exercise in frustration and futility I thought it was. Maybe we can reach the "Stop the Steal" people -- with an intervention that is remarkably simple. It's not going to fix them all, nor eradicate such beliefs entirely, but you have to admit that at this point, any movement in the direction of rationality is worth pursuing.