Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Thursday, August 24, 2017

Tribalism vs. the facts

For the diehard skeptic, one of the most frustrating features of human nature is belief in the absence of evidence (or even in the face of evidence to the contrary), and the question of how to combat it.

And I'm not talking about religion here, or at least not solely about religion.  The 30% or so of Americans who still support Donald Trump are a good example of evidence-free belief that borders on religious fervor; witness a recent poll of Trump supporters in which six out of ten said they can't think of anything he could do that would change their approval of his presidency.

The maddening part of all this is that at its heart, skepticism asks only one thing: that you base your understanding on facts.  The idea that people can adhere to their beliefs so strongly that no logic or evidence could shift them is a little incomprehensible.

But it's even worse than this.  A new study has shown that if a person is predisposed to certain beliefs -- anything from Trump support to climate change denialism to young-Earth creationism -- it doesn't help for them to learn more about the subject.

In fact, learning more about the subject actually increases their certainty that they were right in the first place.

These were the rather dismal findings of Caitlin Drummond and Baruch Fischhoff of Carnegie Mellon University, whose paper "Individuals With Greater Science Literacy and Education Have More Polarized Beliefs on Controversial Science Topics" appeared last week in the Proceedings of the National Academy of Sciences.  The authors write:
Although Americans generally hold science in high regard and respect its findings, for some contested issues, such as the existence of anthropogenic climate change, public opinion is polarized along religious and political lines.  We ask whether individuals with more general education and greater science knowledge, measured in terms of science education and science literacy, display more (or less) polarized beliefs on several such issues...  We find that beliefs are correlated with both political and religious identity for stem cell research, the Big Bang, and human evolution, and with political identity alone on climate change.  Individuals with greater education, science education, and science literacy display more polarized beliefs on these issues.
Put simply, your views on (for example) evolutionary biology have less to do with your understanding of the subject than they do with your political and religious identification.  Which, of course, implies that if you are trying to convince someone of the correctness of the evolutionary model, teaching them about what the scientists are actually saying is unlikely to change their perspective, and it may actually cause them to double down on their original beliefs.

[image courtesy of the Wikimedia Commons]

So it's another example of the insidious backfire effect, and it is profoundly maddening.  It is unsurprising, perhaps, given the fact that for all of our technology and civilization, we're still tribal animals.  Our in-group identification, with respect to politics, religion, ethnicity, or nationality, trumps damn near everything else, up to and including the facts and evidence sitting right in front of our faces.  And education, apparently, isn't going to change that.

It remains to be seen what can be done about this.  Fischhoff said:
These are troubling correlations. We can only speculate about the underlying causes.  One possibility is that people with more education are more likely to know what they are supposed to say, on these polarized issues, in order to express their identity.  Another possibility is that they have more confidence in their ability to argue their case.
"Troubling" is right, especially given that I'm a science teacher.  I've always thought that one of the main jobs of science teachers is to correct students' misapprehensions about how the world works, because let's face it: a great deal of science is counterintuitive.  As Sean Carroll put it, in his wonderful book about the discovery of the Higgs boson, The Particle at the End of the Universe:
It's only because the data force us into corners that we are inspired to create the highly counterintuitive structures that form the basis for modern physics...  Imagine that a person in the ancient world was wondering what made the sun shine.  It's not really credible to imagine that they would think about it for a while and decide, "I bet most of the sun is made up of particles that can bump into one another and stick together, with one of them converting into a different kind of particle by emitting yet a third particle, which would be massless if it wasn't for the existence of a field that fills space and breaks the symmetry that is responsible for the associated force, and that fusion of the original two particles releases energy, which we ultimately see as sunlight."  But that's exactly what happens.  It took many decades to put this story together, and it never would have happened if our hands weren't forced by the demands of observation and experiment at every step.
The same, of course, is true for every discipline of science.  None of it is simple and intuitive; that's why we need the scientists.

But if people don't believe what the scientists are saying, not because of a lack of understanding or a disagreement over the facts, but because of tribal identity and in spite of the facts, there's not a whole hell of a lot you can do.

Which makes me even more depressed about our current situation here in the United States.
