Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Saturday, June 9, 2018

Chain of hysteria

Because confirmation bias alone is apparently not enough, a team of psychologists at the University of Warwick (England) has just found that when bad news is passed from person to person, its capacity for inducing alarm and hysteria increases.

The study, released just this week in the journal Risk Analysis, was led by Thomas Hills, who summed up the research as follows: "The more people share information, the more negative it becomes, the further it gets from the facts, and the more resistant it becomes to correction."  The researchers took 154 test subjects, split them into fourteen groups of eight people each, and gave one person in each group a balanced, factual news article to read.  That person had to write a summary of the article in their own words and pass it to another person in the group -- who read the summary, summarized that in turn, and passed it to the next person, and so on.

The sixth person was given not only the fifth person's summary but also the original article, to see if reading the original, unbiased facts changed how they summarized the information.  And the scary thing is, it didn't.  The version passed on to the seventh person in the chain was just as inaccurate and alarmist as the previous ones had been.  So that points to a second, rather disturbing conclusion from the Hills et al. research: once people have accepted a scary, emotionally laden view of an issue, even presenting them with the facts doesn't change anything.
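For the simulation-minded, here's a minimal sketch in Python of what that chain structure looks like.  To be clear, the drift and amplification rates are assumptions I made up for illustration -- the paper reports measured effects, not a generative model -- but the shape is the study's: each retelling loses accuracy and gains alarm, and the person at position six gets the original article to no effect.

```python
# Toy model of one serial diffusion chain, loosely patterned on the
# design described above.  The chain length matches the study; every
# rate below is an illustrative assumption, not a number from the paper.
import random

CHAIN_LENGTH = 8       # people per chain
RE_EXPOSURE_POS = 6    # the person who also sees the original article

def retell(accuracy, negativity, saw_original):
    """One person summarizes what they received and passes it on."""
    # Each retelling drifts further from the facts...
    accuracy *= random.uniform(0.7, 0.95)
    # ...and ratchets up the alarm.
    negativity = min(1.0, negativity + random.uniform(0.05, 0.15))
    # Per the study's finding, seeing the original changes nothing,
    # so saw_original is deliberately ignored.
    return accuracy, negativity

def run_chain():
    accuracy, negativity = 1.0, 0.2   # the balanced source article
    for position in range(1, CHAIN_LENGTH + 1):
        accuracy, negativity = retell(
            accuracy, negativity,
            saw_original=(position == RE_EXPOSURE_POS))
        print(f"person {position}: accuracy={accuracy:.2f}, "
              f"negativity={negativity:.2f}")

random.seed(42)
run_chain()
```

Run it and you'll watch accuracy decay toward zero while negativity climbs -- a tidy little picture of why person seven's summary was no better than person five's.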

"Society is an amplifier for risk," Hills said.  "This research explains why our world looks increasingly threatening despite consistent reductions in real-world threats.  It also shows that the more people share information, the further that information gets from the facts and the more resilient it becomes to correction."

So it's kind of like an ugly, and potentially dangerous, game of Telephone.

1941 British advertisement [Image is in the Public Domain]

What interests me the most is this "resilience to correction" -- which I have to admit sounds way better than what I'd have called it, which is "ignorant, willful, pig-headed stupidity."  This tendency has been manipulated, with what I can only interpret as cunning and malice aforethought, by Fox News, which unhesitatingly broadcasts complete, outright lies, then (much more quietly) issues a "retraction" later -- knowing full well that the correction will not undo the outrage and misapprehension the original story created in its listeners.  That's what was going on this past week with the nonsense over Donald Trump's petulant and toddlerish withdrawal of the invitation to the Super Bowl-winning Philadelphia Eagles, over the controversy regarding players "taking the knee" during the National Anthem in protest of the unfair treatment of minorities.  Fox broadcast a story about Trump's cancellation of the visit (obviously siding with Trump, not that I probably had to mention that), and backed up the story with photographs of Eagles players kneeling.

The problem is that none of the photographs showed players "taking the knee" in protest during the National Anthem.  Every damn one of them was a photograph of a player kneeling to say a prayer prior to the start of the game -- which, given Fox's ongoing hysteria over the "War on Christianity," you'd think they'd have been in favor of.  When several players said, "Hey, wait a moment.  That picture of me wasn't what they implied it was," Fox finally (two days later) issued a short, mealy-mouthed retraction.

Which one do you think got more views, and generated more attention and more emotion: the original story, or the retraction?

Don't answer that.  Rhetorical question.

So the scary part of the Hills et al. research is that, knowing this, media outlets can deliberately exploit the tendency -- start out with a sensationalized, hysteria-inducing story, which will then only amplify further in the retelling.  Then they can retract anything that was an obvious falsehood (or at least any falsehood that enough people object to), and the retraction will have exactly zero effect.

What this does is make it even more imperative that we somehow fix the biased, slanted nightly shitshow that popular media has become.  How to do this, I have no idea.  But if we don't, we end up in a frightening positive-feedback loop -- where we believe the hysteria more strongly because the media insists that it's true, and they insist that it's true because it gets listeners who already believe it.

And the end result, I'm afraid, will be a nation filled with easily-led, emotion-driven dupes -- which, honestly, is probably precisely what the powers-that-be want.
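For anyone who wants to see how fast that feedback loop runs away, here's a back-of-the-envelope model.  The coupling constants are pure assumptions on my part; the only point is the dynamic, that belief and sensationalism each push the other upward until both saturate.

```python
# Toy positive-feedback loop: audience belief and media sensationalism
# each amplify the other.  The constants k and m are illustrative
# assumptions, chosen only to show the runaway dynamic.
def feedback_loop(belief=0.3, sensationalism=0.3, steps=10, k=0.4, m=0.4):
    for t in range(steps):
        # Belief grows with how sensational the coverage is...
        belief += k * sensationalism * (1.0 - belief)
        # ...and coverage grows more sensational as the audience buys in.
        sensationalism += m * belief * (1.0 - sensationalism)
        print(f"step {t + 1}: belief={belief:.2f}, "
              f"sensationalism={sensationalism:.2f}")

feedback_loop()
```

Both numbers march toward 1.0 in a handful of steps, no matter how modestly you start them -- which is exactly the trap.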

***********************

This week's featured book is the amazing Thinking, Fast and Slow by Daniel Kahneman, which looks at the fact that we have two systems in our brains for making decisions -- a fast one that works mostly by intuition, and a slower one that is logical and rational.  Unfortunately, they frequently disagree on the best course of action.  Worse still, trouble ensues when we rely on the intuitive one to the exclusion of the logical one, calling it "common sense" when in fact it far more often comes from bias than from evidence.

Kahneman's book will make you rethink how you come to conclusions -- and make you all too aware of how frail the human reasoning capacity is.





