The study, released just this week in the journal Risk Analysis, was led by Thomas Hills, who summed up the research this way: "The more people share information, the more negative it becomes, the further it gets from the facts, and the more resistant it becomes to correction." The researchers took 154 test subjects, split them into fourteen groups of eight people each, and gave one person in each group a balanced, factual news article to read. That person wrote a summary of the article in their own words and passed it to another person in the group, who read the summary, summarized it in turn, and passed that to the next person, and so on.
The sixth person was given not only the fifth person's summary but also the original article, to see whether reading the original, unbiased facts changed how they summarized the information. And the scary thing is, it didn't. The version passed on to the seventh person in the chain was just as inaccurate and alarmist as the previous ones. That points to a second, rather disturbing conclusion from the Hills et al. research: once people have accepted a scary, emotionally laden view of an issue, even presenting them with the facts doesn't change anything.
"Society is an amplifier for risk," Hills said. "This research explains why our world looks increasingly threatening despite consistent reductions in real-world threats. It also shows that the more people share information, the further that information gets from the facts and the more resilient it becomes to correction."
So it's kind of like an ugly, and potentially dangerous, game of Telephone.
[Image: 1941 British advertisement. Image is in the Public Domain]
The problem is that none of the photographs showed players "taking the knee" in protest during the National Anthem. Every damn one of them was a photograph of a player kneeling to say a prayer prior to the start of the game (which, given their ongoing hysteria over the "War on Christianity," you'd think they'd have been in favor of). When several players said, "Hey, wait a moment, that picture of me wasn't what they implied it was," Fox finally (two days later) issued a mealy-mouthed, and short, retraction.
Which one do you think got more views and generated more attention and emotion: the original story, or the retraction?
Don't answer that. Rhetorical question.
So the scariest implication of the Hills et al. research is that media agencies can deliberately exploit this tendency: start with a sensationalized, hysteria-inducing story, which will only amplify further in the retelling, then retract anything that was an obvious falsehood (or at least any of the falsehoods that enough people object to), knowing the retraction will have exactly zero effect.
This makes it even more imperative that we somehow fix the biased, slanted nightly shitshow that popular media has become. How to do this, I have no idea. But if we don't, we end up in a frightening positive-feedback loop, where we believe the hysteria more strongly because the media insists that it's true, and the media insists that it's true because it draws listeners who already believe it.
And the end result, I'm afraid, will be a nation filled with easily-led, emotion-driven dupes -- which, honestly, is probably precisely what the powers-that-be want.
This week's featured book is the amazing Thinking, Fast and Slow by Daniel Kahneman, which examines the fact that we have two modules in our brains for making decisions: a fast one, which works mostly by intuition, and a slower one that is logical and rational. Unfortunately, they frequently disagree on the best course of action. Worse still, trouble ensues when we rely on the intuitive one to the exclusion of the logical one, calling it "common sense" when in fact it's far more likely to come from bias than from evidence.
Kahneman's book will make you rethink how you come to conclusions -- and make you all too aware of how frail the human reasoning capacity is.