I've long been fascinated with communication network theory -- the model that maps out the rules behind the spread of information (and its ugly cousin, disinformation). Back in my day (you'll have to imagine me saying this in a creaky old-geezer voice) both moved a lot more slowly; communities devoted to conspiracies, for example, had to rely on such clunky modes of transmission as newsletters, magazines, and word-of-mouth.
Now? The internet, and especially social media, have become rapid-transit networks for bullshit. The phenomenon of a certain idea, video, meme, or link "going viral" has meant that virtually overnight, it can go from being essentially unknown to basically everyone who is online seeing it. There was nothing even close to comparable forty years ago.
Communication network theory looks at the connectedness of different communities and individuals, the role of hubs (nodes -- people or groups -- that are connected to many other people and groups), and "tastemakers" -- individuals whose promotion of something virtually guarantees it widespread notice. The mathematics of this model is, unfortunately, over my head, but the concepts are fascinating. Consider the paper that came out this week in the journal Social Media + Society, "From 'Nasa Lies' to 'Reptilian Eyes': Mapping Communication About 10 Conspiracy Theories, Their Communities, and Main Propagators on Twitter," by Daniela Mahl, Jing Zeng, and Mike Schäfer of the University of Zürich.
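If you want to see those two roles in something concrete, here's a minimal sketch in Python using the networkx library. To be clear, this is only an illustration and not anything from the paper: the account names and links are invented, and the two measures are standard textbook centralities. Degree centrality flags the hubs, and betweenness centrality is a rough stand-in for the "tastemaker" or broker role, since it measures how often an account sits on the path information has to take between other accounts.

```python
import networkx as nx

# Hypothetical accounts and who-interacts-with-whom links, invented purely
# for illustration; none of this comes from the Mahl et al. dataset.
edges = [
    ("alice", "bob"), ("alice", "carol"), ("alice", "dave"),
    ("bob", "carol"), ("dave", "erin"), ("erin", "frank"),
    ("frank", "grace"), ("grace", "erin"), ("dave", "frank"),
]
G = nx.Graph(edges)

# Degree centrality: the fraction of other accounts each account is
# directly connected to (hubs score highest).
degree = nx.degree_centrality(G)

# Betweenness centrality: how often an account lies on the shortest path
# between two other accounts, a rough proxy for the broker/"tastemaker"
# role, since material moving between groups tends to pass through it.
betweenness = nx.betweenness_centrality(G)

for name in sorted(G, key=betweenness.get, reverse=True):
    print(f"{name:6s} degree={degree[name]:.2f}  betweenness={betweenness[name]:.2f}")
```

In this toy graph, "dave" is the bridge between the alice/bob/carol group and the erin/frank/grace group, so he ends up with the highest betweenness score; that brokering role, rather than raw connection count, is what makes certain accounts disproportionately important to how material spreads.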
In this study, they looked at the communities that have grown up around ten different conspiracy theories:
- Agenda 21 -- the claim that the United Nations has a plan to strip nations of their sovereignty and launch a one-world government
- The anti-vaccination movement
- The Flat Earthers
- Chemtrails -- the idea we're being dosed with psychotropic chemicals via jet exhaust contrails
- Climate change deniers
- Directed energy weapons -- the claim that high-intensity beams are being used to kill people and start natural disasters like major forest fires
- The Illuminati
- Pizzagate -- the claim that the Democrats are running some kind of nationwide human trafficking/pedophilia ring
- The Reptilians -- the belief that many major world leaders are reptilian aliens in disguise, and that you can sometimes catch a glimpse of their real appearance in video clips
- "9/11 was an inside job"
They also looked at connections to two non-conspiracy communities -- pro-vaccination and anti-flat-Earth.
The researchers analyzed thousands of different accounts and tens of thousands of tweets to see what kind of overlap there was between these twelve online communities, based on hashtag use, retweets, and so on.
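For readers who like to see how this sort of clustering works mechanically, here's a rough sketch in Python with networkx. It is emphatically not the authors' pipeline (their actual data and methods are described in the paper): the hashtag sets are invented, the edge weights are just "number of shared hashtags," and the clustering is off-the-shelf modularity maximization. The point is only to show how communities that recycle each other's hashtags get lumped together.

```python
import networkx as nx
from itertools import combinations
from networkx.algorithms.community import greedy_modularity_communities

# Invented hashtag usage for a handful of communities, purely for illustration.
hashtags = {
    "chemtrails":     {"wakeup", "geoengineering", "truth"},
    "illuminati":     {"wakeup", "truth", "nwo"},
    "reptilians":     {"truth", "nwo", "shapeshifters"},
    "pizzagate":      {"savethechildren", "deepstate"},
    "911truth":       {"deepstate", "falseflag"},
    "climate_denial": {"deepstate", "falseflag"},
    "flatearth":      {"globelie"},
}

# Connect two communities with an edge weighted by how many hashtags they share.
G = nx.Graph()
for a, b in combinations(hashtags, 2):
    shared = len(hashtags[a] & hashtags[b])
    if shared:
        G.add_edge(a, b, weight=shared)

# A community that shares no hashtags with anyone (here, "flatearth") never
# gets an edge at all; that's the toy-model version of total isolation.

# Group the connected communities into clusters of dense mutual linkage.
for i, cluster in enumerate(greedy_modularity_communities(G, weight="weight"), 1):
    print(f"cluster {i}: {sorted(cluster)}")
```

The real study works with thousands of accounts, uses retweets as well as hashtags, and is far more careful about how it builds and partitions the network, but the basic intuition is the same: shared vocabulary and shared amplification pull communities into the same cluster.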
What they found was that the communities studied formed eight tightly-networked clusters. Here's a diagram of their results:
There are a couple of interesting features of this.
First, six of the communities are so entangled that they form two multiply-connected clusters: the chemtrails/Illuminati/Reptilians cluster and the Pizzagate/9/11/climate change denial cluster. Both make sense when you consider who is pushing them -- the first is promoted by such conspiracy loons as David Icke, the second by far-right media like Fox, OAN, and Newsmax.
Note, however, that even though three of the other conspiracy theories -- the anti-vaxxers, Agenda 21, and directed energy weapons -- are distinct enough to form their own clusters, they still have strong connections to all the others. The only one that stands out as essentially independent of all the others is the Flat Earthers.
Evidently the Flerfs are so batshit crazy that even the other crazies don't want to have anything to do with them.
This demonstrates something that I've long believed: that acceptance of one loony idea makes you more likely to fall for others. Once you've jettisoned evidence-based science as your touchstone for deciding what is the truth, you'll believe damn near anything.
The other thing that jumps out at me is that the pro-vaccine and anti-flat-Earth groups have virtually no connections to any of the others. They are effectively closed off from the groups they're trying to counter. What this means is discouraging: the people working to fight the network of nonsense by creating accounts dedicated to promoting the truth are sitting in an echo chamber, and their well-meant and fervent messages are not reaching the people whose minds need to be changed.
It's something that I've observed before: it's all very well for people on Twitter and Facebook to post well-reasoned arguments about why Tucker Carlson, Tomi Lahren, Marjorie Taylor Greene, and Lauren Boebert are full of shit, but those arguments are never going to be read by anyone who doesn't already agree.
It's why Fox News is so insidious. Years ago, they and their spokespeople, commentators like Rush Limbaugh and Ann Coulter, started off by convincing their listeners that everyone else was lying. Once you've decided that the only way to get the truth is to rely on one single source, you're at the mercy of the integrity and accuracy of that source. In the case of Fox, you are vulnerable to being manipulated by a group of people whose representation of the news is so skewed it has run afoul of Great Britain's Office of Communications multiple times on the basis of inaccuracy, partiality, and inflammatory content. (And in fact, last year Fox began an international streaming service in the UK, largely motivated by the fact that online content is outside the jurisdiction of the Office of Communications.)
Mahl et al. write:
Both anti-conspiracy theory communities, Anti-Flat Earth and Pro-Vaccination, are centered around scientists and medical practitioners. Their use of pro-conspiracy theory hashtags likely is an attempt to directly engage and confront users who disseminate conspiracy theories. Studies from social psychology have shown that cross-group communication can be an effective way to resolve misunderstandings, rumors, and misinformation. By deliberately using pro-conspiracy hashtags, anti-conspiracy theory accounts inject their ideas into the conspiracists’ conversations. However, our study suggests that this visibility does not translate into cross-group communication, that is, retweeting each other’s messages. This, in turn, indicates that debunking efforts hardly traverse the two clusters.
I wish I had an answer to all this. It's one thing if a group of misinformed people read arguments countering their beliefs and reject them; it's another thing entirely if the misinformed people are so isolated from the truth that they never even see it. Twitter and Facebook have given at least a nod toward deplatforming the worst offenders -- one study found that the flow of political misinformation on Twitter dropped by 75% after Donald Trump's account was suspended -- but that doesn't deal with the problem as a whole, because even if you deplatform the people responsible for the wellspring of bullshit, there will always be others waiting in the wings to step in and take over.
However discouraging this is, it does mean that the skeptics and science types can't give up. Okay, we're not as multiply-connected as the wackos are, so we have to be louder, more insistent, more persistent. Saying "oh, well, nothing we can do about it" and throwing in the towel will have only one effect: making sure the disinformation platforms reach more people and poison more conduits of discourse.
And I, for one, am not ready to sit back and accept that as inevitable.
********************************
I have often been amazed and appalled at how the same evidence, the same occurrences, or the same situation can lead two equally-intelligent people to entirely different conclusions. How often have you heard about people committing similar crimes and getting wildly different sentences, or identical symptoms in two different patients resulting in completely different diagnoses or treatments?
In Noise: A Flaw in Human Judgment, authors Daniel Kahneman (whose wonderful book Thinking, Fast and Slow was a previous Skeptophilia book-of-the-week), Olivier Sibony, and Cass Sunstein analyze the cause of this "noise" in human decision-making, and -- more importantly -- discuss how we can avoid its pitfalls. Anything we can do to detect and expunge biases is a step in the right direction; even if the majority of us aren't judges or doctors, most of us are voters, and our decisions can make an enormous difference. Those choices are critical, and it's incumbent upon us all to make them in the most clear-headed, evidence-based fashion we can manage.
Kahneman, Sibony, and Sunstein have written a book that should be required reading for anyone entering a voting booth -- and should also be a part of every high school curriculum in the world. Read it. It'll open your eyes to the obstacles we have to logical clarity, and show you the path to avoiding them.
[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]