Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.
Showing posts with label disinformation. Show all posts

Tuesday, September 3, 2024

The problem with research

If there's one phrase that torques the absolute hell out of me -- and just about every actual scientist out there -- it's, "Well, I did my research."

Oh, you did, did you?  What lab did you do your research in?  Or was it field work?  Let's see your data!  Which peer-reviewed journal published your research?  How many times has it been cited in other scientific journals?

Part of the problem, of course, is that -- like a lot of words in the English language ("theory" and "proof" are two examples that come to mind) -- the word "research" is used one way by actual researchers and a different way by most other people.  We were taught the alternate definition of "research" in grade school, by being assigned "research papers," which meant "go out and look up stuff other people have found out on the topic, and summarize it in your own words."  There's value in doing this; it's a good starting place for understanding a subject, and is honestly where we all began with scholarship.

The problem is -- and it exists even at the grade-school level of inquiry -- this kind of "research" is only as good as the sources you choose.  When I was a teacher, one of the hardest things to get students to understand was that all sources are not created equal.  A paper in Science, or even the layperson's version of it in Scientific American or Discover, is head-and-shoulders above the meanderings of Some Random Guy in his blog.  (And yes, I'm well aware that this pronouncement is being made by Some Random Guy in his blog.)

That doesn't mean those less-reputable sources are necessarily wrong, of course.  It's more that they can't be relied upon.  While papers in Science (and other comparable journals) are occasionally retracted for errors or inaccuracies, there is a vetting process that makes their likelihood of being correct vastly higher.  After all, any oddball with a computer can create a website, and post whatever they want on it, be it brilliant posts about cutting-edge science or the looniest of wingnuttery.

The confusion between the two definitions of the word "research" has the effect of increasing people's confidence in the kind we were all doing in middle school, giving that low-level snooping about an undeserved gloss of reputability.  This was the upshot of a paper in Nature (peer-reviewed science, that) by Kevin Aslett of the University of Central Florida et al., entitled "Online Searches to Evaluate Misinformation Can Increase Its Perceived Veracity."  Their results are kind of terrifying, if not unexpected given the "post-truth society" we've somehow slid into.  The authors write:

Although conventional wisdom suggests that searching online when evaluating misinformation would reduce belief in it... across five experiments, we present consistent evidence that online search to evaluate the truthfulness of false news articles actually increases the probability of believing them...  We find that the search effect is concentrated among individuals for whom search engines return lower-quality information.  Our results indicate that those who search online to evaluate misinformation risk falling into data voids, or informational spaces in which there is corroborating evidence from low-quality sources. 

The tendency appears to be that when someone is "doing their research" on a controversial subject, what they do is an online search, pursued until they find two or three hits on sources that corroborate what they already believed, and that strengthens their conviction that they were right in the first place.  The study found that very little attention was usually given to the quality of those sources, or where those sources got the information themselves.  If it makes the "researcher" nod sagely and say, "Yeah, that's what I thought," it doesn't matter if the information came from NASA -- or from QAnon.

The problem is, a lot of those bogus sources can look convincing. 

Other times, of course, all you have to be able to do is add two-digit numbers to realize that they're full of shit.

People see data in some online source, and rarely consider (1) who collected the data and why, (2) how it was analyzed, (3) what information wasn't included in the analysis, and (4) whether it was verified, and if so, how and by whom.  I first ran into the old joke that "73.4% of all statistics are made up on the spot" years ago, and it's still funny, even if our laughs are rather wry these days.  Sites like Natural News, Food Babe, Before It's News, Breitbart.com, Mercola.com, InfoWars, One America News, and even a few with scholarly-sounding names -- The Society for Scientific Exploration, Evolution News, and The American College of Pediatricians, to name three -- are clearinghouses for fringe-y and discredited ideas, often backed up by data that's either cherry-picked and misrepresented, or from sources even further down the ladder of sketchy credibility.

Given how much bullshit is out there, a lot of it well-hidden behind facts, figures, and fancy writing, it can be a challenge for laypeople (and I very much count myself among their number) to discern truth from fiction.  It's also an uphill struggle to fight against the very natural human tendency of confirmation bias; we all would love it if our cherished notions of how the world works were one hundred percent correct.  But if we want to make smart decisions, we all need to stop saying "I did my research" when all that "research" involved was a twenty-minute Google search to find the website of some random crank who confirmed what we already believed.

Remember, as the brilliant journalist Kathryn Schulz points out, that one of the most mind-expanding and liberating things we can say is, "I don't know.  Maybe I'm wrong."  And to start from that open-minded perspective and find out what the facts really are -- from the actual researchers.

****************************************


Friday, May 14, 2021

The network of nonsense

I've long been fascinated with communication network theory -- the model that maps out the rules behind the spread of information (and its ugly cousin, disinformation).  Back in my day (you'll have to imagine me saying this in a creaky old-geezer voice) both moved a lot more slowly; communities devoted to conspiracies, for example, had to rely on such clunky modes of transmission as newsletters, magazines, and word-of-mouth.

Now?  The internet, and especially social media, have become rapid-transit networks for bullshit.  The phenomenon of a certain idea, video, meme, or link "going viral" has meant that virtually overnight, it can go from being essentially unknown to basically everyone who is online seeing it.  There was nothing even close to comparable forty years ago.

Communications network theory looks at connectedness between different communities and individuals, the role of nodes (people or groups who are multiply-connected to many other people and groups), and "tastemakers" -- individuals whose promotion of something virtually guarantees it gaining widespread notice.  The mathematics of this model is, unfortunately, over my head, but the concepts are fascinating.  Consider the paper that came out this week in the journal Social Media and Society, "From 'Nasa Lies' to 'Reptilian Eyes': Mapping Communication About 10 Conspiracy Theories, Their Communities, and Main Propagators on Twitter," by Daniela Mahl, Jing Zeng, and Mike Schäfer of the University of Zürich.

In this study, they looked at the communities that have grown up around ten different conspiracy theories:

  1. Agenda 21, which claims that the United Nations has a plan to strip nations of their sovereignty and launch a one-world government
  2. The anti-vaccination movement
  3. The Flat Earthers
  4. Chemtrails -- the idea we're being dosed with psychotropic chemicals via jet exhaust contrails
  5. Climate change deniers
  6. Directed energy weapons -- high-intensity beams are being used to kill people and start natural disasters like major forest fires
  7. The Illuminati
  8. Pizzagate -- the claim that the Democrats are running some kind of nationwide human trafficking/pedophilia ring
  9. The Reptilians -- many major world leaders are reptilian aliens in disguise, and you can sometimes catch a glimpse of their real appearance in video clips
  10. "9/11 was an inside job"

They also looked at connections to two non-conspiracy communities -- pro-vaccination and anti-flat-Earth.

The researchers analyzed thousands of different accounts and tens of thousands of tweets to see what kind of overlap there was between these twelve online communities, as based on hashtag use, retweets, and so on.

What they found was that the communities studied formed eight tightly-networked clusters.  Here's a diagram of their results:


There are a couple of interesting features of this.

First, six of the communities are so entangled that they form two multiply-connected clusters: the chemtrails/Illuminati/Reptilians cluster, and the Pizzagate/9-11/climate-change-denial cluster.  Both make sense considering who is pushing each of them -- the first by such conspiracy loons as David Icke, and the second by far-right media like Fox, OAN, and Newsmax.

Note, however, that even though three of the other conspiracy theories -- the anti-vaxxers, Agenda 21, and directed energy weapons -- are distinct enough that they form their own nodes, they still have strong connections to all the others.  The only one that stands out as essentially independent of all the others is the Flat Earthers.

Evidently the Flerfs are so batshit crazy that even the other crazies don't want to have anything to do with them.
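The clustering the study performed can be sketched in miniature.  The toy accounts and hashtags below are invented for illustration (they are not the study's data), and real network analysis uses far more sophisticated methods than this, but the core idea is the same: link accounts that share hashtags or retweets, then see which connected clusters fall out -- and which communities end up isolated.

```python
from collections import defaultdict

# Hypothetical accounts and the hashtags they use.
# (Illustrative only -- not the actual data from Mahl et al.)
accounts = {
    "acct_chem": {"chemtrails", "illuminati"},
    "acct_ill":  {"illuminati", "reptilians"},
    "acct_rept": {"reptilians"},
    "acct_flat": {"flatearth"},
    "acct_pro":  {"provax"},
}

def clusters(accounts):
    """Link any two accounts sharing a hashtag, then return the
    connected components -- a crude stand-in for the community
    clusters found in hashtag/retweet networks."""
    names = list(accounts)
    parent = {n: n for n in names}  # union-find structure

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if accounts[a] & accounts[b]:      # shared hashtag
                parent[find(a)] = find(b)      # merge clusters

    groups = defaultdict(set)
    for n in names:
        groups[find(n)].add(n)
    return sorted(groups.values(), key=len, reverse=True)

for group in clusters(accounts):
    print(sorted(group))
```

Even in this tiny example, the chemtrails, Illuminati, and Reptilians accounts fuse into one cluster through shared hashtags, while the flat-Earth and pro-vaccination accounts sit alone -- the same isolation pattern the study found at scale.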

This demonstrates something that I've long believed; that acceptance of one loony idea makes you more likely to fall for others.  Once you've jettisoned evidence-based science as your touchstone for deciding what is the truth, you'll believe damn near anything.

The other thing that jumps out at me is that the pro-vaccine and anti-flat-Earth groups have virtually no connections to any of the others.  They are effectively closed off from the groups they're trying to counter.  What this means is discouraging; that the people working to fight the network of nonsense by creating accounts dedicated to promoting the truth are sitting in an echo chamber, and their well-meant and fervent messages are not reaching the people whose minds need to be changed.

It's something that I've observed before; that it's all very well for people on Twitter and Facebook to post well-reasoned arguments about why Tucker Carlson, Tomi Lahren, Marjorie Taylor Greene, and Lauren Boebert are full of shit, but they're never going to be read by anyone who doesn't already agree.

It's why Fox News is so insidious.  Years ago, they and their spokespeople, commentators like Rush Limbaugh and Ann Coulter, started off by convincing their listeners that everyone else was lying.  Once you've decided that the only way to get the truth is to rely on one single source, you're at the mercy of the integrity and accuracy of that source.  In the case of Fox, you are vulnerable to being manipulated by a group of people whose representation of the news is so skewed it has run afoul of Great Britain's Office of Communications multiple times on the basis of inaccuracy, partiality, and inflammatory content.  (And in fact, last year Fox began an international streaming service in the UK, largely motivated by the fact that online content is outside the jurisdiction of the Office of Communications.)

Mahl et al. write:

Both anti-conspiracy theory communities, Anti-Flat Earth and Pro-Vaccination, are centered around scientists and medical practitioners.  Their use of pro-conspiracy theory hashtags likely is an attempt to directly engage and confront users who disseminate conspiracy theories.  Studies from social psychology have shown that cross-group communication can be an effective way to resolve misunderstandings, rumors, and misinformation.  By deliberately using pro-conspiracy hashtags, anti-conspiracy theory accounts inject their ideas into the conspiracists’ conversations.  However, our study suggests that this visibility does not translate into cross-group communication, that is, retweeting each other’s messages.  This, in turn, indicates that debunking efforts hardly traverse the two clusters.

I wish I had an answer to all this.  It's one thing if a group of misinformed people read arguments countering their beliefs and reject them; it's another thing entirely if the misinformed people are so isolated from the truth that they never even see it.  Twitter and Facebook have given at least a nod toward deplatforming the worst offenders -- one study found that the flow of political misinformation on Twitter dropped by 75% after Donald Trump's account was suspended -- but that's not dealing with the problem as a whole, because even if you delete the platforms of the people responsible for the wellspring of bullshit, there will always be others waiting in the wings to step in and take over.

However discouraging this is, it does mean that the skeptics and science types can't give up.  Okay, we're not as multiply-connected as the wackos are, so we have to be louder, more insistent, more persistent.  Saying "oh, well, nothing we can do about it" and throwing in the towel will have only one effect: making sure the disinformation platforms reach more people and poison more conduits of discourse.

And I, for one, am not ready to sit back and accept that as inevitable.

********************************

I have often been amazed and appalled at how the same evidence, the same occurrences, or the same situation can lead two equally-intelligent people to entirely different conclusions.  How often have you heard about people committing similar crimes and getting wildly different sentences, or identical symptoms in two different patients resulting in completely different diagnoses or treatments?

In Noise: A Flaw in Human Judgment, authors Daniel Kahneman (whose wonderful book Thinking, Fast and Slow was a previous Skeptophilia book-of-the-week), Olivier Sibony, and Cass Sunstein analyze the cause of this "noise" in human decision-making, and -- more importantly -- discuss how we can avoid its pitfalls.  Anything we can do to detect and expunge biases is a step in the right direction; even if the majority of us aren't judges or doctors, most of us are voters, and our decisions can make an enormous difference.  Those choices are critical, and it's incumbent upon us all to make them in the most clear-headed, evidence-based fashion we can manage.

Kahneman, Sibony, and Sunstein have written a book that should be required reading for anyone entering a voting booth -- and should also be a part of every high school curriculum in the world.  Read it.  It'll open your eyes to the obstacles we have to logical clarity, and show you the path to avoiding them.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]



Saturday, March 28, 2020

Contagious disinformation

Well, that didn't take long.

All it took was Donald Trump harping on the "Chinese virus" thing for a few days, and all of his MAGA followers took off in a large herd, bleating angrily about how the most important thing was making China pay for causing all this.  I've already seen three people post that COVID stands for "China-Originated Viral Infectious Disease."  Worse, when someone responded that this was incorrect, that it stands for "COronaVIrus Disease," another person piped up, "Who gives a fuck?  It started in CHINA and that's all that matters."

All.  That.  Matters.

Not the fact that we currently have the highest number of cases in the world here in the United States.  Not that we are woefully behind in testing, whatever Trump and his cronies would have you believe.  Not that we're in drastic need of PPE, including masks and gloves, and that some hospitals have substituted plastic garbage bags for protective suits -- and that because Trump is a vindictive toddler, it's looking like what PPE we do have is going to be parceled out according to which states' governors kiss Trump's ass most enthusiastically.

I've said more than once recently that none of this is going to change until some miracle occurs and Fox News decides to end their nightly celebratory circle-jerk over how wonderful Dear Leader is.  Every day they're presenting nothing but lies, spin, and propaganda, and a good 50% of Americans get their news solely from Fox.

And don't even start with what-about-ism.  Yes, I know the other media sources are biased.  Show me one major American news source that lies as consistently and as maliciously as Fox.  There have been whole studies that have shown that Fox News viewers are, across the board, the least aware of the facts by comparison to viewers from six other sources -- and compared to those who don't watch the news at all.  That's right: not watching the news leaves you, on average, better informed than watching Fox.

Anyhow, because Trump et al. are now more concerned about getting people pissed off at China than they are about dealing with the problem in our own country, we also have conspiracy theories popping up all over the place that the virus didn't just originate in China, it was created by China.  In some versions, the pandemic was caused by a lab accident in Wuhan; in others, the virus was deliberately introduced into the population, for reasons that remain unclear (largely because it didn't happen, but try to tell the conspiracy theorists that).

[Image is in the Public Domain courtesy of the Centers for Disease Control and Prevention]

In any case, this sort of thing is becoming so widespread that a team led by virologist Kristian Andersen of Scripps just published a study analyzing the genome of the COVID-19 virus, and they found -- beyond a shadow of a doubt -- that the virus is a natural pathogen, and that although it started in some non-human animal (bats and pangolins being the top two contenders), all it took was one jump to a human host to get the ball rolling.

In "The Proximal Origin of SARS-CoV-2," we read the following:
It is improbable that SARS-CoV-2 emerged through laboratory manipulation of a related SARS-CoV-like coronavirus.  As noted above, the RBD [receptor-binding domain] of SARS-CoV-2 is optimized for binding to human ACE2 with an efficient solution different from those previously predicted.  Furthermore, if genetic manipulation had been performed, one of the several reverse-genetic systems available for betacoronaviruses would probably have been used.  However, the genetic data irrefutably show that SARS-CoV-2 is not derived from any previously used virus backbone.  Instead, we propose two scenarios that can plausibly explain the origin of SARS-CoV-2: (i) natural selection in an animal host before zoonotic transfer; and (ii) natural selection in humans following zoonotic transfer.  [Italics mine]
Of course, the claim that it was bioengineered never had much going for it.  Molecular epidemiologist Emma Hodcroft, of the University of Basel, said in an interview with Science News, "Essentially their claim was the same as me taking a copy of the Odyssey and saying, 'Oh, this has the word the in it,' and then opening another book, seeing the word the in it and saying,  'Oh my gosh, it’s the same word, there must be parts of the Odyssey in this other book.'  It was a really misleading claim and really bad science."

What about the claims of China mishandling the response to the epidemic, and then lying about it?  Okay, they probably did.  But the people who are bitching the most about this seem perfectly fine with Donald Trump doing the same damn thing.  "It's another Democrat hoax."  "One day, like a miracle, it will disappear."  "Anyone who needs a test, gets a test... and the tests, they're beautiful."  "Health insurance companies agreed to waive all co-payments for coronavirus treatments, extend insurance coverage to these treatments, and to prevent surprise medical billing."  "[W]hen you have fifteen people, and the fifteen within a couple of days is going to be down to close to zero, that's a pretty good job we've done."

And after all that, he had the gall to say, "I’ve always known this is a real—this is a pandemic.  I felt it was a pandemic long before it was called a pandemic…  I’ve always viewed it as very serious."

But on Fox News apparently Trump can say one thing today and exactly the opposite tomorrow, and the loyal viewers will believe him both times.

Okay, I'm ranting.  But this is killing people.  There seems to be no way to compel Fox to stop lying, even when American citizens are being harmed as a direct result of what they air.  I'm all for freedom of speech and freedom of the press, but I'm also for personal responsibility -- and when your lies cause people to die, there should be some kind of legal recourse available.

But thus far, they've gotten away with it scot-free, and in fact are encouraging the conspiracy theories and anti-Chinese sentiment, probably to draw attention away from the abject failure of our own government to act quickly and responsibly.  The ironic thing is that the success of their own strategies has put their own viewers into the greatest likelihood of harm -- and that even that isn't stopping them from their daily smorgasbord of disinformation.

*****************************

Any guesses as to what was the deadliest natural disaster in United States history?

I'd speculate that if a poll was taken on the street, the odds-on favorites would be Hurricane Katrina, Hurricane Camille, and the Great San Francisco Earthquake.  None of these are correct, though -- the answer is the 1900 Galveston hurricane, which killed an estimated nine thousand people and basically wiped the city of Galveston off the map.  (Galveston was on its way to becoming the busiest and fastest-growing city in Texas; the hurricane was instrumental in switching this hub to Houston, a move that was never undone.)

In the wonderful book Isaac's Storm, we read about Galveston Weather Bureau director Isaac Cline, who tried unsuccessfully to warn people about the approaching hurricane -- a failure which led to a massive overhaul of how weather information was distributed around the United States, and also spurred an effort toward more accurate forecasting.  But author Erik Larson doesn't make this simply about meteorology; it's a story about people, and brings into sharp focus how personalities can play a huge role in determining the outcome of natural events.

It's a gripping read, about a catastrophe that remarkably few people know about.  If you have any interest in weather, climate, or history, read Isaac's Storm -- you won't be able to put it down.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]





Tuesday, December 10, 2019

Misremembering the truth

There are two distinct, but similar-sounding, cognitive biases that I've written about many times here at Skeptophilia because they are such tenacious barriers to rational thinking.

The first, confirmation bias, is our tendency to uncritically accept claims when they fit with our preconceived notions.  It's why a lot of conservative viewers of Fox News and liberal viewers of MSNBC sit there watching and nodding enthusiastically without ever stopping and saying, "... wait a moment."

The other, dart-thrower's bias, is more built-in.  It's our tendency to notice outliers (because of their obvious evolutionary significance as danger signals) and ignore, or at least underestimate, the ordinary as background noise.  The name comes from the thought experiment of being in a bar while there's a darts game going on across the room.  You'll tend to notice the game only when there's an unusual throw -- a bullseye, or perhaps impaling the bartender in the forehead -- and not even be aware of it otherwise.

Well, we thought dart-thrower's bias was more built into our cognitive processing system and confirmation bias more "on the surface" -- and the latter therefore more culpable, conscious, and/or controllable.  Now, it appears that confirmation bias might be just as hard-wired into our brains as dart-thrower's bias is.

A paper appeared this week in Human Communication Research, describing research conducted by a team led by Jason Coronel of Ohio State University.  In "Investigating the Generation and Spread of Numerical Misinformation: A Combined Eye Movement Monitoring and Social Transmission Approach," Coronel, along with Shannon Poulsen and Matthew D. Sweitzer, did a fascinating series of experiments that showed we not only tend to accept information that agrees with our previous beliefs without question, we honestly misremember information that disagrees -- and we misremember it in such a way that in our memories, it further confirms our beliefs!

The location of memories (from Memory and Intellectual Improvement Applied to Self-Education and Juvenile Instruction, by Orson Squire Fowler, 1850) [Image is in the Public Domain]

What Coronel and his team did was to present 110 volunteers with passages containing true numerical information on social issues (such as support for same-sex marriage and rates of illegal immigration).  In some cases, the passages agreed with what (according to polls) most people believe to be true, such as that the majority of Americans support same-sex marriage.  In other cases, the passages contained information that (while true) is widely thought to be untrue -- such as the fact that illegal immigration across the Mexican border has been dropping for years and is now at its lowest rates since the mid-1990s.

Across the board, people tended to recall correctly the information that aligned with the conventional wisdom, and to recall incorrectly the information that didn't.  Further -- and this is what makes the experiment even more fascinating -- when people read the unexpected information, data that contradicted the general opinion, eye-tracking monitors recorded that they hesitated while reading, as if they recognized that something was strange.  In the immigration passage, for example, they read that the immigrant population had decreased from 12.8 million in 2007 to 11.7 million in 2014, and the readers' eyes bounced back and forth between the two numbers as if their brains were saying, "Wait, am I reading that right?"

So they spent longer on the passage that conflicted with what most people think -- and still tended to remember it incorrectly.  In fact, the majority of people who did remember wrong got the numbers right -- 12.8 million and 11.7 million -- showing that they'd paid attention and didn't just scoff and gloss over it when they hit something they thought was incorrect.  But when questioned afterward, they remembered the numbers backwards, as if the passage had actually supported what they'd believed prior to the experiment!

If that's not bad enough, Coronel's team then ran a second experiment, in which the test subjects read the passage, then had to repeat the gist to another person, who then passed it to another, and so on.  (Remember the elementary school game of "Telephone"?)  Not only did the data get flipped -- usually in the first transfer -- but the difference between the two numbers then got greater and greater with each retelling (thus bolstering the false, but popular, opinion even more strongly).  In the case of the immigration statistics, the gap between 2007 and 2014 not only changed direction, but by the end of the game it had widened from 1.1 million to 4.7 million.
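The drift the study describes can be sketched with a toy simulation.  To be clear, the update rule and drift rate below are invented for illustration -- this is not the study's model -- but it captures the qualitative pattern: once each retelling nudges the figures toward what listeners expect (immigration rising), the gap first flips sign and then steadily widens.

```python
def transmit(start, end, steps, drift=0.6):
    """Toy transmission chain: each retelling moves the pair of
    figures (in millions) toward the belief-consistent pattern
    (end > start) by 'drift' million.  The drift rate is an
    invented parameter, purely for illustration."""
    history = [(start, end)]
    for _ in range(steps):
        start, end = start - drift / 2, end + drift / 2
        history.append((round(start, 2), round(end, 2)))
    return history

# True figures: 12.8 million (2007) down to 11.7 million (2014).
chain = transmit(12.8, 11.7, steps=10)
print("gap at start:", round(chain[0][1] - chain[0][0], 2))
print("gap at end:  ", round(chain[-1][1] - chain[-1][0], 2))
```

Run it and the initial gap of -1.1 million (a decrease) becomes a large positive gap (an apparent increase) -- the same flip-then-widen trajectory the Telephone-game experiment observed.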

This gives you an idea what we're up against in trying to counter disinformation campaigns.  And it also illustrates that I was wrong in one of my preconceived notions; that people falling for confirmation bias are somehow guilty of locking themselves deliberately into an echo chamber.  Apparently, both dart-thrower's bias and confirmation bias are somehow built into the way we process information.  We become so certain we're right that our brain subconsciously rejects any evidence to the contrary.

Why our brains are built this way is a matter of conjecture.  I wonder if perhaps it might be our tribal heritage at work; that conforming to the norm, and therefore remaining a member of the tribe, has a greater survival value than being the maverick who sticks to his/her guns about a true but unpopular belief.  That's pure speculation, of course.  But what it illustrates is that once again, our very brains are working against us in fighting Fake News -- which these days is positively frightening, given how many powerful individuals and groups are, in a cold and calculated fashion, disseminating false information in an attempt to mislead us, frighten us, or anger us, and so maintain their positions of power.

***********************

This week's Skeptophilia book of the week is brand new; Brian Clegg's wonderful Dark Matter and Dark Energy: The Hidden 95% of the Universe.  In this book, Clegg outlines "the biggest puzzle science has ever faced" -- the evidence for the substances that provide the majority of the gravitational force holding the nearby universe together, while simultaneously making the universe as a whole fly apart -- and which have (thus far) completely resisted all attempts to ascertain their nature.

Clegg also gives us some of the cutting-edge explanations physicists are now proposing, and the experiments that are being done to test them.  The science is sure to change quickly -- every week we seem to hear about new data providing information on the dark 95% of what's around us -- but if you want the most recently-crafted lens on the subject, this is it.

[Note: if you purchase this book from the image/link below, part of the proceeds goes to support Skeptophilia!]





Tuesday, April 4, 2017

The post-truth world

The Oxford Dictionary word of the year for 2016 was "post-truth."

The OED defines "post-truth" as "relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief."  So here we are using the prefix "post" not to mean "following in time sequence" but "after the point where it becomes irrelevant."

The fact that this term was even coined is unsettling.  Have we really come to a place where demonstrable truth is less important than belief?  Where Daniel Patrick Moynihan's trenchant quip, "You are entitled to your own opinion, but you are not entitled to your own facts" is not argued against so much as it is ignored into nonexistence?

Unfortunately, the answer to both of these questions appears to be "yes."  This is the alarming conclusion of research done by University of Washington professor Kate Starbird, of the Department of Human-Centered Design and Engineering, who decided to study how people use social networks to respond to disasters and ended up uncovering something deeply disturbing about our society.

Starbird's research started after the Boston Marathon bombing of April 2013.  She was sifting through tweets that followed the attack, and noticed that there were the expected calls for help and outcries by worried family members about the safety of their loved ones, but there was something else, something considerably darker.

"There was a significant volume of social-media traffic that blamed the Navy SEALs for the bombing," Starbird said, in an interview with Danny Westneat of the Seattle Times.  "It was real tinfoil-hat stuff.  So we ignored it."

She began to realize her error when she did the same sort of analysis of subsequent disasters.  "After every mass shooting, dozens of them, there would be these strange clusters of activity," Starbird said.  "It was so fringe we kind of laughed at it...  That was a terrible mistake.  We should have been studying it."

[image courtesy of the Wikimedia Commons]

So she threw herself into a study of the networks on the internet that involve conspiracy theorists and conspiracy websites -- InfoWars, Before It's News, NewsBusters, NoDisinfo, the misleadingly-named Veterans Today, and hundreds of others.  And what she found was horrifying; these interconnected webs of misinformation and bizarre speculation form a powerful force for spreading their message -- one that rivals the power of legitimate media.

"More people are dipping into this stuff than I had ever imagined," Starbird said, noting that InfoWars alone gets the same web traffic as the Chicago Tribune.

Interestingly, the websites she's studying don't follow any kind of definitive political alignment.  InfoWars, for example, is right-wing; The Free Thought Project is left-wing.  Instead, the unifying theme is anti-globalism and xenophobia, which can manifest irrespective of political leaning.

"To be antiglobalist often included being anti-mainstream media, anti-immigration, anti-science, anti-U.S. government, and anti-European Union," Starbird said.  And that can appeal to both sides of the political spectrum.

The most frightening thing of all is how insulated this network is from the truth.  Since there are now hundreds of these sites, the usual mantra -- cross-check your sources -- doesn't help you much.  "Your brain tells you ‘Hey, I got this from three different sources,’ " she said.  "But you don’t realize it all traces back to the same place, and might have even reached you via bots posing as real people.  If we think of this as a virus, I wouldn’t know how to vaccinate for it."

Which supports a contention I've had for years; once you've trained people to doubt facts, deluded them into thinking that the raw data has spin, you can convince them of anything.  After that, everything they look at is seen through the lens of suspicion, as if the information itself had an agenda, was trying to pull the wool over their eyes.

And I'm as at a loss as Starbird is about how to combat it.  Yes, teach critical thinking and media analysis in schools; yes, harp on comparing your sources to known reliable media before you tweet or post on Facebook.  But this spider's web of interconnected sites is remarkably well-insulated from attack.  Anyone who contradicts the party line is either a dupe or a shill; either they've "drunk the KoolAid" (to use the conspiracy theorists' favorite line) or they've actively sold out to the other side.

Once you've accepted that, there's no way out.

"I used to be a techno-utopian," Starbird told Westneat.  "Now I can’t believe that I’m sitting here talking to you about all this...  My fear is that we may be headed toward the menace of unreality — which is that nobody believes anything anymore... Alex Jones is a kind of prophet.  There really is an information war for your mind.

"And we’re losing it."