Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Thursday, December 5, 2024

Disinformation and disorder

I've dealt with a lot of weird ideas over the thirteen years I've been blogging here at Skeptophilia.

Some of them are so far out there as to be risible.  A few of those that come to mind:
  • the "phantom time hypothesis" -- that almost three hundred years' worth of history didn't happen, and was a later invention developed through collusion between the Holy Roman Empire and the Catholic Church
  • "vortex-based mathematics," which claims (1) that spacetime is shaped like a donut, (2) infinity has an "epicenter," and (3) pi is a whole number
  • the planet Nibiru, which is supposed to either usher in the apocalypse or else cause us all to ascend to a higher plane of existence, but which runs into the snag that it apparently doesn't exist
  • a claim that by virtue of being blessed by a priest, holy water has a different chemical structure and a different set of physical properties from ordinary water
  • the claim that gemstones can somehow affect your health through "frequencies"
In this same category, of course, are some things that a lot of people fervently believe, such as homeopathy, divination, and the Flat Earth.

These, honestly, don't bother me all that much, except for the fact that the health-related ones can cause sick people to bypass appropriate medical care in favor of what amounts to snake oil.  But on an intellectual level, they're easily analyzed, and equally easily dismissed.  Once you know some science, you kind of go, "Okay, that makes no sense," and that's that.

It's harder by far to deal with the ones that mix in just enough science that to a layperson, they sound like they could be plausible.  After all, science is hard; I have a B.S. in physics, and most academic papers in the field go whizzing over my head so fast they don't even ruffle my hair.  The problem, therefore, is how to tell when someone has taken (real, but difficult) science, misinterpreted or misrepresented it, and then presented it so articulately that even to intelligent laypeople it seems legitimate.

One of the first times I ran into this was the infamous video What the Bleep Do We Know?, from 2004, which is one of the best-known examples of quantum mysticism.  It takes some real, observable effects -- strange stuff like entanglement and indeterminacy and the Heisenberg Uncertainty Principle and the role of the observer in the collapse of the wave function -- and weaves in all sorts of unscientific hand-waving about how "the science says" our minds create the universe, thoughts can influence the behavior of matter, and that the matter/energy equivalence formula means that "all being is energy."  Those parts aren't correct, of course; but the film's makers do it incredibly skillfully, describing the scientific bits more or less accurately, and interviewing actual scientists, then editing their segments to make it sound like they support the film's fundamentally pseudoscientific message.  (It's worth noting that it was the brainchild of none other than J. Z. Knight, whose Ramtha cult has become notorious for its homophobia, anti-Semitism, anti-Catholicism, and racism.)

I ran into a (much) more recent example of this when I picked up a copy of Howard Bloom's book The God Problem: How a Godless Cosmos Creates at our local Friends of the Library used book sale.  At first glance, it looked right up my alley -- a synthesis of modern cosmology, philosophy, and religion.  And certainly the first few pages and the back cover promised great things, with endorsements from everyone from Barbara Ehrenreich to Robert Sapolsky to Edgar Mitchell.

I hadn't gotten very far into it, however, before I started to wonder.  The writing is frenetic, jumping from one topic to another seemingly willy-nilly, sprinkled with rapid-fire witticisms that in context sound like the result of way too many espressos.  But I was willing to discount that as a matter of stylistic preference, until I started running into one weird claim after another of profound insights that turn out, on examination, to be simple sleight of hand.  We're told, for example, that we should believe his "heresy" that "A is not equal to A," and when he explains it, it turns out that this only works if you define the first A differently from the second.  Likewise, "one plus one doesn't equal two" holds only if you're talking about the fact that joining two things together can result in the production of something different (such as a proton and an electron coming together to form a neutral hydrogen atom).

So his supposedly earth-shattering "heresies" turn out to be something that, if you know a little science, would induce you to shrug your shoulders and say, "So?"

But what finally pissed me off enough that I felt like I needed to address it here was his claim that the Second Law of Thermodynamics is wrong, which he said was a heresy so terrible we should "prepare to be burned at the stake" by the scientific establishment for believing him.  Here's a direct quote:
... the Second Law of Thermodynamics [is] a law that's holy, sacred, and revered.  What is the Second Law?  All things tend toward disorder.  All things fall apart.  All things tend toward the random scramble of formlessness and meaninglessness called entropy.
He then goes into a page-long description of what happens when you put a sugar cube into a glass of water, and ends with:
The molecules of sugar in your glass went from a highly ordered state to a random whizzle [sic] of glucose and fructose molecules evenly distributed throughout your glass.  And that, says the Second Law of Thermodynamics, is the fate of everything in the universe.  A fate so inevitable that the cosmos will end in an extreme of lethargy, a catastrophe called "heat death."  The cosmos will come apart in a random whoozle [sic] just like the sugar cube did.  The notion of heat death is a belief so widespread that it was enunciated by Lord Kelvin in 1851 and has hung around like a catechism.
Then he tells us what the problem is:
But is the Second Law of Thermodynamics true?  Do all things tend to disorder?  Is the universe in a steady state of decline?  Is it moving step by step toward randomness?  Are form and structure steadily stumbling down the stairway of form into the chaos of a wispy gas?...  No.  In fact, the very opposite is true.  The universe is steadily climbing up.  It is steadily becoming more form-filled and more structure-rich.  How could that possibly be true?  Everyone knows that the Second Law of Thermodynamics is gospel.  Including everybody who is anybody in the world of physics, chemistry, and even complexity theory.
*brief pause to scream obscenities*  

*another brief pause to reassure my puppy that he's not the one I'm mad at*

No one, scientist or otherwise, is going to burn Bloom at the stake for this, because what he's claiming is simply wrong.  This is a complete mischaracterization of what the Second Law says.  Whether Bloom knows that, and is deliberately misrepresenting it, or simply doesn't understand it himself, I'm not sure.  What the Second Law says, at least in one formulation, is that in a closed system, the overall entropy always increases -- and that critical qualifier, "in a closed system," is the part he conveniently leaves out.  Of course order can be increased, but it's always at the cost of (1) expending energy, and (2) increasing entropy more somewhere else.  A simple example is the development of a human from a single fertilized egg cell, which represents a significant increase in complexity and decrease in entropy.  But the only way that's accomplished is by giving the developing human a continuous supply of energy and building blocks (i.e., food), and by cellular processes tearing those food molecules to shreds, increasing their entropy.  And what the Second Law says is that the entropy increase experienced by the food molecules is bigger than the entropy decrease experienced by the developing human.  (I wrote a longer explanation of this principle a while back, if you're interested in more information.)
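If it helps to see the bookkeeping spelled out, here's the same statement as an inequality -- a minimal sketch of the standard textbook formulation, not anything quoted from Bloom's book:

\[ \Delta S_{\text{total}} \;=\; \Delta S_{\text{system}} + \Delta S_{\text{surroundings}} \;\geq\; 0 \]

where "total" covers the whole closed system -- the developing embryo and its surroundings, food included.  Local order can increase ( \( \Delta S_{\text{system}} < 0 \) is perfectly fine) as long as the surroundings -- the food molecules being torn apart, the heat being dumped -- gain at least that much entropy: \( \Delta S_{\text{surroundings}} \geq -\Delta S_{\text{system}} \).  Nothing in that inequality forbids embryos, snowflakes, or galaxies from forming.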

Let's just put it this way.  If what Bloom is saying -- that the Second Law is wrong -- were true, he'd be in line for a Nobel Prize.  There has never, ever been an exception found to the Second Law, despite centuries of testing, and the frustrated desires of perpetual-motion-machine inventors the world over.

A model of a perpetual motion machine -- which, for the record, doesn't work [Image licensed under the Creative Commons Tiia Monto, Deutsches Museum 6, CC BY-SA 4.0]

So Bloom got it badly wrong.  He's hardly the first person to do so.  Why, then, does this grind my gears so badly?

It's that apparently no one on his editorial team, and none of the dozens of people who endorsed his book, thought even to read the fucking Wikipedia page about this fundamental law of physics Bloom is saying is incorrect.  And he certainly sounds convincing; his writing is like a sort-of-scientific-or-something Gish gallop, hurling so many arguments at us all at once that it's all we readers can do to withstand the barrage and stay on our feet.

For me, though, that one error immediately made me discount anything else he has to say.  If his understanding of a basic scientific law that I've known about since freshman physics, and taught every year to my AP Biology students, is that flawed, how can I trust what he says on other topics about which I might not have as much background knowledge?

And that, to me, is the danger.  It's easy to point out the obvious nonsense like space donuts and gemstone frequencies -- but far harder to recognize pseudoscience that is twisted together with actual science so intricately that you can't see where one ends and the other begins.  Especially if -- as is the case with The God Problem -- it's couched in folksy, jargon-free anecdote that sounds completely reasonable.

I guess the only real solution is to learn enough science to be able to recognize this kind of thing when you see it.  And that takes time and hard work.  But it's absolutely critical, especially in our current political situation here in the United States, where there are people who are deliberately spinning falsehoods for their own malign purposes about such critical issues as health care, gender and sexuality, and the climate.

So it's hard work we all need to be doing.  Otherwise we fall prey to persuasive nonsense -- and are at the mercy of whatever the author of it is trying to sell.

****************************************

Tuesday, September 3, 2024

The problem with research

If there's one phrase that torques the absolute hell out of me -- and just about every actual scientist out there -- it's, "Well, I did my research."

Oh, you did, did you?  What lab did you do your research in?  Or was it field work?  Let's see your data!  Which peer-reviewed journal published your research?  How many times has it been cited in other scientific journals?

Part of the problem, of course, is that like a lot of words in the English language -- "theory" and "proof" are two examples that come to mind -- the word "research" is used one way by actual researchers and a different way by most other people.  We were taught the alternate definition of "research" in grade school, when we were assigned "research papers," which meant "go out and look up stuff other people have found out on the topic, and summarize it in your own words."  There's value in doing this; it's a good starting place for understanding a subject, and is honestly where we all began with scholarship.

The problem is -- and it exists even at the grade-school level of inquiry -- this kind of "research" is only as good as the sources you choose.  When I was a teacher, one of the hardest things to get students to understand was that all sources are not created equal.  A paper in Science, or even the layperson's version of it in Scientific American or Discover, is head-and-shoulders above the meanderings of Some Random Guy in his blog.  (And yes, I'm well aware that this pronouncement is being made by Some Random Guy in his blog.)

That doesn't mean those less-reputable sources are necessarily wrong, of course.  It's more that they can't be relied upon.  While papers in Science (and other comparable journals) are occasionally retracted for errors or inaccuracies, there is a vetting process that makes their likelihood of being correct vastly higher.  After all, any oddball with a computer can create a website, and post whatever they want on it, be it brilliant posts about cutting-edge science or the looniest of wingnuttery.

The confusion between the two definitions of the word research has the effect of increasing people's confidence in the kind we were all doing in middle school, giving that low-level snooping an undeserved gloss of respectability.  This was the upshot of a paper in Nature (peer-reviewed science, that) by Kevin Aslett of the University of Central Florida et al., entitled "Online Searches to Evaluate Misinformation Can Increase Its Perceived Veracity."  Their results are kind of terrifying, if not unexpected given the "post-truth society" we've somehow slid into.  The authors write:

Although conventional wisdom suggests that searching online when evaluating misinformation would reduce belief in it... across five experiments, we present consistent evidence that online search to evaluate the truthfulness of false news articles actually increases the probability of believing them...  We find that the search effect is concentrated among individuals for whom search engines return lower-quality information.  Our results indicate that those who search online to evaluate misinformation risk falling into data voids, or informational spaces in which there is corroborating evidence from low-quality sources. 

The tendency appears to be that when someone is "doing their research" on a controversial subject, what they do is an online search, pursued until they find two or three hits on sources that corroborate what they already believed, and that strengthens their conviction that they were right in the first place.  The study found that very little attention was usually given to the quality of those sources, or where those sources got the information themselves.  If it makes the "researcher" nod sagely and say, "Yeah, that's what I thought," it doesn't matter if the information came from NASA -- or from QAnon.

The problem is, a lot of those bogus sources can look convincing. 

Other times, of course, all you have to be able to do is add two-digit numbers to realize that they're full of shit.

People see data in some online source, and rarely consider (1) who collected the data and why, (2) how it was analyzed, (3) what information wasn't included in the analysis, and (4) whether it was verified, and if so how and by whom.  I first ran into the old joke that "73.4% of all statistics are made up on the spot" years ago, and it's still funny, even if our laughs are rather wry these days.  Sites like Natural News, Food Babe, Before It's News, Breitbart.com, Mercola.com, InfoWars, One America News, and even a few with scholarly-sounding names -- The Society for Scientific Exploration, Evolution News, and The American College of Pediatricians are three examples -- are clearinghouses for fringe-y and discredited ideas, often backed up by data that's either cherry-picked and misrepresented, or from sources even further down the ladder of sketchy credibility.

Given how much bullshit is out there, a lot of it well-hidden behind facts, figures, and fancy writing, it can be a challenge for laypeople (and I very much count myself amongst their number) to discern truth from fiction.  It's also an uphill struggle to fight against the very natural human tendency toward confirmation bias; we all would love it if our cherished notions of how the world works were one hundred percent correct.  But if we want to make smart decisions, we all need to stop saying "I did my research" when all that "research" involved was a twenty-minute Google search to find the website of some random crank who confirmed what we already believed.

Remember, as the brilliant journalist Kathryn Schulz points out, that one of the most mind-expanding and liberating things we can say is, "I don't know.  Maybe I'm wrong."  And to start from that open-minded perspective and find out what the facts really are -- from the actual researchers.

****************************************


Tuesday, July 12, 2022

Warning: DNA is everywhere!

Because evidently my generally abysmal opinion of the intelligence of the human species isn't low enough, yesterday a loyal reader sent me an article referencing a survey in which eighty percent of respondents said they favored mandatory labeling of foods that contain DNA.



[Image is in the Public Domain courtesy of the National Institutes of Health]

I kept looking, in vain, for a sign that this was a joke.  Sadly, this is real.  It came from a study done by the Oklahoma State University Department of Agricultural Economics.  And what it shows, in my opinion, is that there are people out there who vote and make important decisions and (apparently) walk upright without dragging their knuckles on the ground, and yet who do not know that DNA is found in every living organism.

Or maybe, they don't know that most of what we eat is made of cells.  I dunno.  Whatever.  Because if you aren't currently on the Salt, Baking Soda, and Scotch Diet, you consume the DNA of plants and/or animals every time you eat.

Lettuce contains lettuce DNA.  Potatoes contain potato DNA.  Beef contains cow DNA.  "Slim Jims" contain -- well, they contain the DNA of whatever the hell Slim Jims are made from.  I don't want to know.  But get the picture?  If you put a label on foods with DNA, the label goes on everything.

Ilya Somin, of the Washington Post, even suggested what such a food-warning label might look like:
WARNING: This product contains deoxyribonucleic acid (DNA).  The Surgeon General has determined that DNA is linked to a variety of diseases in both animals and humans.  In some configurations, it is a risk factor for cancer and heart disease.  Pregnant women are at very high risk of passing on DNA to their children.
Despite the scary sound of Somin's tongue-in-cheek proposed label, there's nothing dangerous about eating DNA.  Enzymes in our small intestines break down the DNA we consume into individual building blocks (nucleotides), and we then use those building blocks to produce our own DNA every time we make new cells.  Which is all the time.  Eating pig DNA will not, as one of my students once asked me, "make us oink."

But this highlights something rather terrifying, doesn't it?  Every other day we're told things like "Thirty Percent of Americans Are Against GMOs" and "Forty Percent of Americans Disbelieve in Anthropogenic Climate Change" and "Thirty-Two Percent of Americans Believe the Earth is Six Thousand Years Old."  (If you're curious, I made those percentages up, because I really don't want to know what the actual numbers are; I'm depressed enough already.)  What the Oklahoma State University study shows is: none of that is relevant.  If eighty percent of Americans don't know what DNA is, why the fuck should I trust what they say on anything else even remotely scientific?

But it's the voting part that scares me, because as we've seen over and over again, dumb people vote for dumb people.  I'm not sure why this is, either, because you'd think that there'd be a sense that even if a lot of voters are dumb themselves, they'd want smart people running the country.  But maybe that'd make all the dumb people feel inferior.  Or maybe it's because the dumb people want to be reassured that they, too, could one day hold public office.

Either way, it's why we end up with public office being held by people like:
  • Mitt Romney: "I believe in an America where millions of Americans believe in an America that’s the America millions of Americans believe in.  That’s the America I love."
  • Louie Gohmert: "We give the military money, it ought to be to kick rears, break things, and come home."
  • Rick Perry: "The reason that we fought the [American] Revolution in the 16th century — was to get away from that kind of onerous crown, if you will."
  • Hank Johnson: "Guam is an island that is, what, twelve miles from shore to shore?  And on its smallest level, uh, smallest, uh, uh, location, it's uh, seven miles, uh, between one shore and the other...  My fear is that (if US Marines are sent there) the whole island will become so populated that it will tip over and capsize."
  • Diana DeGette: "These are ammunition, they’re bullets, so the people who have those now, they’re going to shoot them, so if you ban them in the future, the number of these high-capacity magazines is going to decrease dramatically over time because the bullets will have been shot and there won’t be any more available."
  • James Inhofe: "Well actually the Genesis 8:22 that I use in there is that ‘as long as the earth remains there will be seed time and harvest, cold and heat, winter and summer, day and night,’ my point is, God’s still up there.  The arrogance of people to think that we, human beings, would be able to change what He is doing in the climate is to me outrageous."
  • Henry Waxman: "We're seeing the reality of a lot of the North Pole starting to evaporate, and we could get to a tipping point.  Because if it evaporates to a certain point -- they have lanes now where ships can go that couldn't ever sail through before.  And if it gets to a point where it evaporates too much, there's a lot of tundra that's being held down by that ice cap."
The whole thing is profoundly distressing, and brings to mind the quote from Joseph de Maistre: "Democracy is the form of government in which everyone has a voice, and therefore in which the people get exactly the government they deserve."

Now, bear in mind that what I'm talking about here isn't simple ignorance.  We all have subjects upon which we are ignorant.  If I'm ever in any doubt of that in my own case, all I have to do is wait until the biennial meeting with my financial planner, because as soon as he starts talking about bond values and stocks and annuities and debentures and brokerage accounts, I end up with the same puzzled expression my dog would have if I attempted to teach him quantum physics.  

Ignorance, though, can be cured, with a little hard work and (most importantly) an admission that you actually don't understand everything.  What we're talking about here isn't ignorance alone; it's more like aggressive stupidity.  This is ignorance coupled with a defiant sort of confidence.  This would be like me taking my complete lack of knowledge of economics and finance, and trying to get people to hire me as a financial planner.

It brings to mind once again the quote from the brilliant biochemist, author, and polymath Isaac Asimov, which seems like as good a place as any to end: "There is a cult of ignorance in the United States, and there has always been.  The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that 'my ignorance is just as good as your knowledge.'"

****************************************


Thursday, March 25, 2021

A tsunami of lies

One of the ways the last few years have changed me is that they've made me go into an apoplectic rage when I see people sharing false information on social media.

I'm not talking about the occasional goof; I've had times myself when I've gotten suckered by parody news accounts, and posted something I thought was true that turned out to be some wiseass trying to be funny.  What bothers me is the devastating flood of fake news on everything from vaccines to climate change to politics, exacerbated by "news" agencies like Fox and OAN that don't seem to give a shit about whether what they broadcast is true, only that it lines up with the agenda of their directors.

I've attributed this tsunami of lies to two causes: partisanship and ignorance.  (And to the intersection of partisanship and ignorance, where lie the aforementioned biased media sources.)  If you're ignorant of the facts, of course you'll be prone to falling for an appealing falsehood; and partisanship in either direction makes you much more likely to agree unquestioningly with a headline that lines up with what you already believed to be true.

Turns out -- ironically -- the assumption that the people sharing fake news are partisan, ignorant, or both might itself be an appealing but inaccurate assessment of what's going on.  A study in Nature this week has generated some curious results showing that once again, reality turns out to be more complex than our favored black-and-white assessments of the situation.


[Image is in the Public Domain]

A team made up of Ziv Epstein, Mohsen Mosleh, Antonio Arechar, Dean Eckles, and David Rand (of the Massachusetts Institute of Technology) and Gordon Pennycook (of the University of Regina) decided to see what was really motivating people to share false news stories online, and they found -- surprisingly -- that sheer carelessness played a bigger role than either partisanship or ignorance.  In "Shifting Attention to Accuracy Can Reduce Misinformation Online," the team describes a series of experiments involving over a thousand volunteers that leads us to the heartening conclusion that there might be a better way to stem the flood of lies online than getting people to change their political beliefs or engaging in a massive education program.

The setup of the study was as simple as it was elegant.  They first tested the "ignorance" hypothesis by presenting test subjects with various headlines, some true and some false, and asking them to determine which were which.  It turns out people are quite good at this; there was a full 56-point gap between the likelihood of identifying a headline correctly and the likelihood of getting it wrong.

Next, they tested the "partisanship" hypothesis.  The test subjects did worse on this task, but still the error rate wasn't as big as you might guess; people were still 10% less likely to rate true statements as false (or vice versa) even if those statements agreed with the majority stance of their political parties.  So partisanship plays a role in erroneous belief, but it's not the set of blinders many -- including myself -- would have guessed.

Last -- and this is the most interesting test -- they asked volunteers to assess their likelihood of sharing the news stories online, based upon their headlines.  Here, the difference between sharing true versus false stories dropped to only six percentage points.  Put a different way, people who are quite good at discerning false information overall, and pretty good at recognizing it even when it runs counter to their political beliefs, will often share the story anyhow.

What it seems to come down to is simple carelessness.  It's gotten so easy to share links that we do it without giving it much thought.  I know I've been a bit shame-faced when I've clicked "retweet" to a link on Twitter, and gotten the message, "Don't you want to read the article first?"  (In my own defense, it's usually been because the story in question is from a source like Nature or Science, and I've gotten so excited by whatever it was that I clicked "retweet" right away even though I fully intend to read the article afterward.  Another reason is the exasperating way Twitter auto-refreshes at seemingly random moments, so if you don't respond to a post right away, it might disappear forever.)  

The rate at which people detected (and chose not to share) fake headlines turned out to be remarkably easy to improve.  The researchers found that reminding people of the importance of accuracy at the start of the experiment decreased the volunteers' willingness to share false information, as did asking them to assess the accuracy of the headline before deciding whether to share it. 

It does make me wonder, though, about the role of pivotal "nodes" in the flow of misinformation -- a few highly-motivated people who start the ball of fake news rolling, with the rest of us spreading around the links (whatever our motivation for doing so) in a more piecemeal fashion.  A study by Zignal Labs, for example, found that the amount of deceptive or outright false political information on Twitter went down by a stunning 73% after Donald Trump's account was closed permanently.  (Think of what effect it might have had if Twitter had made this decision back in 2015.)

In any case, to wrap this up -- and to do my small part in addressing this problem -- just remember before you share anything that accuracy matters.  Truth matters.  It's very easy to click "share," but with that ease comes a responsibility to make sure that what we're sharing is true.  We ordinary folk can't dam the flow of bullshit singlehandedly, but each one of us has to take seriously our role in stopping up the leaks, small as they may seem.

******************************************

Last week's Skeptophilia book-of-the-week, Simon Singh's The Code Book, prompted a reader to respond, "Yes, but have you read his book on Fermat's Last Theorem?"

In this book, Singh turns his considerable writing skill toward the fascinating story of Pierre de Fermat, the seventeenth-century French mathematician who -- amongst many other contributions -- touched off over three hundred years of controversy by writing that there were no integer solutions for the equation aⁿ + bⁿ = cⁿ for any integer value of n greater than 2, then adding, "I have discovered a truly marvelous proof of this, which this margin is too narrow to contain," and proceeding to die before elaborating on what this "marvelous proof" might be.
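If you want a feel for why the claim is so tantalizing, here's a toy brute-force search in Python -- purely illustrative (the little solutions() helper is my own invention), and obviously not a proof, since the whole point is that no finite search could ever settle the question for all integers:

def solutions(n, limit=50):
    """Return all (a, b, c) with 1 <= a <= b <= limit and a**n + b**n == c**n."""
    hits = []
    for a in range(1, limit + 1):
        for b in range(a, limit + 1):
            target = a**n + b**n
            c = round(target ** (1 / n))
            # Check the integers near the floating-point root to dodge rounding error.
            for cand in (c - 1, c, c + 1):
                if cand > 0 and cand**n == target:
                    hits.append((a, b, cand))
    return hits

print(solutions(2)[:3])   # Pythagorean triples: (3, 4, 5), (5, 12, 13), (6, 8, 10), ...
print(solutions(3))       # [] -- nothing, exactly as Fermat claimed for every n > 2

Run it with a bigger limit and n = 3, 4, 5, and you'll keep coming up empty -- suggestive, but actually settling it for every integer took until 1994.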

The attempts to recreate Fermat's proof -- or at least find an equivalent one -- began with Fermat's contemporaries Marin Mersenne, Blaise Pascal, and John Wallis, and continued for the next three centuries to stump the greatest minds in mathematics (Évariste Galois's work on group theory, a century and a half later, would prove crucial to the eventual solution).  Fermat's conjecture was finally proven correct by Andrew Wiles in 1994.

Singh's book Fermat's Last Theorem: The Story of a Riddle that Confounded the World's Greatest Minds for 350 Years describes the hunt for a solution and the tapestry of personalities that took on the search -- ending with a tour-de-force paper by soft-spoken British mathematician Andrew Wiles.  It's a fascinating journey, as enjoyable for a curious layperson as it is for the mathematically inclined -- and in Singh's hands, makes for a story you will thoroughly enjoy.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]



Tuesday, December 10, 2019

Misremembering the truth

There are two distinct, but similar-sounding, cognitive biases that I've written about many times here at Skeptophilia because they are such tenacious barriers to rational thinking.

The first, confirmation bias, is our tendency to uncritically accept claims when they fit with our preconceived notions.  It's why a lot of conservative viewers of Fox News and liberal viewers of MSNBC sit there watching and nodding enthusiastically without ever stopping and saying, "... wait a moment."

The other, dart-thrower's bias, is more built-in.  It's our tendency to notice outliers (because of their obvious evolutionary significance as danger signals) and ignore, or at least underestimate, the ordinary as background noise.  The name comes from the thought experiment of being in a bar while there's a darts game going on across the room.  You'll tend to notice the game only when there's an unusual throw -- a bullseye, or perhaps impaling the bartender in the forehead -- and not even be aware of it otherwise.

Well, we thought dart-thrower's bias was more built into our cognitive processing system and confirmation bias more "on the surface" -- and the latter therefore more culpable, conscious, and/or controllable.  Now, it appears that confirmation bias might be just as hard-wired into our brains as dart-thrower's bias is.

A paper appeared this week in Human Communication Research, describing research conducted by a team led by Jason Coronel of Ohio State University.  In "Investigating the Generation and Spread of Numerical Misinformation: A Combined Eye Movement Monitoring and Social Transmission Approach," Coronel, along with Shannon Poulsen and Matthew D. Sweitzer, did a fascinating series of experiments that showed we not only tend to accept information that agrees with our previous beliefs without question, we honestly misremember information that disagrees -- and we misremember it in such a way that in our memories, it further confirms our beliefs!

The location of memories (from Memory and Intellectual Improvement Applied to Self-Education and Juvenile Instruction, by Orson Squire Fowler, 1850) [Image is in the Public Domain]

What Coronel and his team did was to present 110 volunteers with passages containing true numerical information on social issues (such as support for same-sex marriage and rates of illegal immigration).  In some cases, the passages agreed with what (according to polls) most people believe to be true, such as that the majority of Americans support same-sex marriage.  In other cases, the passages contained information that (while true) is widely thought to be untrue -- such as the fact that illegal immigration across the Mexican border has been dropping for years and is now at its lowest rates since the mid-1990s.

Across the board, people tended to recall the information that aligned with the conventional wisdom correctly, and the information that didn't incorrectly.  Further -- and what makes this experiment even more fascinating -- is that when people read the unexpected information, data that contradicted the general opinion, eye-tracking monitors recorded that they hesitated while reading, as if they recognized that something was strange.  In the immigration passage, for example, they read that the rate of immigration had decreased from 12.8 million in 2007 to 11.7 million in 2014, and the readers' eyes bounced back and forth between the two numbers as if their brains were saying, "Wait, am I reading that right?"

So they spent longer on the passage that conflicted with what most people think -- and still tended to remember it incorrectly.  In fact, most of the people who misremembered got the numbers themselves right -- 12.8 million and 11.7 million -- showing that they'd paid attention and didn't just scoff and gloss over the passage when they hit something they thought was incorrect.  But when questioned afterward, they remembered the numbers backwards, as if the passage had actually supported what they'd believed prior to the experiment!

If that's not bad enough, Coronel's team then ran a second experiment, in which the test subjects read the passage, then had to repeat the gist to another person, who then passed it to another, and so on.  (Remember the elementary-school game of "Telephone"?)  Not only did the data get flipped -- usually in the first transfer -- but the difference between the two numbers also got greater and greater with each retelling (thus bolstering the false, but popular, opinion even more strongly).  In the case of the immigration statistics, the gap between 2007 and 2014 not only changed direction, but by the end of the game had widened from 1.1 million to 4.7 million.

This gives you an idea what we're up against in trying to counter disinformation campaigns.  And it also illustrates that I was wrong in one of my own preconceived notions: that people falling for confirmation bias are somehow guilty of deliberately locking themselves into an echo chamber.  Apparently, both dart-thrower's bias and confirmation bias are somehow built into the way we process information.  We become so certain we're right that our brains subconsciously reject any evidence to the contrary.

Why our brains are built this way is a matter of conjecture.  I wonder if perhaps it might be our tribal heritage at work; that conforming to the norm, and therefore remaining a member of the tribe, has a greater survival value than being the maverick who sticks to his/her guns about a true but unpopular belief.  That's pure speculation, of course.  But what it illustrates is that once again, our very brains are working against us in fighting Fake News -- which these days is positively frightening, given how many powerful individuals and groups are, in a cold and calculated fashion, disseminating false information in an attempt to mislead us, frighten us, or anger us, and so maintain their positions of power.

***********************

This week's Skeptophilia book of the week is brand new: Brian Clegg's wonderful Dark Matter and Dark Energy: The Hidden 95% of the Universe.  In this book, Clegg outlines "the biggest puzzle science has ever faced" -- the evidence for the substances that provide the majority of the gravitational force holding the nearby universe together, while simultaneously making the universe as a whole fly apart -- and which have (thus far) completely resisted all attempts to ascertain their nature.

Clegg also gives us some of the cutting-edge explanations physicists are now proposing, and the experiments that are being done to test them.  The science is sure to change quickly -- every week we seem to hear about new data providing information on the dark 95% of what's around us -- but if you want the most recently-crafted lens on the subject, this is it.

[Note: if you purchase this book from the image/link below, part of the proceeds goes to support Skeptophilia!]





Saturday, March 9, 2019

Rules of engagement

One of my besetting sins is being easily frustrated and taking things way too seriously.

Which is why recent developments in government have made me want to punch a wall.  Paul Manafort's softball prison sentence. Trump's demand that tornado-struck Alabama -- a red state -- receive "A-one treatment" with respect to disaster aid, while California -- a blue state -- was told they "should have raked their leaves" when they experienced the worst wildfires in the state's history.  The fact that a bunch of Republican legislators in New Hampshire thought it was appropriate to wear strings of pearls when confronted with gun law activists (ridiculing them by implying they were "clutching their pearls" -- making a big deal out of nothing).

All of those had me grinding my teeth down to nubs out of a sense of impotent rage.  A feeling of helplessness has become endemic in the last two years -- that we're powerless to stop the freewheeling corruption of this administration, the blind eye being turned toward Russian interference in American elections, and the complicity of lawmakers (exemplified by the smirking Mitch McConnell, who just this week said he wasn't going to bring an election reform bill onto the Senate floor purely because he "gets to decide").

So it was a bit of a relief to read a paper that appeared in Nature last week.  It's by climatologists Justin Farrell and Kathryn McConnell (of Yale University) and Robert Brulle (of Brown University), and is titled "Evidence-based Strategies to Combat Scientific Misinformation."

Unsurprisingly -- even had I not already told you the field Farrell et al. work in -- the specific misinformation they're referring to is anthropogenic climate change.  The authors write:
Nowhere has the impact of scientific misinformation been more profound than on the issue of climate change in the United States.  Effective responses to this multifaceted problem have been slow to develop, in large part because many experts have not only underestimated its impact, but have also overlooked the underlying institutional structure, organizational power and financial roots of misinformation.  Fortunately, a growing body of sophisticated research has emerged that can help us to better understand these dynamics and provide the basis for developing a coordinated set of strategies across four related areas (public inoculation, legal strategies, political mechanisms and financial transparency) to thwart large-scale misinformation campaigns before they begin, or after they have taken root.
Which is packing a lot into a single paragraph.  They are unhesitatingly (and correctly) blaming the doubts in the public's mind over climate change on a large-scale -- and deliberate -- misinformation campaign on the part of the fossil fuels industry and the politicians they're funding.  In a press release from Yale University on the research, lead author Justin Farrell said:
Many people see these efforts to undermine science as an increasingly dangerous challenge and they feel paralyzed about what to do about it.  But there’s been a growing amount of research into this challenge over the past few years that will help us chart out some solutions...  Ultimately we have to get to the root of the problem, which is the huge imbalance in spending between climate change opponents and those lobbying for new solutions.  Those interests will always be there, of course, but I’m hopeful that as we learn more about these dynamics things will start to change.  I just hope it’s not too late.
Farrell et al. describe four realms that need to be addressed to counter these misinformation campaigns.  They are:
  • Public inoculation -- presenting the public with refuted arguments (including how they've been refuted) before the disinformation specialists have a chance to launch their campaign, so non-scientists are immune to their effects, and upon hearing them, will say, "Oh, yeah, that.  That's already been disproven."
  • Legal strategies -- actively targeting fossil fuel companies (and their lobbyists) with lawsuits when they libel reputable climate scientists with accusations of bias or outright falsification of data.  The difficulty is that the fossil fuel companies have way deeper pockets than do environmental activists -- but at least the attempt will bring the smear tactics into the public eye.
  • Political mechanisms -- focusing on research into how the political process has been subverted by corporate anti-environmental interests.
  • Financial transparency -- promoting legislation requiring public disclosure of who is funding political candidates, and encouraging investigation into elected officials whose actions have been compromised by donations from corporations.
Having concrete strategies to approach the problem is good, but the difficulty is, many of these rely on laws being passed by senators and representatives who are already compromised and have every reason to block change.  "We’re really just at the tip of the iceberg in terms of understanding the full network of actors and how they’re moving money in these efforts," said study co-author Kathryn McConnell.  "The better we can understand how these networks work, the better the chances that policymakers will be able to create policy that makes a difference."

[Image is in the Public Domain]

Which is an optimistic outlook.  Still, it's frustrating that any efforts in these directions are bound to be glacially slow, and my sense is that we don't really have much time left in which to act.  But the fact that this research is out there is a good first step.  Now we need to make certain that it doesn't simply sink into obscurity like most of the research on climate has done, buried under the sneering climate denialism of Fox News.

What it highlights is that this is a battle we can win.  Not that it'll be easy or quick; overcoming the mountain of misinformation out there, deliberately created by groups whose priority is short-term profit over the long-term habitability of the Earth, won't happen overnight.

But the fact that a team of climatologists thinks it can happen at all is encouraging.  Despite what feels like a daily losing battle against succumbing to despair, it's not time to give up.

If they think we can win, maybe we should, too.

********************************

This week's Skeptophilia book recommendation is not only a fantastic read, it's a cautionary note on the extent to which people have been able to alter the natural environment, and how difficult it can be to fix what we've trashed.

The Control of Nature by John McPhee is a lucid, gripping account of three times humans have attempted to alter the outcome of natural processes -- the nearly century-old work by the Army Corps of Engineers to keep the Mississippi River within its banks and stop it from altering its course down what is now the Atchafalaya River, the effort to mitigate the combined hazards of wildfires and mudslides in California, and the now-famous desperate attempt by Icelanders to stop a volcanic eruption from closing off their city's harbor.  McPhee interviews many of the people who were part of each of these efforts, so -- as is typical with his writing -- the focus is not only on the events, but on the human stories behind them.

And it's a bit of a chilling read in today's context, when politicians in the United States are one and all playing a game of "la la la la la, not listening" with respect to the looming specter of global climate change.  It's a must-read for anyone interested in the environment -- or in our rather feeble attempts to change its course.

[If you purchase the book from Amazon using the image/link below, part of the proceeds goes to supporting Skeptophilia!]





Wednesday, October 11, 2017

Course correction

I suppose you could say that everything I write here at Skeptophilia has the same overarching theme: how to tell truth from falsehood, how to recognize spurious claims, how to tell if you're being had.  But helping people to do this is an uphill struggle, and just how uphill was highlighted by a meta-analysis published last week in Psychological Science (the journal of the Association for Psychological Science), which had the rather dismal conclusion that we debunkers are kind of fucked no matter what we do.

Of course, being academics, they didn't state it that way.  Here's how the authors phrased it:
This meta-analysis investigated the factors underlying effective messages to counter attitudes and beliefs based on misinformation.  Because misinformation can lead to poor decisions about consequential matters and is persistent and difficult to correct, debunking it is an important scientific and public-policy goal. This meta-analysis revealed large effects for presenting misinformation, debunking, and the persistence of misinformation in the face of debunking.  Persistence was stronger and the debunking effect was weaker when audiences generated reasons in support of the initial misinformation.  A detailed debunking message correlated positively with the debunking effect.  Surprisingly, however, a detailed debunking message also correlated positively with the misinformation-persistence effect.
Put more simply, the authors, Man-pui Sally Chan, Christopher R. Jones, and Kathleen Hall Jamieson of the University of Pennsylvania, and Dolores Albarracín of the University of Illinois at Urbana-Champaign, found that when confronting misinformation, a detailed response generates some degree of correction -- but makes some people double down on their incorrect understanding.

So it's yet another verification of the backfire effect, which makes it a little hard to see how we skeptics are supposed to move forward.  And it becomes even worse when people have been taught to distrust the very sources that could potentially ameliorate the problem; I can't tell you how many times I've seen posts stating that sites like Snopes and FactCheck.org are flawed, hopelessly biased, or themselves have an agenda to pull the wool over people's eyes.

It's like I've said before: once you convince people to doubt the facts, and that everyone is lying, you can convince them of anything.

[image courtesy of photographer John Snape and the Wikimedia Commons]

"The effect of misinformation is very strong," said co-author Dolores Albarracín.  "When you present it, people buy it.  But we also asked whether we are able to correct for misinformation.  Generally, some degree of correction is possible but it’s very difficult to completely correct."

The authors weren't completely doom-and-gloom, however, and made three specific recommendations for people dedicated to skepticism and the truth.  These are:
  • Reduce arguments that support misinformation: the media needs to be more careful about inadvertently repeating or otherwise giving unwarranted credence to the misinformation itself.
  • Engage audiences in scrutiny and counterarguing of information: schools, especially, should promote skepticism and critical thinking.  It is beneficial to have the audience involved in generating counterarguments -- further supporting the general idea of "teach people how to think, not what to think."
  • Introduce new information as part of the debunking message: give evidence and details.  Even though "misinformation persistence" is strong even in the face of detailed debunking, there was a positive correlation between detailed information and correction of misapprehension.  So: don't let the backfire effect stop you from fighting misinformation.
It may be an uphill battle, but it does work, and is certainly better than the alternative, which is giving up.  As Albarracín put it: "What is successful is eliciting ways for the audience to counterargue and think of reasons why the initial information was incorrect."

I think the most frustrating part of all this for me is that there are biased media sources.  Lots of them.  Some of them (so-called "clickbait") post bullshit to drive up ad revenue; others are simply so ridiculously slanted that anything they publish should be independently verified every single time.  And because people tend to gravitate toward media that agree with what they already thought was true, sticking with sources that conform to your own biases makes it unlikely that you'll see where you're wrong (confirmation bias), and allows you to persist in that error because you're surrounding yourself with people who are saying the same thing (the echo-chamber effect).

And that one, I don't know how to address.  It'd be nice if the fringe media would act more responsibly -- but we all know that's not going to happen any time soon.  So I'll just end with an exhortation for you to broaden the media you do read -- if you're conservative, check out the arguments on MSNBC every once in a while (and give them serious thought; don't just read, scoff, and turn away).  Same if you're a liberal; hit Fox News on occasion.  It may not change your mind, but at least it'll make it more likely that you'll discover the holes in your own thinking.