Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.
Showing posts with label cognitive bias.

Saturday, August 27, 2022

Perception and suggestion

One topic that has come up over and over again here at Skeptophilia is the rather unsettling idea that the high opinion most of us have of our perceptions and memories is entirely unjustified.  Every time we're tempted to say "I saw it with my own eyes" or "of course it happened that way, I remember it," it should be a red flag reminding us of how inaccurate our brains actually are.

Now, to be fair, they work well enough.  It'd be a pretty significant evolutionary disadvantage if our sensory processing organs and memory storage were as likely to be wrong as right.  But a system that is built to work along the lines of "meh, it's good enough to get by, at least by comparison to everyone else" -- and let's face it, that's kind of how evolution works -- is inevitably going to miss a lot.  "Our experience of reality," said neuroscientist David Eagleman, "is constrained by our biology."  He talks about the umwelt -- the world as experienced by a particular organism -- and points out that each species picks up a different tiny slice of all the potential sensory inputs that are out there, and effectively misses everything else.

It also means that even of the inputs in our particular umwelt, the brain is going to make an executive decision regarding which bits are important to pay attention to.  People with normal hearing (for example) are constantly bombarded by background sounds, which most of us, most of the time, simply ignore as irrelevant.  In my intro to neuroscience classes, I used to point this out by asking students how many of them were aware (prior to my asking the question) of the sound of the fan running in the heater.  Afterward, of course, they were; beforehand, the sound waves were striking their ears and triggering nerve signals to the brain just like any other noise, but the brain was basically saying "that's not important."   (Once it's pointed out, of course, you can't not hear it; one of my students came into my room four days later, scowled at me, and said, "I'm still hearing the heater.  Thanks a lot.")

The point here is that we are about as far away from precision reality-recording equipment as you can get.  What we perceive and recall is a small fraction of what's actually out there, and is remembered only incompletely and inaccurately.

The Doors of Perception by Alan Levine [Image licensed under the Creative Commons cogdogblog, Doors of Perception (15354754466), CC BY 2.0]

Worst of all, what we do perceive and recall is also modified by what we think we should be perceiving and recalling.  This point was underscored by some cool new research done by a team led by Hernán Anlló at the Université Paris Sciences et Lettres, which showed that all it takes is a simple (false) suggestion of what we're seeing to foul up our perception completely.

The experiment was simple and elegant.  Subjects were shown a screen with an image of a hundred dots colored either blue or yellow.  Some of the screens had exactly fifty of each; others were sixty/forty (one way or the other).  The volunteers were then asked to estimate the proportions of the colors on a sequence of different screens, and to give an assessment of how confident they were in their guess.

The twist is that half of the group was given a "hint" -- a statement that in some of the screens, one of the colors was twice as frequent as the other.  (Which, of course, is never true.)  And this "hint" caused the subjects not only to mis-estimate the color frequencies, but to be more confident in their wrong guesses, especially in volunteers for whom a post-test showed a high inclination toward social suggestibility.
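Just to make the setup concrete, here's a minimal sketch in Python of the kind of stimulus the subjects saw, along with one toy way a false "hint" could pull an estimate toward the suggested ratio.  The weighted-blend model is purely my own illustration of the idea, not anything from Anlló's paper, and the "suggestibility" parameter is an invented stand-in for the post-test measure the team used.

```python
import random

def make_screen(n_dots=100, blue_fraction=0.5):
    # A screen is just a shuffled list of 'blue'/'yellow' labels.
    n_blue = round(n_dots * blue_fraction)
    dots = ["blue"] * n_blue + ["yellow"] * (n_dots - n_blue)
    random.shuffle(dots)
    return dots

def toy_estimate(dots, hinted_blue_fraction=None, suggestibility=0.0, noise=0.05):
    # Toy model (my assumption, not the paper's): a noisy perceptual estimate,
    # optionally pulled toward a falsely hinted proportion by a 0-1
    # "suggestibility" weight.
    true_fraction = dots.count("blue") / len(dots)
    perceived = true_fraction + random.gauss(0, noise)      # imperfect perception
    if hinted_blue_fraction is not None:
        perceived = ((1 - suggestibility) * perceived
                     + suggestibility * hinted_blue_fraction)
    return min(max(perceived, 0.0), 1.0)

if __name__ == "__main__":
    screen = make_screen(blue_fraction=0.6)                  # a 60/40 screen
    plain = toy_estimate(screen)
    hinted = toy_estimate(screen, hinted_blue_fraction=2/3,  # "twice as many blue"
                          suggestibility=0.3)
    print(f"true: 0.60   no hint: {plain:.2f}   with hint: {hinted:.2f}")
```

Run enough of these and the hinted estimates drift toward two-thirds while the unhinted ones hover near the true value -- which is, in miniature, the pattern the study reports, with the size of the drift tracking suggestibility.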

As easily understood as the experiment is, it has some profound implications.  "Information is circulating with unprecedented speed, and it even finds its way into our social feeds against our will sometimes," Anlló said.  "It’s becoming increasingly difficult to observe events without having to go through some level of information on those events beforehand (e.g. buying a shirt, but not before reading its reviews online).  What we are looking at in our research here is how much the information you receive is going to contribute to the construction of your perceptual reality, and fundamentally, what are the individual psychological features that condition the impact that that information will have in shaping what you see and think, whether you like it or not.  Of course, we are not talking about enormous effects that can completely distort the world around you (e.g., no amount of false/imprecise information can make you misperceive a small bird as a 3-ton truck), but what our study shows is that, provided you are permeable enough to social influence (which we all are, the key here being how much), then false information can slightly shift your perception in whatever direction the information points."

What this means, of course, is that we have to be constantly aware of our built-in capacity for being fooled.  And although we clearly vary in that capacity, we shouldn't fall for believing "I'm seeing reality, it's everyone else who is wrong."  The truth is, we're all prone to inaccurate perception and recall, and all capable of having the power of suggestion alter what we see.  "Perception is a complex construction, and information is never an innocent bystander in this process," Anlló said.  "Always be informed, but make sure that your sources are of high quality, and trustworthy.  Importantly, when I say high-quality I do not mean a source that you may trust because of emotional reasons or social links, but rather by the accuracy of the information they provide and the soundness of the evidence.  Indeed, our experiment shows that your level of suggestibility to your social environment (how much you dress like your friends, or feel influenced by their taste in music) will also predict your permeability to perceptual changes triggered by false information.  This, much like many other cognitive biases, is part of the human experience, and essentially nothing to worry about.  Being susceptible to your social environment is actually a great thing that makes us humans thrive as a species, we just need to be aware of it and try our best to limit our exposure to bad information."

The most alarming thing of all, of course, is that the people who run today's news media are well aware of this capacity, and use it to reinforce the perception by their consumers that only they are providing accurate information.  "Listen to us," they tell us, "because everyone else is lying to you."  The truth is, there is no unbiased media; given that their profits are driven by telling viewers the bit of the news that supports what they think the viewers already want to believe, they have exactly zero incentive to provide anything like balance.  The only cure is to stay as aware as we can of our own capacity for being fooled, and to stick as close to the actual facts as possible (and, conversely, as far away as possible from the talking heads and spin-meisters who dominate the nightly news on pretty much whichever channel you choose).

If our perceptions of something as simple and concrete as the number of colored dots on a screen can be strongly influenced by a quick and inaccurate "hint," how much easier is it to alter our perception of the world with respect to complex and emotionally-laden issues -- especially when there's a powerful profit motive on the part of the people giving us the hints?

****************************************


Saturday, November 9, 2019

Poisoned by preconceived notions

If you needed something else to make you worry about our capacity to make decisions based on facts, look no further than a study that came out this week from the University of Texas at Austin.

Entitled "Fake News on Social Media: People Believe What They Want to Believe When it Makes No Sense At All," the study was conducted by Patricia L. Moravec, Randall K. Minas, and Alan R. Dennis of the McCombs School of Business.  And its results should be seriously disheartening for just about everyone.

What they did was a pair of experiments using students who were "social media literate" -- i.e., they should know social media's reputation for playing fast and loose with the truth -- first having them evaluate fifty headlines as true or false, and then giving them headlines with "Fake News" flags appended.  In each case, there was an even split -- in the first experiment, between true and false headlines, and in the second, between true and false headlines flagged as "Fake."

In both experiments, the subjects were hooked up to an electroencephalogram (EEG) machine, to monitor their brain activity as they performed the task.

In the first experiment, it was found -- perhaps unsurprisingly -- that people are pretty bad at telling truth from lies when presented only with a headline.  But the second experiment is the more interesting, and also the more discouraging, because the researchers found that flagging a true headline as false, or a false headline as true, causes a huge spike in activity in the prefrontal cortex -- a sign of cognitive dissonance as the subject tries desperately to figure out how this can be so -- but only if the flag disagrees with what the subject already believed.


[Image is in the Public Domain]

So we're perfectly ready to believe the truth is a lie, or a lie is the truth, if it fits our preconceived notions.  And worse still, the researchers saw that even when subjects showed an uncomfortable amount of cognitive processing upon being confronted with something that contradicted what they believed, it had little influence over what they believed after the experiment was over.

In other words, you can label the truth a lie, or a lie the truth, but it won't change people's minds if they already believed the opposite.  Our ability to discern fact from fiction, and use that information to craft our view of the world, is poisoned by our preconceived notions of what we'd like to be true.

Before you start pointing fingers, the researchers also found that there was no good predictor of how well subjects did on this test.  They were all bad -- Democrats and Republicans, higher IQ and lower IQ, male and female.

"When we’re on social media, we’re passively pursuing pleasure and entertainment," said Patricia Moravec, who was lead author of the study, in an interview with UT News.  "We’re avoiding something else...  The fact that social media perpetuates and feeds this bias complicates people’s ability to make evidence-based decisions.  But if the facts that you do have are polluted by fake news that you truly believe, then the decisions you make are going to be much worse."

This is insidious because even if we are just going on social media to be entertained, the people posting political advertisements on social media aren't.  They're trying to change our minds.  And what the Moravec et al. study shows is that we're not only lousy at telling fact from fiction, we're very likely to get suckered by a plausible-sounding lie (or, conversely, to disbelieve an inconvenient truth) if it fits with our preexisting political beliefs.

Which makes it even more incumbent on the people who run social media platforms (yeah, I'm lookin' at you, Mark Zuckerberg) to have on-staff fact checkers who are empowered to reject ads on both sides of the political aisle that are making false claims.  It's not enough to cite free speech rights as an excuse for abdicating your duty to protect people from immoral and ruthless politicians who will say or do anything to gain or retain power.  The people in charge of social media are under no obligation to run any ad someone's willing to pay for.  It's therefore their duty to establish criteria for which ads are going to show up -- and one of those criteria should surely be whether it's the truth.

The alternative is that our government will continue to be run by whoever has the cleverest, most attractive propaganda.  And as we've seen over the past three years, this is surely a recipe for disaster.

**********************************

This week's Skeptophilia book recommendation is a fun book about math.

Bet that's a phrase you've hardly ever heard uttered.

Jordan Ellenberg's amazing How Not to Be Wrong: The Power of Mathematical Thinking looks at how critical it is for people to have a basic understanding and appreciation for math -- and how misunderstandings can lead to profound errors in decision-making.  Ellenberg takes us on a fantastic trip through dozens of disparate realms -- baseball, crime and punishment, politics, psychology, artificial languages, and social media, to name a few -- and how in each, a comprehension of math leads you to a deeper understanding of the world.

As he puts it: math is "an atomic-powered prosthesis that you attach to your common sense, vastly multiplying its reach and strength."  Which is certainly something that is drastically needed lately.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]





Saturday, September 14, 2019

The illusion of truth

Because we apparently need one more cognitive bias to challenge our confidence in what we hear on the news on a daily basis, today I'm going to tell you about the illusory truth effect.

The idea here is that if you hear a falsehood repeated often enough, in your mind, it becomes a fact.  This is the "big lie" approach that Hitler describes in Mein Kampf:
All this was inspired by the principle—which is quite true within itself—that in the big lie there is always a certain force of credibility; because the broad masses of a nation are always more easily corrupted in the deeper strata of their emotional nature than consciously or voluntarily; and thus in the primitive simplicity of their minds they more readily fall victims to the big lie than the small lie, since they themselves often tell small lies in little matters but would be ashamed to resort to large-scale falsehoods. 
It would never come into their heads to fabricate colossal untruths, and they would not believe that others could have the impudence to distort the truth so infamously.  Even though the facts which prove this to be so may be brought clearly to their minds, they will still doubt and waver and will continue to think that there may be some other explanation.  For the grossly impudent lie always leaves traces behind it, even after it has been nailed down, a fact which is known to all expert liars in this world and to all who conspire together in the art of lying.
But the most referenced quote framing this idea comes from Nazi Propaganda Minister Joseph Goebbels: "If you tell a lie big enough and keep repeating it, people will eventually come to believe it."

Which is more than a little ironic, because there's no evidence Goebbels ever said (or wrote) that -- although he certainly did embody the spirit of it.

The topic comes up because of a study that appeared in Cognition this week, called "An Initial Accuracy Focus Prevents Illusory Truth," by psychologists Nadia M. Brashier (of Harvard University) and Emmaline Drew Eliseev and Elizabeth J. Marsh (of Duke University).  What they found was simultaneously dismaying and heartening: it is very easy to get people to fall for illusory truth through repetition, but they can be inoculated against it by having them read the source material with a critical eye the first time, striking out erroneous information.  Doing that, apparently, protects them from falling for the lie later, even after repeated exposure.

[Image licensed under the Creative Commons RyanMinkoff, Academic dishonesty, CC BY-SA 4.0]

What's especially frightening about the dismaying part of this study is that the effect works even for purely factual, easily checkable information.  One of the statements they used was "The fastest land mammal is the leopard," which most people recognize as false (the fastest land mammal is the cheetah).  The surmise is that if you keep seeing the same incorrect statement, you begin to doubt your own understanding or your own memory.

I know this happens to me.  There are few topics I'm so completely confident about that I could hear someone make a contradicting statement and think, "No, that's definitely wrong."  I'm much more likely to think, "Wait... am I remembering incorrectly?"  Part of the problem is that I'm a raging generalist; I know a little bit about a great many things, so if an expert comes along and says I've got it wrong, I'm putting my money on the expert.  (I've also been called a "dilettante" or a "dabbler" or "a light year across and an inch deep," but on the whole I like "generalist" better.)

The problem is, it's easy to mistake someone who simply speaks with a lot of confidence for an expert.  Take, for example, Donald Trump.  (Please.  No, really, please.  Take him.)  He's lied so many times there's a whole Wikipedia page devoted to "Veracity of Statements by Donald Trump."  As only one example of the illusory truth effect, take his many-times-repeated statement that he would have won the popular vote if it hadn't been for millions of votes cast fraudulently for Hillary Clinton, and also that his electoral college win was "the biggest landslide in history" (it wasn't even close; of the 58 presidential elections the United States has had, Donald Trump's electoral college win comes in at #46).

The problem is, Trump makes these statements with so much confidence, and with such frequency, that it's brought up the question of whether he actually believes them to be true.  Even if he's lying, the technique is remarkably effective -- a sort of Gish gallop of falsehood (the term named after creationist Duane Gish, who was known for swamping his debate opponents with rapid-fire arguments of dubious veracity, wearing them down simply by the overall volume).  A lot of his supporters believe that he won by a landslide, that Clinton only did as well as she did because of rampant fraud, and a host of other demonstrably false beliefs (such as the size of Trump's inauguration crowd, attendance at his rallies, how well the economy is doing, and that the air and water in the United States are the highest quality in the world).

So to put the research by Brashier et al. to work, somehow people would have to be willing and able to fact check these statements as they're happening, the first time they hear them -- not very likely, especially given the role of confirmation bias in affecting how much people believe these statements at the outset (someone who supports Trump already would be more likely to believe him, for example when he's stated that the number of illegal immigrants is the highest it's ever been, when in fact it peaked in 2007 and has been falling steadily ever since).

In any case, it's hard to see how all this helps us.  The traction of "alternative facts" has simply become too great, as has the vested interest of partisan and sensationalized media.  Not for nothing do Brashier et al. call our current situation "the post-truth world."

********************************************

This week's Skeptophilia book recommendation is pure fun: science historian James Burke's Circles: Fifty Round Trips Through History, Technology, Science, and Culture.  Burke made a name for himself with his brilliant show Connections, where he showed how one thing leads to another in discoveries, and sometimes two seemingly unconnected events can have a causal link (my favorite one is his episode about how the invention of the loom led to the invention of the computer).

In Circles, he takes us through fifty examples of connections that run in a loop -- jumping from one person or event to the next in his signature whimsical fashion, and somehow ending up in the end right back where he started.  His writing (and his films) always have an air of magic to me.  They're like watching a master conjuror create an illusion, and seeing what he's done with only the vaguest sense of how he pulled it off.

So if you're an aficionado of curiosities of the history of science, get Circles.  You won't be disappointed.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]





Tuesday, August 2, 2016

The universality of prejudice

One of the most insidious of biases is the perception that you are not biased.

Of course everyone else has their blind spots, their misapprehensions, their unquestioned assumptions about the world.  You, on the other hand?  You see the world through these perfectly clear lenses.  As Kathryn Schulz put it in her phenomenal TED Talk "On Being Wrong," "Of course we all accept that we're fallible, that we make mistakes in the abstract sense.  But try to think of one thing, one single thing, that you're wrong about now?  You can't do it."

Social psychologists Mark Brandt (of Tilburg University in the Netherlands) and Jarret Crawford (of the College of New Jersey) published a study this week in the journal Social Psychological and Personality Science that delivers a death blow to this perception, and underscores the fact that none of us are free from prejudice.  Our sense that prejudice is the bailiwick of the unintelligent turns out to be less than a half truth.  Your level of cognitive ability doesn't predict whether or not you're prejudiced -- it only predicts the sorts of things you're likely to be prejudiced about.

Through an analysis of survey data from 5,914 people in the United States, Brandt and Crawford drew conclusions that should give all of us pause.  Their results, which seem to be robust, indicate that people of low cognitive ability (as assessed by a test of verbal ability) tend to express prejudice toward groups perceived as liberal or unconventional (such as gays and atheists) and also toward groups for which membership is not a choice (such as ethnic minorities).  People of high cognitive ability are not less prejudiced; they simply show the opposite pattern, expressing prejudice toward groups perceived as conservative or conventional, and toward groups for which membership is by choice (such as Christians, Republicans, the military, and big business).

"There are a variety of belief systems and personality traits that people often think protect them from expressing prejudice," Brandt explains.  "In our prior work we found that people high and low in the personality trait of openness to experience show very consistent links between seeing a group as ‘different from us’ and expressing prejudice towards that group.  The same appears to be true for cognitive ability.

"Whereas prior work by others found that people with low cognitive ability express more prejudice, we found that this is limited to only some target groups.  For other target groups the relationship was in the opposite direction.  For these groups, people with high levels of cognitive ability expressed more prejudice.  So, cognitive ability also does not seem to make people immune to expressing prejudice."

[image courtesy of the Wikimedia Commons]

It's a finding that's well worth keeping in mind.  The key seems to be not in eliminating prejudice (a goal that is probably impossible) but in placing our prejudices out in the open where we can keep an eye on them.  If I (for example) have the inclination to believe that Democrats are wishy-washy bleeding hearts who don't give a rat's ass about national security, it's important for me to keep that bias in my conscious mind -- and to listen more carefully to Democrats when they speak, because otherwise I'm likely to let my assumptions do the thinking for me.

Even more critical, though, is to keep biases in mind when you're listening to someone you're inclined to agree with.  If, on the other hand, you're prone to thinking that Democrat = correct, be on your guard, because that assumption of righteousness is going to blind you to what you're actually being told.  How many times have we given a pass to someone who has turned out to be spouting nonsense, simply because (s)he belongs to the same political party, religion, or ethnic group as we do?

The bottom line is, be aware of your biases, and don't be afraid to challenge them.  Keep your brain turned on.  The human mind is rife with prejudice, unquestioned assumptions, and sloppy thinking, and that's not just true of the people you disagree with.  It's all of us, all of the time.  The best thinkers aren't the ones who expunge all such mental murk from their brains; they're just the ones who are the most determined to question their own mental set rather than assuming that it must be right about everything.

Monday, July 25, 2016

Fooling the experts

Today we consider what happens when you blend Appeal to Authority with the Dunning-Kruger Effect.

Appeal to Authority, you probably know, is when someone uses credentials, titles, or educational background -- and no other evidence -- to support a claim.  Put simply, it is the idea that if Stephen Hawking said it, it must be true, regardless of whether the claim has anything to do with Hawking's particular area of expertise.  The Dunning-Kruger Effect, on the other hand, is the idea that people -- especially the least competent -- tend to wildly overestimate their abilities, even in the face of evidence to the contrary, which is why we all think we're above-average drivers.

Well, David Dunning (of the aforementioned Dunning-Kruger Effect) has teamed up with Cornell University researchers Stav Atir and Emily Rosenzweig, and come up with the love child of Dunning-Kruger and Appeal to Authority.  And what this new phenomenon -- dubbed, predictably, the Atir-Rosenzweig-Dunning Effect -- shows us is that people who are experts in a particular field tend to think that expertise holds true even for disciplines far outside their chosen area of study.

[image courtesy of the Wikimedia Commons]

In one experiment, the three researchers asked people to rate their own knowledge in various academic areas, then asked them to rank their level of understanding of various finance-related terms, such as "pre-rated stocks, fixed-rate deduction and annualized credit."  The problem is, those three finance-related terms actually don't exist -- i.e., they were made up by the researchers to sound plausible.

The test subjects who had the highest confidence level in their own fields were most likely to get suckered.  Simon Oxenham, who described the experiments in Big Think, says it's only natural.  "A possible explanation for this finding," Oxenham writes, "is that the participants with a greater vocabulary in a particular domain were more prone to falsely feeling familiar with nonsense terms in that domain because of the fact that they had simply come across more similar-sounding terms in their lives, providing more material for potential confusion."

Interestingly, subsequent experiments showed that the correlation holds true even if you take away the factor of self-ranking.  Presumably, someone who is cocky and arrogant and ranks his/her ability higher than is justified in one area would be likely to do it in others.  But when they tested the subjects' knowledge of terms from their own field -- i.e., actually measured their expertise -- high scores still correlated with overestimating their knowledge in other areas.

And telling the subjects ahead of time that some of the terms might be made up didn't change the results.  "[E]ven when participants were warned that some of the statements were false, the 'experts' were just as likely as before to claim to know the nonsense statements, while most of the other participants became more likely in this scenario to admit they’d never heard of them," Oxenham writes.

I have a bit of anecdotal evidence supporting this result from my experience in the classroom.  On multiple-choice tests, I have to concoct plausible-sounding wrong answers as distractors.  Every once in a while, I run out of good wrong answers, and just make something up.  (On one AP Biology quiz on plant biochemistry, I threw in the term "photoglycolysis," which sounds pretty fancy until you realize that it doesn't exist.)  What I find is that it's the average to upper-average students who are the most likely to be taken in by the ruse.  The top students don't get fooled because they know what the correct answer is; the lowest students are equally likely to pick any of the wrong answers, because they don't understand the material well.  The mid-range students see something that sounds technical and vaguely familiar -- and figure that if they aren't sure, it must be that they missed learning that particular term.

It's also the mid-range students who are most likely to miss questions where the actual answer seems too simple.  Another botanical question I like to throw at them is "What do all non-vascular land plants have in common?"  There are three wrong answers with appropriately technical-sounding jargon.

The actual answer is, "They're small."

Interestingly, the reason non-vascular land plants are small isn't simple at all.  But the answer itself just looks too easy to merit being the correct choice on an AP Biology quiz.

So Atir, Rosenzweig, and Dunning have given us yet another mental pitfall to watch out for -- our tendency to use our knowledge in one field to overestimate our knowledge in others.  But I really should run along, and make sure that the annualized credit on my pre-rated stocks exceeds the recommended fixed-rate deduction.  I'm sure you can appreciate how important that is.

Friday, August 28, 2015

The illusion of causality

Fighting bad thinking is an uphill battle, sometimes.  Not only, or even primarily, because there's so much of it out there; the real problem is that our brains are hard-wired to make poor connections, and once those connections are made, to hang on to them like grim death.

A particularly difficult one to overcome is our tendency to fall for the post hoc, ergo propter hoc fallacy -- "after this, therefore because of this."  We assume that if two events are in close proximity in time and space, the first one must have caused the second one.  Dr. Paul Offit, director of the Vaccine Education Center at Children's Hospital of Philadelphia, likes to tell a story about his wife, who is a pediatrician, preparing to give a child a vaccination.  The child had a seizure as she was drawing the vaccine into the syringe.  If the seizure had occurred only a minute later, right after the vaccine was administered, the parents would undoubtedly have thought that the vaccination caused the seizure -- and after that, no power on earth would have likely convinced them otherwise.

[image courtesy of the Wikimedia Commons]

Why do we do this?  The most reasonable explanation is that in our evolutionary history, forming such connections had significant survival value.  Since it's usual that causes and effects are close together in time and space, wiring in a tendency to decide that all such correspondences are causal is still going to be right more often than not.  But it does lead us onto some thin ice, logic-wise.

Which is bad enough.  But now three researchers -- Ion Yarritu (Deusto University), Helena Matute (University of Bilbao), and David Luque (University of New South Wales) -- have published research that shows that our falling for what they call the "causal illusion" is so powerful that even evidence to the contrary can't fix the error.

In a paper called "The dark side of cognitive illusions: When an illusory belief interferes with the acquisition of evidence-based knowledge," published earlier this year in the British Journal of Psychology, Yarritu et al. have demonstrated that once we've decided on an explanation for something, it becomes damn near impossible to change.

Their experimental protocol was simple and elegant.  Yarritu writes:
During the first phase of the experiment, one group of participants was induced to develop a strong illusion that a placebo medicine was effective to treat a fictitious disease, whereas another group was induced to develop a weak illusion.  Then, in Phase 2, both groups observed fictitious patients who always took the bogus treatment simultaneously with a second treatment which was effective.  Our results showed that the group who developed the strong illusion about the effectiveness of the bogus treatment during Phase 1 had more difficulties in learning during Phase 2 that the added treatment was effective.
The strength of this illusion explains why bogus "alternative medicine" therapies gain such traction.  All it takes is a handful of cases where people use "deer antler spray" and find they have more energy (and no, I'm not making this up) to get the ball rolling.  Homeopathy owes a lot to this flaw in our reasoning ability; any symptom abatement that occurs after taking a homeopathic "remedy" clearly would have happened even if the patient had taken nothing -- which is, after all, what (s)he did.

And that's not even considering the placebo effect as a further complicating factor.
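To see just how seductive the illusion is, here's a toy calculation of my own (not from the Yarritu et al. paper): suppose a condition clears up on its own 70% of the time, and most sufferers take a remedy that does exactly nothing.  If you only tally the "I took it and got better" episodes -- which is what our brains naturally do -- the remedy looks terrific, even though the real contingency is zero.

```python
import random

def simulate(n_patients=10_000, base_recovery=0.7, p_takes_remedy=0.8):
    # The remedy has zero effect: recovery probability is base_recovery either way.
    took = took_and_recovered = skipped = recovered_anyway = 0
    for _ in range(n_patients):
        treated = random.random() < p_takes_remedy
        recovered = random.random() < base_recovery      # independent of treatment
        if treated:
            took += 1
            took_and_recovered += recovered
        else:
            skipped += 1
            recovered_anyway += recovered
    p_with = took_and_recovered / took
    p_without = recovered_anyway / skipped
    print(f"'took it and got better' cases: {took_and_recovered}")   # big, persuasive number
    print(f"P(recover | remedy)    = {p_with:.2f}")
    print(f"P(recover | no remedy) = {p_without:.2f}")
    print(f"actual contingency     = {p_with - p_without:+.2f}")      # hovers around zero

simulate()
```

The thousands of "it worked for me" cases are exactly what feeds the causal illusion, even though comparing the two recovery rates shows the remedy adds nothing.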

Helena Matute, one of the researchers in the recent study, has written extensively about the difficulty of battling causal illusions.  In an article she wrote for the online journal Mapping Ignorance, Matute writes:
Alternative medicine is often promoted on the argument that it can do no harm.  Even though its advocates are aware that its effectiveness has not been scientifically demonstrated, they do believe that it is harmless and therefore it should be used.  "If not alone, you should at least use it in combination with evidence-based treatments," they say, "just in case." 
But this strategy is not without risk... even treatments which are physically innocuous may have serious consequences in our belief system, sometimes with fatal consequences.  When people believe that a bogus treatment works, they may not be able to learn that another treatment, which is really effective, is the cause of their recovery.  This finding is important because it shows one of the mechanisms by which people might decide to quit an efficient treatment in favor of a bogus one.
I think this same effect is contributory to errors in thinking in a great many other areas.  Consider, for instance, the fact that belief in anthropogenic climate change rises in the summer and falls in the winter.  After being told that human activity is causing the global average temperature to rise, our brains are primed to look out of the window at the snow falling, and say, "Nah.  Can't be."

Post hoc, ergo propter hoc.  To quote Stephen Colbert, "Global warming isn't real, because I was cold today.  Also great news: world hunger is over because I just ate."

The study by Yarritu et al. highlights not only the difficulty of fighting incorrect causal connections, but why it is so essential that we do so.  The decision that two things are causally connected is powerful and difficult to reverse; so it's critical that we be aware of this bias in thinking, and watch our own tendency to leap to conclusions.  But even more critical is that we are given reliable evidence to correct our own errors in causality, and that we listen to it.  Like any cognitive bias, we can combat it -- but only if we're willing to admit that we might get it wrong sometimes.

Or as Michael Shermer put it, "Don't believe everything you think."

Wednesday, May 27, 2015

Leaving the echo chamber

It is natural, I suppose, to surround ourselves with people whose political, religious, and philosophical beliefs we share.  We tend to get along best with people whose values are aligned with our own, and having the same opinions makes conflict less likely.  So what I'm going to suggest runs completely counter to this tribal tendency that all humans have.

Anyone who aspires to a skeptical view of the world should seek out interactions with people of opposing stances.

I won't say this isn't frustrating at times.  Hearing our most cherished viewpoints criticized, sometimes stridently, brings up some pretty strong emotions.  But there are two outstanding reasons to strive for diversity in our social circles, and I think that both of these make a cogent argument for overcoming our knee-jerk reactions to having our baseline assumptions called into question.

First, being exposed to a wide range of opinions keeps us honest.  It is an all-too-human failing not to question things when everyone around us is in agreement.  This can lead not only to our making mistakes, but to our not realizing them -- sometimes for a long time -- because we've surrounded ourselves with a Greek chorus of supporters, and no one who is willing to say, "Wait a minute... are you sure that's right?"

Second, it becomes less easy to demonize those who disagree with us when they have faces.  You can slide quickly into "those awful conservatives" or "those evil atheists" -- until you meet one, and spend some time chatting, and find out that the people you've derided turn out to be friendly and smart and... human.  Just like you.


How to build an echo chamber [adapted from Jasny, Fisher, et al.]

The danger of living in an echo chamber was illustrated vividly by a new peer-reviewed study led by Dana Fisher, professor of sociology at the University of Maryland.  Fisher et al. looked at how attitudes about climate change in particular are affected by being surrounded by others who agree with you.  They found that networks of people who are already in agreement, sharing information that supports what they already believed, create a context of certainty so powerful that even overwhelming scientific evidence can't overcome it.

"Our research shows how the echo chamber can block progress toward a political resolution on climate change," Fisher said in an interview.  "Individuals who get their information from the same sources with the same perspective may be under the impression that theirs is the dominant perspective, regardless of what the science says...  Information has become a partisan choice, and those choices bias toward sources that reinforce beliefs rather than challenge them, regardless of the source’s legitimacy."

Lorien Jasny, a lead author of the paper, emphasized how important it was to venture outside of the echo chamber.  "Our research underscores how important it is for people on both sides of the climate debate to be careful about where they get their information.  If their sources are limited to those that repeat and amplify a single perspective, they can’t be certain about the reliability or objectivity of their information."

While the study by Fisher et al. was specifically about attitudes regarding climate change, I would argue that their conclusions could be applied in a much wider context.  We need to hear opposing viewpoints about everything, because otherwise we fall prey to the worst part of tribalism -- the attitude that only the members of the tribe are worth listening to.  It's why liberals should occasionally tune in to Fox News and conservatives to MSNBC.  It's why the religious shouldn't unfriend their atheist Facebook friends -- and vice versa.  It's why my friend and coworker who tends to vote for the opposite political party than I do is someone whose views I make myself listen to and consider carefully.

Now, don't mistake me.  This doesn't mean you should put up with assholes.  The social conventions still apply, and disagreeing philosophically doesn't mean you call the people on the other side idiots.  I have chosen to disconnect from people who were rude and disagreeable -- but I hope I'd do that even if they shared my political views.

Put simply, we need to be pushed sometimes to overcome our natural bent toward surrounding ourselves with the like-minded.  When we do, we become less likely to fall prey to our own biases, and less likely to pass unfair judgment on those who disagree with us.  The work by Fisher et al. shows us how powerful the echo chamber effect can be -- and why it's critical that we get ourselves out of it on occasion, however comforting the illusion of certainty can be at times.

Monday, March 16, 2015

Science-friendly illogic

I usually don't blog about what other people put in their blogs.  This kind of thing can rapidly devolve into a bunch of shouted opinions, rather than a reasoned set of arguments that are actually based upon evidence.

But just yesterday I ran into a blog that (1) cited real research, and (2) drew conclusions from that research that were so off the rails that I had to comment.  I'm referring to the piece over at Religion News Service by Cathy Lynn Grossman entitled, "God Knows, Evangelicals Are More Science-Friendly Than You Think."  Grossman was part of a panel at the American Association for the Advancement of Science's yearly Dialogue on Science, Ethics, and Religion, and commented upon research presented at that event by Elaine Howard Ecklund, sociologist at Rice University.

Ecklund's research concerned evangelicals' attitudes toward science.  She described the following data from her study:
  • 48% of the evangelicals in her study viewed science and religion as complementary.
  • 21% saw the two worldviews as entirely independent of one another (which I am interpreting to be a version of Stephen Jay Gould's "non-overlapping magisteria" idea).
  • A little over 30% saw the two views as in opposition to each other.
84% of evangelicals, Grossman said, "say modern science is going good [sic] in the world."  And she interprets this as meaning that evangelicals are actually, contrary to appearances, "science friendly."  Grossman writes:
Now, the myth that bites the data dust, is one that proclaims evangelicals are a monolithic group of young-earth creationists opposed to theories of human evolution... 
(M)edia... sometimes incorrectly conflate the conservative evangelical view with all Christians’ views under the general “religion” terminology. 
I said this may allow a small subset to dictate the terms of the national science-and-religion conversation although they are not representative in numbers -- or point of view. This could lead to a great deal of energy devoted to winning the approval of the shrinking group and aging group that believes the Bible trumps science on critical issues.
Well, here's the problem with all of this.

This seems to me to be the inherent bias that makes everyone think they're an above-average driver.  Called the Dunning-Kruger effect, it is described thusly by psychologist David Dunning, whose team first identified the phenomenon:
Incompetent people do not recognize—scratch that, cannot recognize—just how incompetent they are...  What’s curious is that, in many cases, incompetence does not leave people disoriented, perplexed, or cautious. Instead, the incompetent are often blessed with an inappropriate confidence, buoyed by something that feels to them like knowledge. 
An ignorant mind is precisely not a spotless, empty vessel, but one that’s filled with the clutter of irrelevant or misleading life experiences, theories, facts, intuitions, strategies, algorithms, heuristics, metaphors, and hunches that regrettably have the look and feel of useful and accurate knowledge.
Now, allow me to say right away that I'm not calling evangelicals incompetent and/or ignorant as a group.  I have a friend who is a diehard evangelical, and he's one of the best-read, most thoughtful (in both senses of the word) people I know.  But what I am pointing out is that people are poor judges of their own understanding and attitudes -- and on that level, Dunning's second paragraph is referring to all of us.

So Ecklund's data, and Grossman's conclusions from it, are not so much wrong as they are irrelevant. It doesn't matter if evangelicals think they're supportive of science, just like my opinion of my own driving ability isn't necessarily reflective of reality.  I'm much more likely to take the evangelicals' wholesale rejection of evolution and climate science as an indication of their lack of support and/or understanding of science than I would their opinions regarding their own attitudes toward it.

And, of course, there's that troubling 30% of evangelicals who do see religion and science as opposed, a group that Grossman glides right past.  She does, however, admit that scientists would probably find it "troubling" that 60% of evangelicals say that "scientists should be open to considering miracles in their theories."

Troubling doesn't begin to describe it, lady.


That doesn't stop Grossman from painting the Religious Right as one big happy science-loving family, and she can't resist ending by giving us secular rationalists a little cautionary kick in the ass:
[S]cientists who want to write off evangelical views as inconsequential may not want to celebrate those trends [that young people are leaving the church in record numbers]. The trend to emphasize personal experience and individualized spirituality over the authority of Scripture or religious denominational theology is part of a larger cultural trend toward rejecting authority. 
The next group to fall victim to that trend could well be the voices of science.
Which may be the most obvious evidence of all that Grossman herself doesn't understand science.  Science doesn't proceed by authority; it proceeds by hard evidence.  Stephen Hawking, one of the most widely respected authorities in physics, reversed his position on information loss in black holes -- famously conceding a long-standing bet with fellow physicist John Preskill -- when theoretical arguments convinced him that his original position was wrong.  Significantly, no one -- including Hawking himself -- said, "you have to listen to me, I'm an authority."

If anything, the trend of rejection of authority and "personal experience" works entirely in science's favor.  The less personal bias a scientist has, the less dependence on the word of authority, the more (s)he can think critically about how the world works.

So all in all, I'd like to thank Grossman and Ecklund for the good news, even if it arrived in odd packaging.  Given my own set of biases, I'm not likely to see the data they so lauded in anything but an optimistic light.

Just like I do my own ability to drive.  Because whatever else you might say about me, I have mad driving skills.

Friday, November 2, 2012

... and the test results are in!

Regular readers may remember that a couple of weeks ago, I wrote about an experiment being performed by psychologist Chris French and science writer Simon Singh to test, under controlled conditions, the alleged telepathic powers of some self-proclaimed psychics.  (You can read the original post here.)  French and Singh conducted the test at the University of London, and in a deliberately ironic gesture, released the results two days ago -- on Halloween.

And the results were... (drum roll please):

Zilch.

The two "psychics" who had agreed to participate were asked to write down something about each of five volunteers who were concealed behind a screen.  Afterwards, the five volunteers were asked to pick the descriptions that fit them best.  The psychics achieved a hit rate of one in five -- exactly consistent with chance alone.  [Source]
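For anyone wondering why one in five is exactly what chance predicts: with five volunteers each picking from five descriptions, a random guess matches the description actually written about you one time in five, so the expected number of hits per psychic is one.  Here's a quick back-of-the-envelope simulation (my own sketch of the arithmetic; the experiment's exact matching rules may have differed in detail):

```python
import random

def one_sitting(n_volunteers=5):
    # Each volunteer independently picks one of the five descriptions at random;
    # count how many happen to pick the one written about them.
    return sum(random.randrange(n_volunteers) == i for i in range(n_volunteers))

sittings = 100_000
hits = sum(one_sitting() for _ in range(sittings))
print(f"chance hit rate: {hits / (sittings * 5):.3f}")   # ~0.200, i.e. one in five
```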

Now, so far, I find nothing particularly surprising about this.  I've read a great deal of the literature regarding controlled tests of psychics, mediums, and so on, and also about human cognitive biases -- confirmation and dart-thrower's bias, the "Clever Hans" effect, and so on.  When researchers are not exceptionally careful to screen out and control for these sorts of things, the results are immediately suspect -- which is why I don't think most anecdotal evidence in this realm, of the "I Went To A Psychic And She Was So Amazing" kind, is logically admissible.

What is more interesting is the reaction of one of the psychics who participated in the test, the rather unfortunately-named Patricia Putt.  "This experiment doesn't prove a thing," Ms. Putt said.  She went on to explain that she needs to be face-to-face with a client to establish a connection of "psychic energy."  When she is allowed to see her clients, her "success rate is very high."

She ended with a snarky comment that the scientists had designed the test simply to prove what they already believed -- accusing them, with no apparent sense of irony, of confirmation bias.

"Scientists are very closed-minded," she said.

My response is that of course she prefers to work face-to-face, and it has nothing whatsoever to do with "establishing a connection of psychic energy."  The psychics I've seen working firsthand do what they do by paying close attention to body language -- they have trained themselves to watch their clients' every twitch, because that cues them in to how well they're doing, and where to go next with the "reading."  I still recall seeing a psychic doing a reading for students in a high school psychology class that I was asked to attend (and of course I was thrilled to have the opportunity to do so!).  The psychic, a woman named Laura, never took her eyes off the student she was doing a "reading" for, and as soon as the student gave the slightest sign that what Laura was saying was off base, she'd shift direction.  And later in the class, I gave a quick glance at the clock on the wall -- I had a class to teach the following period -- and Laura immediately said, "Am I out of time?"  And she hadn't even been doing a "reading" for me at the time -- I was sitting in the back of the classroom, and she simply noticed my eyes moving!

So despite Pat Putt's objections, I'm not buying that French and Singh deliberately set up the experiment to make her fail, or that they're closed-minded, or that the screens they used were made of special psychic-energy-blocking materials.  The most reasonable explanation for the results is simply that the alleged telepaths were unable to perform, and that they accomplish their "very high success rate" with face-to-face clients a different, and probably quite natural, way.  Admittedly, these were only two psychics, and a single experiment, and this hardly rules out the existence of psychic abilities in the global sense; but it very much places the ball in the court of folks like Derek Acorah, Sylvia Browne, and Sally Morgan.  If what you are doing is not simply a combination of prior research, information provided by assistants, and sensitivity to human body language -- if you really are, improbably and amazingly, picking up on human thoughts through some sort of hitherto undetected "psychic energy field," I would very much like you all to man-up and do what any critical thinker would demand:

Prove, under controlled conditions, that you are able to do what you claim.  And if you cannot do that, kindly have the decency to stop ripping people off.