Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Saturday, August 5, 2023

Hero worship

I got into a curious exchange with someone on Twitter a couple of days ago about Richard Dawkins's recent statement that "biological sex is binary, and that's all there is to it," wherein he called the claims of trans people (and their requests to be referred to by the pronouns they identified with) "errant nonsense," and characterized the people who have criticized him and author J. K. Rowling (amongst others) for their anti-trans stances as "bullies."

The person I had the exchange with seemed to consider this a gotcha moment, and came at me with a gleeful "what do you think of your atheist idol now that he's broken ranks?"

I found this a puzzling question from a number of standpoints.  First, I've never idolized Dawkins.  I think he is an incredibly lucid writer on the subject of evolutionary biology, and his books The Blind Watchmaker, Climbing Mount Improbable, and The Ancestor's Tale remain three of the best layperson's explanations of the science and evidence behind evolution I've ever read.  But admiring his writing on one topic doesn't mean I think he's infallible.  In fact, I've always had the impression that Dawkins was a bit of a dick, and he certainly comes across as more than a little arrogant.  While I agree with him on the subject of evolution, it doesn't mean that he's someone I'd particularly want to have a beer with.

[Image is in the Public Domain]

When I responded to the question with something like this, the person on Twitter seemed a bit deflated, as if he'd expected me to alter my stance on LGBTQ+ issues and the biology of gender just because My Hero had made some sort of pronouncement from on high.

This struck me as a peculiar reaction.  Maybe this is how it works within the context of religion, where a leader (the Pope, the imams, and so on) makes a statement and the expectation is that everyone will simply accept it without question.

But it's definitely not how things go in science.

In this case, it has nothing to do with Dawkins bucking the system against some kind of perceived party line.  In fact, I'll bring out one of his own quotes, which applies here: "If two people are arguing opposite viewpoints, it is not necessarily the case that the truth lies somewhere in the middle.  It is possible that one of them is simply wrong."  On the subject of sexuality being binary, Dawkins is simply wrong, something I explored in some detail in a post a couple of years ago.

But the point is, that doesn't detract from his excellent writing on evolution.  Being wrong about one thing, or even about a bunch of things, doesn't mean you're wrong on everything, nor does it invalidate other outstanding work you may have done.  (Although it can rightly tarnish your reputation as a decent human being.)  It's sad that Dawkins has gone off the rails on this topic, and a shame that his aforementioned arrogance is very likely to make him unwilling to recognize his own faulty assessment of the evidence -- and even less likely to admit it if he does.  And it's unfortunate that his air of authority is certainly going to carry some weight with people, especially those who want more ammunition for defending what they already believed about the supposed binary nature of gender.

The reason this doesn't make me discount him completely is that I feel no need to engage in hero worship.

That extends to other areas as well.  I can appreciate the acting ability of Tom Cruise and Gwyneth Paltrow, and thoroughly enjoy watching (respectively) Minority Report and Sliding Doors, while at the same time acknowledging that in real life both of them appear to have a screw loose.  I can still be inspired by some of the stories of H. P. Lovecraft, while keeping in mind that he was a virulent racist (something that comes through loud and clear in the worst of his stories, but fortunately not all).

In fact, it's best if we look at all famous people through that lens.  The expectation that someone prominent or admired must be flawless -- and therefore, anyone criticizing him/her is de facto wrong -- is what leads to the behavior we're now seeing in Trump loyalists, who will defend him to the death regardless of what charges are proven against him or how overwhelming the evidence is.

It is this sort of thinking that is characteristic of a cult.

In any case, I can say I'm disappointed in Dawkins, but that disappointment has neither caused me to abandon his writing on evolutionary biology nor led me to revise my own thinking on LGBTQ+ issues just because Dawkins Says So.  It's worth keeping in mind that people are complex bundles of often contradictory traits, and there's no one person who is going to be in line with your understanding of the world all the time.  In the end, it's always best to form your beliefs based on where the actual evidence leads -- and above all, to think for yourself.

****************************************



Tuesday, February 9, 2021

Fooling the experts

I was bummed to hear about the death of the inimitable Cloris Leachman a week and a half ago at the venerable age of 94.  She was probably most famous for her role as Frau Blücher *wild neighing horse noises* in the movie Young Frankenstein, but I was first introduced to her unsurpassed sense of comic timing in the classic 1970s sitcom The Mary Tyler Moore Show, where she played the tightly-wound self-styled intellectual Phyllis Lindstrom.

One of my favorite moments in that show occurred when Phyllis was playing a game of Scrabble against Mary's neighbor Rhoda Morgenstern (played with equal panache by Valerie Harper).  Rhoda puts down the word oxmersis, and Phyllis challenges it.

"There's no such thing as 'oxmersis,'" Phyllis says.

Rhoda looks at her, aghast.  "Really, Phyllis?  I cannot believe that someone who knows as much about psychology as you do has never heard of oxmersis."

Long pause, during which you can almost see the gears turning in Phyllis's head.  "Oh," she finally says.  "That oxmersis."

I was immediately reminded of that scene when I ran into a paper while doing some background investigation for yesterday's post, which was about psychologist David Dunning's research with Robert Proctor regarding the deliberate cultivation of stupidity.  This paper looked at a different aspect of ignorance -- what happens when you combine the Dunning-Kruger effect (people's tendency to overestimate their own intelligence and abilities) with the fallacy called Appeal to Authority.

Appeal to Authority, you probably know, is when someone uses credentials, titles, or educational background -- and no other evidence -- to support a claim.  Put simply, it is the idea that if Richard Dawkins said it, it must be true, regardless of whether the claim has anything to do with Dawkins's particular area of expertise, evolutionary biology.  (I pick Dawkins deliberately, because he's fairly notorious for having opinions about everything, and seems to relish being the center of controversy regardless of the topic.)  

Dunning teamed up with Cornell University researchers Stav Atir and Emily Rosenzweig, and came up with what could be described as the love child of Dunning-Kruger and Appeal to Authority.  And what this new phenomenon -- dubbed, predictably, the Atir-Rosenzweig-Dunning Effect -- shows us is that people who are experts in a particular field tend to think their expertise holds true even for disciplines far outside their chosen area of study, and because of that are more likely to fall for plausible-sounding falsehoods -- like Phyllis's getting suckered by Rhoda's "oxmersis" bluff.

[Image is in the Public Domain]

In one experiment, the three researchers asked people to rate their own knowledge in various academic areas, then asked them to rank their level of understanding of various finance-related terms, such as "pre-rated stocks, fixed-rate deduction and annualized credit."  The problem is, those three finance-related terms actually don't exist -- i.e., they were made up by the researchers to sound plausible.

The test subjects who had the highest confidence level in their own fields were most likely to fall for the ruse.  Simon Oxenham, who described the experiments in Big Think, says it's only natural.  "A possible explanation for this finding," Oxenham writes, "is that the participants with a greater vocabulary in a particular domain were more prone to falsely feeling familiar with nonsense terms in that domain because of the fact that they had simply come across more similar-sounding terms in their lives, providing more material for potential confusion."

Interestingly, subsequent experiments showed that the correlation holds true even if you take away the factor of self-ranking.  Presumably, someone who is cocky and arrogant and ranks his/her ability higher than is justified in one area would be likely to do it in others.  But when they tested the subjects' knowledge of terms from their own field -- i.e., actually measured their expertise -- high scores still correlated with overestimating their knowledge in other areas.

And telling the subjects ahead of time that some of the terms might be made up didn't change the results. "[E]ven when participants were warned that some of the statements were false, the 'experts' were just as likely as before to claim to know the nonsense statements, while most of the other participants became more likely in this scenario to admit they’d never heard of them," Oxenham writes.

I have a bit of anecdotal evidence supporting this result from my experience in the classroom.  On multiple-choice tests, I had to concoct plausible-sounding wrong answers as distractors.  Every once in a while, I ran out of good wrong answers, and just made something up.  (On one AP Biology quiz on plant biochemistry, I threw in the term "photoglycolysis," which sounds pretty fancy until you realize that there's no such thing.)   What I found was that it was the average to upper-average students who were the most likely to be taken in.  The top students didn't get fooled because they knew what the correct answer was; the lowest students were equally likely to pick any of the wrong answers, because they didn't understand the material well.  The mid-range students saw something that sounded technical and vaguely familiar -- and figured that if they weren't sure, it must be that they'd missed learning that particular term.

It was also the mid-range students who were most likely to miss questions where the actual answer seemed too simple.  Another botanical question I liked to throw at them was, "What do all non-vascular land plants have in common?"  I always provided three wrong answers with appropriately technical-sounding jargon.

The actual answer is, "They're small."

Interestingly, the reason for the small size of non-vascular land plants (the most familiar example is moss) isn't simple at all.  But the answer itself just looked too easy to merit being the correct choice on an AP Biology quiz.

So Atir, Rosenzweig, and Dunning have given us yet another mental pitfall to watch out for -- our tendency to use our knowledge in one field to overestimate our knowledge in others.  But I really should run along, and make sure that the annualized credit on my pre-rated stocks exceeds the recommended fixed-rate deduction.  I worry a lot about that kind of thing, but I suppose my anxiety is really just another case of excessive oxmersis.

*********************************

Science writer Elizabeth Kolbert established her reputation as a cutting-edge observer of the human global impact in her wonderful book The Sixth Extinction (which was a Skeptophilia Book of the Week a while back).  This week's book recommendation is her latest, which looks forward to where humanity might be going.

Under a White Sky: The Nature of the Future is an analysis of what Kolbert calls "our ten-thousand-year-long exercise in defying nature," something that immediately made me think of another book I've recommended -- the amazing The Control of Nature by John McPhee, the message of which was generally "when humans pit themselves against nature, nature always wins."  Kolbert takes a more nuanced view, and considers some of the efforts scientists are making to reverse the damage we've done, from conservation of severely endangered species to dealing with anthropogenic climate change.

It's a book that's always engaging and occasionally alarming, but overall, deeply optimistic about humanity's potential for making good choices.  Whether we turn that potential into reality is largely a function of educating ourselves regarding the precarious position into which we've placed ourselves -- and Kolbert's latest book is an excellent place to start.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]



Thursday, October 15, 2020

Life at the center

Appeal to Authority is simultaneously one of the simplest, and one of the trickiest, of the fallacies.

The simple part is that one shouldn't rely on someone else's word for a claim without some demonstration of evidence in support.  Just saying "Neil deGrasse Tyson said so" isn't sufficient proof for a conjecture.

On the other hand, there are times when relying on authority makes sense.  If I claimed that Neil deGrasse Tyson was wrong in the realm of astronomy, the likelihood that I'm the one who's wrong would be nearly 100%.  Expertise is worth something, and Tyson's Ph.D. in astrophysics certainly gives his statements in that field considerable gravitas.

The problem is that when confronted with a confident-sounding authority, people turn their own brains off.   And the situation becomes even murkier when experts in one field start making pronouncements in a different one.

Take, for example, Robert Lanza, a medical researcher whose work in stem cells and regenerative medicine has led to groundbreaking advances in the treatment of hitherto incurable diseases.  His contributions to medical science are undeniably profound, and I would consider his opinion in the field of stem cell research about as close to unimpeachable as you could get.  But Lanza hasn't been content to stay within his area of specialization, and has ventured forth into the fringe areas of metaphysics -- joining people like Fritjof Capra in their quest to show that quantum physics has something to say about consciousness, souls, and life after death.

Let's start with Lanza's idea of a "biocentric universe," which is defined thusly:
Biocentrism states that life and biology are central to being, reality, and the cosmos -- life creates the universe rather than the other way around.  It asserts that current theories of the physical world do not work, and can never be made to work, until they fully account for life and consciousness.  While physics is considered fundamental to the study of the universe, and chemistry fundamental to the study of life, biocentrism claims that scientists will need to place biology before the other sciences to produce a theory of everything.
Which puts me in mind of Wolfgang Pauli's famous quote, "This isn't right. This isn't even wrong."  Biocentrism isn't really a scientific theory, in that it makes no predictions, and therefore de facto isn't falsifiable.  And Lanza's reception on this topic has been chilly at best.  Physicist Lawrence Krauss said, "It may represent interesting philosophy, but it doesn't look, at first glance, as if it will change anything about science."  Physicist and science writer David Lindley agrees, calling biocentrism "a vague, inarticulate metaphor."

And if you needed further evidence of its lack of scientific rigor, I must also point out that Deepak Chopra loves biocentrism.  "(Lanza's) theory of biocentrism is consistent with the most ancient wisdom traditions of the world which says that consciousness conceives, governs, and becomes a physical world," Chopra writes.  "It is the ground of our Being in which both subjective and objective reality come into existence."

As a scientist, you know you're in trouble if you get support from Deepak Chopra.

And there's a further problem with venturing outside of your field of expertise.  If you make unsupported claims, then others will take your claims (with your name appended to them, of course) and send them even further out into the ether.  Which is what happened recently over at the site Learning Mind, where Lanza's ideas were said to prove that the soul exists, and death is an illusion:
(Lanza's) theory implies that death simply does not exist.  It is an illusion which arises in the minds of people. It exists because people identify themselves with their body.  They believe that the body is going to perish, sooner or later, thinking their consciousness will disappear too. 
In fact, consciousness exists outside of constraints of time and space.  It is able to be anywhere: in the human body and outside of it.  That fits well with the basic postulates of quantum mechanics science, according to which a certain particle can be present anywhere and an event can happen according to several, sometimes countless, ways.  
Lanza believes that multiple universes can exist simultaneously.  These universes contain multiple ways for possible scenarios to occur.  In one universe, the body can be dead.  And in another it continues to exist, absorbing consciousness which migrated into this universe.  This means that a dead person while traveling through the same tunnel ends up not in hell or in heaven, but in a similar world he or she once inhabited, but this time alive.  And so on, infinitely.
Which amounts to taking an untestable claim, whose merits are best left to the philosophers to discuss, and running right off a cliff with it.

As I've said more than once: quantum mechanics isn't some kind of fluffy, hand-waving speculation.  It is hard, evidence-based science.  The mathematical model that is the underpinning of this description of the universe is complex and difficult for the layperson to understand, but it is highly specific.  It describes the behavior of particles and waves, on the submicroscopic scale, making predictions that have been experimentally supported time after time.


[Image is in the Public Domain]

And that's all it does.   Quantum effects such as superposition, indeterminacy, and entanglement have extremely limited effects on the macroscopic world.  Particle physics has nothing to say about the existence of the soul, the afterlife, or any other religious or philosophical claim.  And even the "Many Worlds" hypothesis, which was seriously put forth as a way to explain the collapse of the wave function, has largely been shelved by everyone but the science fiction writers because its claims are completely untestable.

To return to my original point, Appeal to Authority is one of those fallacies that seem simpler than they actually turn out to be.  I have no doubt that Robert Lanza is a genius in the field of regenerative medicine, and I wouldn't hesitate to trust what he says in that realm.  But his pronouncements in the field of physics appear to me to be unfalsifiable speculation -- i.e., not scientific statements.  As such, biocentrism is no better than "intelligent design."  What Adam Lee, of Daylight Atheism, said about intelligent design could be applied equally well to biocentrism:
(A) hypothesis must make predictions that can be compared to the real world and determined to be either true or false, and there must be some imaginable evidence that could disprove it.  If an idea makes no predictions, makes predictions that cannot be unambiguously interpreted as either success or failure, or makes predictions that cannot be checked out even in principle, then it is not science.
But I'm sure biocentrism is going to be as popular amongst the woo-woos as ID is amongst the fervently religious.  For them, "unfalsifiable" means "you can't prove we're wrong."

"Therefore we're right. q.e.d. and ha ha ha."
***************************************

This week's Skeptophilia book recommendation is brand new, and is as elegiac as it is inspiring -- David Attenborough's A Life on Our Planet: My Witness Statement and a Vision for the Future.

Attenborough is a familiar name, face, and (especially) voice to those of us who love nature documentaries.  Through series such as Our Planet, Life on Earth, and Planet Earth, he has brought into our homes the beauty of nature -- and its desperate fragility.

Written when Attenborough was 93, A Life on Our Planet is a fitting coda to his lifelong quest to spark wonder in our minds at the beauty that surrounds us, while at the same time waking us up to the perils of what we're doing to it.  His message isn't all doom and gloom; despite it all, he remains hopeful, and firm in his conviction that we can reverse our course and save what's left of the biodiversity of the Earth.  It's a poignant and evocative work -- something everyone who has been inspired by Attenborough for decades should put on their reading list.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]



Monday, July 25, 2016

Fooling the experts

Today we consider what happens when you blend Appeal to Authority with the Dunning-Kruger Effect.

Appeal to Authority, you probably know, is when someone uses credentials, titles, or educational background -- and no other evidence -- to support a claim.  Put simply, it is the idea that if Stephen Hawking said it, it must be true, regardless of whether the claim has anything to do with Hawking's particular area of expertise.  The Dunning-Kruger Effect, on the other hand, is the idea that people tend to wildly overestimate their abilities, even in the face of evidence to the contrary, which is why we all think we're above-average drivers.

Well, David Dunning (of the aforementioned Dunning-Kruger Effect) has teamed up with Cornell University researchers Stav Atir and Emily Rosenzweig, and come up with the love child of Dunning-Kruger and Appeal to Authority.  And what this new phenomenon -- dubbed, predictably, the Atir-Rosenzweig-Dunning Effect -- shows us is that people who are experts in a particular field tend to think that expertise holds true even for disciplines far outside their chosen area of study.

[image courtesy of the Wikimedia Commons]

In one experiment, the three researchers asked people to rate their own knowledge in various academic areas, then asked them to rank their level of understanding of various finance-related terms, such as "pre-rated stocks, fixed-rate deduction and annualized credit."  The problem is, those three finance-related terms actually don't exist -- i.e., they were made up by the researchers to sound plausible.

The test subjects who had the highest confidence level in their own fields were most likely to get suckered.  Simon Oxenham, who described the experiments in Big Think, says it's only natural.  "A possible explanation for this finding," Oxenham writes, "is that the participants with a greater vocabulary in a particular domain were more prone to falsely feeling familiar with nonsense terms in that domain because of the fact that they had simply come across more similar-sounding terms in their lives, providing more material for potential confusion."

Interestingly, subsequent experiments showed that the correlation holds true even if you take away the factor of self-ranking.  Presumably, someone who is cocky and arrogant and ranks his/her ability higher than is justified in one area would be likely to do it in others.  But when they tested the subjects' knowledge of terms from their own field -- i.e., actually measured their expertise -- high scores still correlated with overestimating their knowledge in other areas.

And telling the subjects ahead of time that some of the terms might be made up didn't change the results.  "[E]ven when participants were warned that some of the statements were false, the 'experts' were just as likely as before to claim to know the nonsense statements, while most of the other participants became more likely in this scenario to admit they’d never heard of them," Oxenham writes.

I have a bit of anecdotal evidence supporting this result from my experience in the classroom.  On multiple-choice tests, I have to concoct plausible-sounding wrong answers as distractors.  Every once in a while, I run out of good wrong answers, and just make something up.  (On one AP Biology quiz on plant biochemistry, I threw in the term "photoglycolysis," which sounds pretty fancy until you realize that it doesn't exist.)  What I find is that it's the average to upper-average students who are the most likely to be taken in by the ruse.  The top students don't get fooled because they know what the correct answer is; the lowest students are equally likely to pick any of the wrong answers, because they don't understand the material well.  The mid-range students see something that sounds technical and vaguely familiar -- and figure that if they aren't sure, it must be that they missed learning that particular term.

It's also the mid-range students who are most likely to miss questions where the actual answer seems too simple.  Another botanical question I like to throw at them is "What do all non-vascular land plants have in common?"  There are three wrong answers with appropriately technical-sounding jargon.

The actual answer is, "They're small."

Interestingly, the reason non-vascular land plants are small isn't simple at all.  But the answer itself just looks too easy to merit being the correct choice on an AP Biology quiz.

So Atir, Rosenzweig, and Dunning have given us yet another mental pitfall to watch out for -- our tendency to use our knowledge in one field to overestimate our knowledge in others.  But I really should run along, and make sure that the annualized credit on my pre-rated stocks exceeds the recommended fixed-rate deduction.  I'm sure you can appreciate how important that is.

Saturday, May 16, 2015

The problem with Seymour

A few months ago, I made the point that the fallacy called appeal to authority is not as simple as it sounds.

On the surface, it's about not trusting authorities and public figures simply because they're well-known names.  You can convince anyone of anything, seemingly, if you append the words "Albert Einstein said so" to your claim; it's the reason I fight every year in my intro neuroscience class against the spurious claim that humans use only 10% of their brains.  You see this idea attributed to Einstein all the time -- although it's unlikely that he ever said such a thing, which adds "apocryphal quote" as another layer of fallacy -- and the claim itself is demonstrably false.

The problem is, of course, there are some areas where Einstein was an expert.  Adding "Einstein said so" to a discussion of general relativity is pretty persuasive, given that relativity has passed every scientific test it's been put through.  But notice the difference: we're not accepting relativity because a respected physicist thought it was true.  Said respected physicist's ideas still had to be vetted, retested, and peer-reviewed.  It's the vindication of his theories that conferred credibility on his name, not the other way around.

The situation becomes even blurrier when you have someone whose work in a particular field starts out valid and evidence-based, and then at some point veers off into wild speculation.  This is the core of the problem with an appeal to authority; someone having one or two right ideas in the past is no insurance against his/her being wildly wrong later.

This is the situation we find ourselves in with Seymour Hersh.  Hersh is a Pulitzer Prize-winning investigative journalist whose work on exposing the truth about the My Lai Massacre and the torture of prisoners of war by American soldiers at Abu Ghraib was groundbreaking.  His dogged determination to get at the facts, even at the cost of embarrassing the American government and damaging the reputation of the U.S. overseas, earned him a well-deserved place among the giants of journalism.

Seymour Hersh [image courtesy of the Wikimedia Commons]

The problem is, Hersh seems to have gone badly off the rails lately.  His latest piece, which he's pursuing with the tenacity of a bloodhound, is about the claim that the public version of the death of Osama bin Laden is a complete fabrication -- that Pakistani intelligence had captured bin Laden all the way back in 2006, and with help from the Saudis was using him as leverage against al Qaeda.  When his usefulness began to wane, he was killed in a staged raid against his compound in Abbottabad, and the story of the brave soldiers who'd risked their lives to take down a wanted terrorist was made public.

As an article in Vox describes in more detail, though, the claim is supported by little in the way of evidence.  Hersh's two sources admittedly have no direct knowledge of what happened.  The story itself is fraught with self-contradictions and inconsistencies.  And then, to make matters worse, Hersh has recently begun to claim that the United States government has been infiltrated by members of Opus Dei (a Roman Catholic spiritual organization made famous, or infamous, by The Da Vinci Code), that the chemical weapons attacks in Syria were "false flags" staged by the Turkish government, and that the U.S. is training Iranian terrorists in Nevada.

None of these, apparently, has any evidential support beyond "an anonymous source told me."  Hersh, seemingly, has slipped from hard-hitting investigative reporter to wild-eyed conspiracy theorist.

He's not backing down, however.  He granted an interview to Slate in which he reiterated everything he's said.  He seems to spend equal time during the interview defending himself without introducing any further facts, and disparaging the interviewer, journalist Isaac Chotiner.  "What difference does it make what the fuck I think about journalism?" Hersh asked Chotiner.  "I don’t think much of the journalism that I see.  If you think I write stories where it is all right to just be good enough, are you kidding?  You think I have a cavalier attitude on throwing stuff out?  Are you kidding?  I am not cavalier about what I do for a living."

And only a moment earlier, when asked a question he didn't like, he said to Chotiner, "Oh poor you, you don’t know anything.  It is amazing you can speak the God’s English."

This is a vivid, and rather sad, example of why a person's reputation isn't sufficient to establish the veracity of their claims.  No one -- including both Albert Einstein and Seymour Hersh -- has the right to rest on their laurels, to expect people to believe something just because they've appended their name to it.

Claims stand or fall on the basis of one thing: the evidence.  And what Hersh has brought forth thus far is of such poor quality that about the only one he's convincing is Alex Jones.

Monday, March 16, 2015

Science-friendly illogic

I usually don't blog about what other people put in their blogs.  This kind of thing can rapidly devolve into a bunch of shouted opinions, rather than a reasoned set of arguments that are actually based upon evidence.

But just yesterday I ran into a blog that (1) cited real research, and (2) drew conclusions from that research that were so off the rails that I had to comment.  I'm referring to the piece over at Religion News Service by Cathy Lynn Grossman entitled, "God Knows, Evangelicals Are More Science-Friendly Than You Think."  Grossman was part of a panel at the American Association for the Advancement of Science's yearly Dialogue on Science, Ethics, and Religion, and commented upon research presented at that event by Elaine Howard Ecklund, a sociologist at Rice University.

Ecklund's research examined evangelicals' attitudes toward science.  She described the following data from her study:
  • 48% of the evangelicals in her study viewed science and religion as complementary.
  • 21% saw the two worldviews as entirely independent of one another (which I am interpreting to be a version of Stephen Jay Gould's "non-overlapping magisteria" idea).
  • A little over 30% saw the two views as in opposition to each other.
84% of evangelicals, Grossman said, "say modern science is going good [sic] in the world."  And she interprets this as meaning that evangelicals are actually, contrary to appearances, "science friendly."  Grossman writes:
Now, the myth that bites the data dust, is one that proclaims evangelicals are a monolithic group of young-earth creationists opposed to theories of human evolution... 
(M)edia... sometimes incorrectly conflate the conservative evangelical view with all Christians’ views under the general “religion” terminology. 
I said this may allow a small subset to dictate the terms of the national science-and-religion conversation although they are not representative in numbers -– or point of view. This could lead to a great deal of energy devoted to winning the approval of the shrinking group and aging group that believes the Bible trumps science on critical issues.
Well, here's the problem with all of this.

This seems to me to be the inherent bias that makes everyone think they're an above-average driver.  Called the Dunning-Kruger effect, it is described thusly by psychologist David Dunning, whose team first identified the phenomenon:
Incompetent people do not recognize—scratch that, cannot recognize—just how incompetent they are...  What’s curious is that, in many cases, incompetence does not leave people disoriented, perplexed, or cautious. Instead, the incompetent are often blessed with an inappropriate confidence, buoyed by something that feels to them like knowledge. 
An ignorant mind is precisely not a spotless, empty vessel, but one that’s filled with the clutter of irrelevant or misleading life experiences, theories, facts, intuitions, strategies, algorithms, heuristics, metaphors, and hunches that regrettably have the look and feel of useful and accurate knowledge.
Now, allow me to say right away that I'm not calling evangelicals incompetent and/or ignorant as a group.  I have a friend who is a diehard evangelical, and he's one of the best-read, most thoughtful (in both senses of the word) people I know.  But what I am pointing out is that people are poor judges of their own understanding and attitudes -- and on that level, Dunning's second paragraph is referring to all of us.

So Ecklund's data, and Grossman's conclusions from it, are not so much wrong as they are irrelevant. It doesn't matter if evangelicals think they're supportive of science, just like my opinion of my own driving ability isn't necessarily reflective of reality.  I'm much more likely to take the evangelicals' wholesale rejection of evolution and climate science as an indication of their lack of support and/or understanding of science than I would their opinions regarding their own attitudes toward it.

And, of course, there's that troubling 30% of evangelicals who do see religion and science as opposed, a group that Grossman glides right past.  She does, however, admit that scientists would probably find it "troubling" that 60% of evangelicals say that "scientists should be open to considering miracles in their theories."

Troubling doesn't begin to describe it, lady.


That doesn't stop Grossman from painting the Religious Right as one big happy science-loving family, and she can't resist ending by giving us secular rationalists a little cautionary kick in the ass:
[S]cientists who want to write off evangelical views as inconsequential may not want to celebrate those trends [that young people are leaving the church in record numbers]. The trend to emphasize personal experience and individualized spirituality over the authority of Scripture or religious denominational theology is part of a larger cultural trend toward rejecting authority. 
The next group to fall victim to that trend could well be the voices of science.
Which may be the most obvious evidence of all that Grossman herself doesn't understand science.  Science doesn't proceed by authority; it proceeds by hard evidence.  Stephen Hawking, one of the most widely respected authorities in physics, altered his position on information loss in black holes after theoretical work convinced him that another scientist, John Preskill, had been right all along -- and he publicly conceded their long-standing bet.  Significantly, no one -- including Hawking himself -- said, "you have to listen to me, I'm an authority."

If anything, the trend of rejection of authority and "personal experience" works entirely in science's favor.  The less personal bias a scientist has, the less dependence on the word of authority, the more (s)he can think critically about how the world works.

So all in all, I'd like to thank Grossman and Ecklund for the good news, even if it arrived in odd packaging.  Given my own set of biases, I'm not likely to see the data they so lauded in anything but an optimistic light.

Just like I do my own ability to drive.  Because whatever else you might say about me, I have mad driving skills.

Monday, February 9, 2015

The random comment department

Two news stories I came across this weekend are mostly interesting in juxtaposition.

First, a paper in the Journal of Advertising, by Ioannis Kareklas, Darrel Muehling, and T. J. Weber of Washington State University, tells a frightening but unsurprising story.  Their study shows that people presented with information about vaccination safety consider online comments from random individuals to be just as credible as information from institutions like the Centers for Disease Control and Prevention.

[image courtesy of the Wikimedia Commons]

Here's how the experiment was set up:
Participants were led to believe that the pro-vaccination PSA was sponsored by the U.S. Centers for Disease Control and Prevention (CDC), while the anti-vaccination PSA was sponsored by the National Vaccine Information Council (NVIC). Both PSAs were designed to look like they appeared on each organization's respective website to enhance validity. 
The PSAs were followed by comments from fictitious online commenters who either expressed pro- or anti-vaccination viewpoints. Participants weren't told anything about who the commenters were, and unisex names were used to avoid potential gender biases.
The researchers then presented participants with a questionnaire to determine how (or if) their views on vaccination had changed.

"The results kind of blew us away," said Kareklas in a press release.  "People were trusting the random online commenters just as much as the PSA itself."

Which, as I said, is disheartening but unsurprising, given that people like Jenny McCarthy are the public version of a Random Online Commenter.

Kareklas et al. followed this up with a second study, to see if the commenters were believed even more strongly if they were identified as doctors (as opposed to one of two other professions).  The commenters who were self-identified doctors had an even stronger effect -- i.e., they outweighed the CDC information even more.

Which explains "Dr." Andrew Wakefield.

And this brings me to the second story, which comes out of Kansas -- where a bill has been introduced into the legislature that would prevent professionals from mentioning their titles or credentials in opinion articles and letters to the editor.

The story about how House Bill 2234 was introduced is interesting in and of itself.  The bill was offered into committee by Representative Virgil Peck (R-Tyro), but Peck initially denied having done so.

"I introduce bills in committee sometimes when I’m asked out of courtesy," Peck said.  "It’s not because I have any skin in the game or I care about it.  I’m not even sure I introduced it, but if he said I did, I did."

Our leaders, ladies and gentlemen.  "Not even sure" what bills they introduce regarding issues they don't care about.

On the one hand, the Kareklas study does point out the danger: if someone thinks you're a doctor (for example), they're more likely to be swayed by your comments even if you're wrong.  But in what universe is the public better off not knowing the background of the person whose words they're reading?

Representative John Carmichael (D-Wichita) nailed it.  With regard to the bill, he said, "If you are in fact a professor at The University of Kansas, that is part of your identity and part of your resume.  To muzzle an academic in identifying him or herself, and their accomplishment, not only does it have the effect of denying them their right to free speech, it also denies the public the right to understand who is commenting and what their, perhaps, bias or interest might be."

And that last bit is the important part.  My views on education -- which I throw out frequently and often vehemently -- are clearly affected by the fact that I'm a public school teacher.  Whether that makes me more or less credible would, I suppose, depend on your viewpoint.  But how on earth would you be better informed by not knowing what my profession is, by having less information with which to evaluate what I've said?

The Kareklas study and the bill introduced by Virgil "What Bill Did I Just Introduce?" Peck highlight one tremendously important thing, however: the general public is incredibly bad at critical reading.  One of the most important things you can do, when you read (or listen to) media, is to weigh what's being said against the facts and evidence, and consider the possibility of bias and appeal to authority.  The Kareklas study shows that we're pretty terrible at the former, and the Kansas bill proposes to eliminate our ability even to attempt the latter.

All of which makes it even more important that children be taught critical thinking skills.  Because if adults don't consider information from the Centers for Disease Control and Prevention to have more credibility than opinions coming from an online commenter, there's something seriously wrong.