Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Tuesday, April 2, 2024

Mysterious mountains

It's amazing how far human knowledge has come in only a hundred years.

Consider the following about the year 1924:

  • This is the year we would figure out that there are other galaxies beyond the Milky Way; before this, astronomers thought the Milky Way was all there was.  They called the galaxies they knew about (such as Andromeda and the Whirlpool Galaxy) "nebulae" (Latin for "clouds") and thought they were blobs of dust within our own galaxy.  This marks the moment we realized how big the universe actually is.
  • In 1924, the quantum nature of reality was still unknown; the first major papers by Heisenberg, Born, and Schrödinger would come out over the following two years.
  • It'd be another four years before the first antibiotic -- penicillin -- was discovered.
  • It'd be five years before Edwin Hubble announced his discovery of the relationship between galaxies' distances and their red shifts, which showed the universe is expanding and led to the Big Bang model of cosmology.
  • We'd have another seventeen years before we'd see the first electron micrograph of a virus; before that, it was known they caused disease, but no one knew what they were or had ever seen one.
  • It'd be another twenty years before DNA was shown to be the genetic material, and nearly a decade after that before Franklin, Watson, and Crick figured out its structure and the basics of how it works.
  • The first papers outlining the mechanics of plate tectonics were still forty years in the future; at this point, the only one who championed the idea that the continents moved was German geologist and climatologist Alfred Wegener, who was pretty much laughed out of the field because of it (and ultimately died in 1930 on an expedition to Greenland).

It's the last one that's germane to our topic today, which is a largely unexplained (and massive) feature of North Africa that goes to show that however far we've come, there are still plenty of things left for the scientists to explain.  It's called the Tibesti Massif, and it lies mostly in the far north of the country of Chad, with a bit spilling over the southern border of Libya.

It's a strange, remote, and forbidding landscape:

[Image is in the Public Domain courtesy of photographer Michael Kerling]

What's peculiar about it -- besides the fact that it looks like the "desert planet" set from Lost in Space -- is that its terrain was largely created by volcanism, despite the fact that it lies smack in the center of one of those "stable continental cratons" I talked about in my previous post.  It's got a very peculiar geology -- the basement rock is Precambrian granite, over which there's a layer of Paleozoic sandstone, but above that is a layer of basalt which is in some places three hundred meters thick.  Basalt is one of those mafic rocks I mentioned; iron-rich, silica-poor, and ordinarily associated with seafloor rift zones like Iceland and deep-mantle hotspots like Hawaii.  But over that are felsic rocks like dacite, rhyolite, and ignimbrite, which are usually found in explosive, subduction zone volcanoes like the ones in the Caribbean, Japan, and Indonesia.

What's odd about all this is that there's no mechanism known that would generate all these kinds of rocks from the same system.  The current guess is that there was a mantle hotspot that started in the late Oligocene Epoch, on the order of twenty-five million years ago, that has gradually weakened and incorporated lower-density continental rocks as the upwelling slowed, but the truth is, nobody really knows.

It's still active, too.  The Tibesti Massif is home to hot springs, mud pools, and fumaroles, some of which contain water at 80°C or above.

So we've got a volcanic region in the southern Sahara where, by conventional wisdom, there shouldn't be one, with a geology that thus far has defied explanation.  Some geologists have tried to connect it to the Cameroon Line or the East African Rift Zone, but the truth is, Africa is a much bigger place than most people think it is, and it's a very long way away from either one.  (It's about three thousand kilometers from the northernmost active volcanoes in both Cameroon and Ethiopia to the southern edge of the Tibesti Massif; that's roughly the distance between New York City and Denver, Colorado.  So connecting Tibesti to either the Cameroon Line or the East African Rift is a bit like trying to explain the geology of Long Island using processes happening in the Rocky Mountains.)
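If you want to check that scale comparison for yourself, a few lines of Python will do it.  This is just a back-of-the-envelope great-circle (haversine) calculation; the coordinates below are rounded values I'm supplying purely for illustration, not figures from any survey.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points on Earth, in kilometers."""
    r = 6371.0  # mean Earth radius, km
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * r * asin(sqrt(a))

# Rounded coordinates, good enough for a sense of scale
new_york = (40.7, -74.0)
denver = (39.7, -105.0)

print(round(haversine_km(*new_york, *denver)), "km")  # roughly 2,600 km
```

Swap in coordinates for whatever pair of points you like; the takeaway is simply that these are continental-scale separations.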

And the problem is, figuring out this geological conundrum isn't going to be easy.  It's one of the most remote and difficult-to-access places on Earth; access is hampered not only by the fact that there are virtually no roads but also by the one-two punch of extreme poverty and political instability in the country of Chad.  So even getting a scientific team in to take a look at the place is damn near impossible.  The geologists studying the region have resorted to -- I swear I'm not making this up -- using comparisons to research on the geology of volcanoes on Mars, because even that is easier than getting a team into northern Chad.

The idea that we have a spot on the Earth still so deeply mysterious, despite everything we've learned, is both astonishing and thrilling.  Here we sit, in 2024, as arrogantly confident we have a bead on the totality of knowledge as the people did back in 1924, despite the fact that history has always shown such confidence in our understanding is unfounded.  The reality is humbling, and far more exciting.  As Carl Sagan put it, "Somewhere, something amazing is waiting to be known."

I wonder what the next hundred years will bring, and if the people in 2124 will look back at us with that same sense of "how could they not have known that?"

Onward -- into the great unknown!

****************************************



Wednesday, September 21, 2022

Memory offload

In James Burke's brilliant series The Day the Universe Changed, there's a line that never fails to shock me when I think about it, but which goes by so quickly you might miss it if you're not paying attention.  (This is typical of Burke -- I've heard his deservedly famous series Connections described as being like "watching a pinball game on fast-forward.")

The line comes up at the beginning of the last episode, "Worlds Without End," in which he's giving a quick summary of humankind's progression through technology.  He says, "In the fifteenth century, the invention of the printing press took our memories away."

Recording our knowledge in some kind of semi-permanent fashion removes the need to keep anything important in memory.  I'm riffing on that concept in my current work-in-progress, The Scattering Winds, which is about a post-apocalyptic world in which some parts of society in what is now the United States have gone back to being non-literate.  All of the knowledge of the culture is entrusted to the mind of one person -- the Keeper of the Word -- whose sacred task it is to remember all lore, language, music, and history.

Then, because of a refugee from another place, the apprentice to the Keeper learns about written language, acquires the rudiments of reading, and goes in search of any books that might have survived the disasters and plagues that ended the world as we know it.  He realizes that this (re)discovery will end the vocation he's studied his whole life for, but the lure of lost knowledge is too powerful to resist even so.

He knows that in a very real sense, the rediscovery of written language will take his memory away.

The internet, of course, has only deepened the scope of the problem.  A few years ago, I had a student who had what seemed to me a weird approach to figuring things out.  When presented with a question he didn't know the answer to, his immediate response was to pull out his school-issued iPad and Google it.  Often, he didn't even give his brain a chance to wrestle with the question; if the answer wasn't immediately obvious, out came the electronics.

"What have you learned by doing that?" I recall asking him, trying to keep the frustration out of my voice.

"I got the right answer," he said.

"But the answer isn't the point!"  Okay, at that point my frustration was pretty clear.

I think the issue I had with this student comes from two sources.  One is the education system's unfortunate emphasis on Getting The Right Answer -- that if you have The Right Answer on your paper, it doesn't matter how you got it, or whether you really understand how to get there.  But the other is our increasing reliance on what amounts to external memory.  When we don't know something, the ease and accessibility of answers online makes us default to that, rather than taking the time to search our own memories for the answer.


The loss of our own facility for recall because of the external storage of information was the subject of a study in the journal Memory.  Called "Cognitive Offloading: How the Internet is Increasingly Taking Over Human Memory," the study, by cognitive psychologists Benjamin Storm, Sean Stone, and Aaron Benjamin, looked at how people approach the recall of information, and found that once someone has started relying on the internet, it becomes the go-to source, superseding one's own memory:
The results revealed that participants who previously used the Internet to gain information were significantly more likely to revert to Google for subsequent questions than those who relied on memory.  Participants also spent less time consulting their own memory before reaching for the Internet; they were not only more likely to do it again, they were likely to do it much more quickly.  Remarkably, 30% of participants who previously consulted the Internet failed to even attempt to answer a single simple question from memory.
This certainly mirrors my experience with my students.  Not all of them were as hooked to their electronics as the young man in my earlier anecdote, but it is more and more common for students to bypass thinking altogether and jump straight to Google.

"Memory is changing," lead author Storm said.  "Our research shows that as we use the Internet to support and extend our memory we become more reliant on it.  Whereas before we might have tried to recall something on our own, now we don't bother.  As more information becomes available via smartphones and other devices, we become progressively more reliant on it in our daily lives."

What concerns me is something that the researchers say was outside the scope of their research: what effect this might have on our own cognitive processes.  It's one thing if the internet becomes our default while our memories remain intact, ready to step in should the Almighty Google not be available.  It's entirely another if our continual reliance on external "offloaded" memory ultimately weakens our own ability to process, store, and recall.  It's not as far-fetched as it sounds; there have been studies suggesting that mental activity can stave off or slow down dementia, so the "if you don't use it, you lose it" aphorism may apply just as much to our brains as it does to our muscles.

In any case, maybe it'd be a good idea for all of us to put away the electronics.  No one questions the benefits of weightlifting if you're trying to gain strength; maybe we should push ourselves into the mental weightlifting of processing and recalling without leaning on the crutch of the internet.  And as Kallian discovers in The Scattering Winds, the bounty that comes from the external storage of information -- be it online or in print -- comes at a significant cost to our own reverence for knowledge and depth of understanding.

****************************************


Tuesday, February 9, 2021

Fooling the experts

I was bummed to hear about the death of the inimitable Cloris Leachman a week and a half ago at the venerable age of 94.  She was probably most famous for her role as Frau Blücher *wild neighing horse noises* in the movie Young Frankenstein, but I was first introduced to her unsurpassed sense of comic timing in the classic 1970s sitcom The Mary Tyler Moore Show, where she played the tightly-wound self-styled intellectual Phyllis Lindstrom.

One of my favorite moments in that show occurred when Phyllis was playing a game of Scrabble against Mary's neighbor Rhoda Morgenstern (played with equal panache by Valerie Harper).  Rhoda puts down the word oxmersis, and Phyllis challenges it.

"There's no such thing as 'oxmersis,'" Phyllis says.

Rhoda looks at her, aghast.  "Really, Phyllis?  I can not believe that someone who knows as much about psychology as you do has never heard of oxmersis."

Long pause, during which you can almost see the gears turning in Phyllis's head.  "Oh," she finally says.  "That oxmersis."

I was immediately reminded of that scene when I ran into a paper while doing some background investigation for yesterday's post, which was about psychologist David Dunning's research with Robert Proctor regarding the deliberate cultivation of stupidity.  This paper looked at a different aspect of ignorance -- what happens when you combine the Dunning-Kruger effect (people's tendency to overestimate their own intelligence and abilities) with a bias called Appeal to Authority.

Appeal to Authority, you probably know, is when someone uses credentials, titles, or educational background -- and no other evidence -- to support a claim.  Put simply, it is the idea that if Richard Dawkins said it, it must be true, regardless of whether the claim has anything to do with Dawkins's particular area of expertise, evolutionary biology.  (I pick Dawkins deliberately, because he's fairly notorious for having opinions about everything, and seems to relish being the center of controversy regardless of the topic.)  

Dunning teamed up with Cornell University researchers Stav Atir and Emily Rosenzweig, and came up with what could be described as the love child of Dunning-Kruger and Appeal to Authority.  And what this new phenomenon -- dubbed, predictably, the Atir-Rosenzweig-Dunning Effect -- shows us is that people who are experts in a particular field tend to think their expertise holds true even for disciplines far outside their chosen area of study, and because of that are more likely to fall for plausible-sounding falsehoods -- like Phyllis's getting suckered by Rhoda's "oxmersis" bluff.

[Image is in the Public Domain]

In one experiment, the three researchers asked people to rate their own knowledge in various academic areas, then asked them to rank their level of understanding of various finance-related terms, such as "pre-rated stocks, fixed-rate deduction and annualized credit."  The problem is, those three finance-related terms actually don't exist -- i.e., they were made up by the researchers to sound plausible.

The test subjects who had the highest confidence level in their own fields were most likely to fall for the ruse.  Simon Oxenham, who described the experiments in Big Think, says it's only natural.  "A possible explanation for this finding," Oxenham writes, "is that the participants with a greater vocabulary in a particular domain were more prone to falsely feeling familiar with nonsense terms in that domain because of the fact that they had simply come across more similar-sounding terms in their lives, providing more material for potential confusion."

Interestingly, subsequent experiments showed that the correlation holds true even if you take away the factor of self-ranking.  Presumably, someone who is cocky and arrogant and ranks his/her ability higher than is justified in one area would be likely to do it in others.  But when they tested the subjects' knowledge of terms from their own field -- i.e., actually measured their expertise -- high scores still correlated with overestimating their knowledge in other areas.

And telling the subjects ahead of time that some of the terms might be made up didn't change the results. "[E]ven when participants were warned that some of the statements were false, the 'experts' were just as likely as before to claim to know the nonsense statements, while most of the other participants became more likely in this scenario to admit they’d never heard of them," Oxenham writes.
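To make the logic of that analysis concrete, here's a toy sketch of how an "overclaiming" score might be tallied and compared with genuine knowledge.  The numbers and field names below are entirely made up by me for illustration; the real study's materials and statistics were considerably more sophisticated.

```python
# Toy "overclaiming" analysis -- hypothetical data, not the actual
# Atir-Rosenzweig-Dunning materials or results.
from statistics import correlation  # available in Python 3.10+

# Each (imaginary) participant: accuracy on real terms from their field,
# and the fraction of made-up terms (like "pre-rated stocks") they
# claimed to be familiar with.
participants = [
    {"real_term_score": 0.9, "fake_terms_claimed": 0.6},
    {"real_term_score": 0.8, "fake_terms_claimed": 0.5},
    {"real_term_score": 0.5, "fake_terms_claimed": 0.2},
    {"real_term_score": 0.3, "fake_terms_claimed": 0.1},
]

real = [p["real_term_score"] for p in participants]
fake = [p["fake_terms_claimed"] for p in participants]

# A positive correlation is the pattern the study reports: the greater the
# measured expertise, the greater the willingness to claim knowledge of
# nonsense terms.
print(f"correlation = {correlation(real, fake):.2f}")  # ~0.99 for this contrived data
```

The arithmetic is trivial, of course; the interesting part is the pattern it captures.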

I have a bit of anecdotal evidence supporting this result from my experience in the classroom.  On multiple-choice tests, I had to concoct plausible-sounding wrong answers as distractors.  Every once in a while, I ran out of good wrong answers, and just made something up.  (On one AP Biology quiz on plant biochemistry, I threw in the term "photoglycolysis," which sounds pretty fancy until you realize that there's no such thing.)   What I found was that it was the average to upper-average students who were the most likely to be taken in.  The top students didn't get fooled because they knew what the correct answer was; the lowest students were equally likely to pick any of the wrong answers, because they didn't understand the material well.  The mid-range students saw something that sounded technical and vaguely familiar -- and figured that if they weren't sure, it must be that they'd missed learning that particular term.

It was also the mid-range students who were most likely to miss questions where the actual answer seemed too simple.  Another botanical question I liked to throw at them was, "What do all non-vascular land plants have in common?"  I always provided three wrong answers with appropriately technical-sounding jargon.

The actual answer is, "They're small."

Interestingly, the reason for the small size of non-vascular land plants (the most familiar example is moss) isn't simple at all.  But the answer itself just looked too easy to merit being the correct choice on an AP Biology quiz.

So Atir, Rosenzweig, and Dunning have given us yet another mental pitfall to watch out for -- our tendency to use our knowledge in one field to overestimate our knowledge in others.  But I really should run along, and make sure that the annualized credit on my pre-rated stocks exceeds the recommended fixed-rate deduction.  I worry a lot about that kind of thing, but I suppose my anxiety is really just another case of excessive oxmersis.

*********************************

Science writer Elizabeth Kolbert established her reputation as a cutting-edge observer of the human global impact in her wonderful book The Sixth Extinction (which was a Skeptophilia Book of the Week a while back).  This week's book recommendation is her latest, which looks forward to where humanity might be going.

Under a White Sky: The Nature of the Future is an analysis of what Kolbert calls "our ten-thousand-year-long exercise in defying nature," something that immediately made me think of another book I've recommended -- the amazing The Control of Nature by John McPhee, the message of which was generally "when humans pit themselves against nature, nature always wins."  Kolbert takes a more nuanced view, and considers some of the efforts scientists are making to reverse the damage we've done, from conservation of severely endangered species to dealing with anthropogenic climate change.

It's a book that's always engaging and occasionally alarming, but overall, deeply optimistic about humanity's potential for making good choices.  Whether we turn that potential into reality is largely a function of educating ourselves regarding the precarious position into which we've placed ourselves -- and Kolbert's latest book is an excellent place to start.




Friday, April 3, 2020

The risk of knowing

One of the hallmarks of the human condition is curiosity.  We spend a lot of our early years learning by exploring, by trial-and-error, so it makes sense that curiosity should be built into our brains.

Still, it comes at a cost.  "Curiosity killed the cat" isn't a cliché for nothing.  The number of deaths in horror movies alone from someone saying, "I hear a noise in that abandoned house, I think I'll go investigate" is staggering.  People will take amazing risks out of nothing but sheer inquisitiveness -- so the gain in knowledge must be worth the cost.

[Image is in the Public Domain]

The funny thing is that we'll pay the cost even when what we gain isn't worth anything.  This was demonstrated by a clever experiment described in a paper by Johnny King Lau and Kou Murayama (of the University of Reading, U.K.), Hiroko Ozono (of Kagoshima University), and Asuka Komiya (of Hiroshima University) that came out two days ago.  Entitled "Shared Striatal Activity in Decisions to Satisfy Curiosity and Hunger at the Risk of Electric Shocks," the paper describes a set of experiments showing that humans will risk a painful shock to find out entirely useless information (in this case, how a card trick was performed).  The cleverest part of the experiments, though, is that they told test subjects ahead of time how much of a chance there was of being shocked -- so they had a chance to decide, "how much is this information worth?"

What they found was that even when told there was a higher than 50% chance of being shocked, most subjects were still curious enough to take the risk.  The authors write:
Curiosity is often portrayed as a desirable feature of human faculty.  However, curiosity may come at a cost that sometimes puts people in harmful situations.  Here, using a set of behavioural and neuroimaging experiments with stimuli that strongly trigger curiosity (for example, magic tricks), we examine the psychological and neural mechanisms underlying the motivational effect of curiosity.  We consistently demonstrate that across different samples, people are indeed willing to gamble, subjecting themselves to electric shocks to satisfy their curiosity for trivial knowledge that carries no apparent instrumental value.
The researchers added another neat twist -- they used neuroimaging techniques to see what was going on in the curiosity-driven brain, and they found a fascinating overlap with another major driver of human behavior:
[T]his influence of curiosity shares common neural mechanisms with that of hunger for food.  In particular, we show that acceptance (compared to rejection) of curiosity-driven or incentive-driven gambles is accompanied by enhanced activity in the ventral striatum when curiosity or hunger was elicited, which extends into the dorsal striatum when participants made a decision.
So curiosity, then, is -- in nearly a literal sense -- a hunger.  The satisfaction we feel at taking a big bite of our favorite food when we're really hungry causes the same reaction in the brain as having our curiosity satisfied.  And like hunger, we're willing to take significant risks to satisfy our curiosity.  Even if -- to reiterate -- the person in question knows ahead of time that the information they're curious about is technically useless.
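If you want to see the shape of that gamble, here's a bare-bones expected-value framing.  To be clear, this is my own toy simplification for illustration, not the decision model the researchers actually used.

```python
# Toy expected-value framing of the curiosity gamble -- my simplification,
# not the model from the Lau et al. paper.

def accept_gamble(value_of_knowing: float, cost_of_shock: float,
                  p_shock: float) -> bool:
    """Accept if the subjective value of the answer outweighs the
    expected pain of the shock."""
    return value_of_knowing > p_shock * cost_of_shock

# Even at a 66% chance of being shocked, a strong enough itch to know how
# the card trick was done tips the decision toward "show me."
print(accept_gamble(value_of_knowing=7.0, cost_of_shock=10.0, p_shock=0.66))  # True
```

What the experiments suggest is that for a lot of us, the subjective value of even trivial knowledge is surprisingly large.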

I can definitely relate to this drive.  In me, it mostly takes the form of wasting inordinate amounts of time going down a rabbit hole online because some weird question came my way.  The result is that my brain is completely cluttered up with worthless trivia.  For example, I can tell you the scientific name of the bird you're looking at, why microbursts are common in the American Midwest, or the etymology of the word "juggernaut," but I went to the grocery store yesterday to buy three things and came back with only two of them.  (And I didn't realize I'd forgotten a third of the grocery order until I walked into the kitchen and started putting away what I'd bought.)

Our curiosity is definitely a double-edged sword.  I'm honestly fine with it, because often, knowing something is all the reward I need.  As physicist Richard Feynman put it, "The chief prize (of science) is the pleasure of finding things out."

So I suspect I'd have been one of the folks taking a high risk of getting shocked to see how the card trick was performed.  Don't forget that the corollary to the quote we started with -- "Curiosity killed the cat" -- is "...but satisfaction brought him back."

*******************************

In the midst of a pandemic, it's easy to fall into one of two errors -- to lose focus on the other problems we're facing, and to decide it's all hopeless and give up.  Both are dangerous mistakes.  We have a great many issues to deal with besides stemming the spread and impact of COVID-19, but humanity will weather this and the other hurdles we have ahead.  This is no time for pessimism, much less nihilism.

That's one of the main messages of Yuval Noah Harari's recent book 21 Lessons for the 21st Century.  He takes a good hard look at some of our biggest concerns -- terrorism, climate change, privacy, homelessness/poverty, even the development of artificial intelligence and how that might impact our lives -- and while he's not such a Pollyanna that he proposes instant solutions for any of them, he looks at how each might be managed, both in terms of combatting the problem itself and changing our own posture toward it.

It's a fascinating book, and worth reading to brace us up against the naysayers who would have you believe it's all hopeless.  While I don't think anyone would call Harari's book a panacea, at least it's the start of a discussion we should be having at all levels, not only in our personal lives, but in the highest offices of government.





Tuesday, April 17, 2018

Superior ignorance

I've written before on the topic of the Dunning-Kruger effect, the idea that we all tend to overestimate our own knowledge of a topic (parodied brilliantly by Garrison Keillor in his spot "News from Lake Wobegon" on A Prairie Home Companion -- where "all of the children are above average").


A study released last week in the Journal of Experimental Social Psychology gives us another window into this unfortunate tendency of the human brain.  In the paper "Is Belief Superiority Justified by Superior Knowledge?", by Michael P. Hall and Kaitlin T. Raimi, we find out the rather frustrating corollary to the Dunning-Kruger effect: that the people who believe their opinions are superior actually tend to know less about the topic than the people who have a more modest view of their own correctness.

The authors write:
Individuals expressing belief superiority—the belief that one's views are superior to other viewpoints—perceive themselves as better informed about that topic, but no research has verified whether this perception is justified.  The present research examined whether people expressing belief superiority on four political issues demonstrated superior knowledge or superior knowledge-seeking behavior.  Despite perceiving themselves as more knowledgeable, knowledge assessments revealed that the belief superior exhibited the greatest gaps between their perceived and actual knowledge.  
The problem, of course, is that if you think your beliefs are superior, you're much more likely to go around trying to talk everyone into believing like you do.  If you really are more knowledgeable, that's at least justifiable; but the idea that the less informed you are, the more likely you are to proselytize, is alarming to say the least.

There is at least a somewhat encouraging piece to this study, which indicated that this tendency may be remediable:
When given the opportunity to pursue additional information in that domain, belief-superior individuals frequently favored agreeable over disagreeable information, but also indicated awareness of this bias.  Lastly, experimentally manipulated feedback about one's knowledge had some success in affecting belief superiority and resulting information-seeking behavior.  Specifically, when belief superiority is lowered, people attend to information they may have previously regarded as inferior.  Implications of unjustified belief superiority and biased information pursuit for political discourse are discussed.
So belief-superior people are more likely to fall for confirmation bias (which you'd expect), but if you can somehow punch a hole in the self-congratulation, those people will be more willing to listen to contrary viewpoints.

The problem remains of how to get people to admit that their beliefs are open to challenge.  I'm thinking in particular of Ken Ham, who in the infamous Ken Ham/Bill Nye debate on evolution and creationism, was asked what, if anything, could change his mind.  Nye had answered the question that a single piece of incontrovertible evidence is all it would take; Ham, on the other hand, said that nothing, nothing whatsoever, could alter his beliefs.

Which highlights brilliantly the difference between the scientific and religious view of the world.

So the difficulty is that counterfactual viewpoints are often well insulated from challenge, and the people who hold them are resistant to considering even the slightest insinuation that they could be wrong.  I wrote last week about Donald Trump's unwillingness to admit he's wrong about anything, ever, even when presented with unarguable facts and data.  If that doesn't encapsulate the Dunning-Kruger attitude, and the Hall-Raimi corollary to it, I don't know what does.

Doesn't mean we shouldn't try, of course.  After all, if I thought it was hopeless, I wouldn't be here on Skeptophilia six days a week.  The interesting part of the study by Hall and Raimi, however, is the suggestion that we might be going about it all wrong.  The way to fix wrong-headed thinking may not be to present the person with evidence, but to get someone to see that they could, in fact, be wrong in a more global sense.  This could open them up to considering other viewpoints, and ultimately, looking at the facts in a more skeptical, open-minded manner.

On the other hand, I still don't think there's much we can do about Ken Ham and Donald Trump.

*********************
This week's Featured Book on Skeptophilia:

This week I'm featuring a classic: Carl Sagan's The Demon-Haunted World: Science as a Candle in the Dark.  Sagan, famous for his work on the series Cosmos, here addresses the topics of pseudoscience, skepticism, credulity, and why it matters -- even to laypeople.  Lucid, sometimes funny, always fascinating.




Thursday, June 30, 2016

Viral stupidity

My dad used to say that ignorance was only skin deep, but stupid goes all the way to the bone.

There's a lot to that.  Ignorance can be cured; after all, the word comes from the Latin in- (not) plus the ancient root gnō- (to know).  There are plenty of things I'm ignorant about, but I'm always willing to cure that ignorance by working at understanding.

Stupidity, on the other hand, is a different matter.  There's something willful about stupidity.  There's a stubborn sense of "I don't know and I don't care," leading to my dad's wise assessment that on some level stupidity is a choice.  Stupidity is not simply ignorance; it's ignorance plus the decision that ignorance is good enough.

What my dad may have not realized, though, is that there's a third circle of hell, one step down even from stupidity.  Science historian Robert Proctor of Stanford University has made this his area of study, a field he has christened agnotology -- the "study of culturally constructed ignorance."

Proctor is interested in something that makes stupidity look positively innocent; the deliberate cultivation of stupidity by people who are actually intelligent.  This happens when special interest groups foster confusion among laypeople for their own malign purposes, and see to it that such misinformation goes viral.  For example, this is clearly what is happening with respect to anthropogenic climate change.  There are plenty of people in the petroleum industry who are smart enough to read and understand scientific papers, who can evaluate data and evidence, who can follow a rational argument.  That they do so, and still claim to be unconvinced, is stupidity.

That they then lie and misrepresent the science in order to cast doubt in the minds of less well-informed people in order to push a corporate agenda is one step worse.

"People always assume that if someone doesn't know something, it's because they haven't paid attention or haven't yet figured it out," Proctor says.  "But ignorance also comes from people literally suppressing truth—or drowning it out—or trying to make it so confusing that people stop caring about what's true and what's not."

[image courtesy of Nevit Dilman and the Wikimedia Commons]

The same sort of thing accounts for the continuing claims that President Obama is a secret Muslim, that Hillary Clinton was personally responsible for the Benghazi attacks, that jet impacts were insufficient to bring down the Twin Towers on 9/11 so it must have been an "inside job."  Proctor says the phenomenon is even responsible for the spread of creationism -- although I would argue that this isn't quite the same thing.  Most of the people pushing creationism are, I think, true believers, not cynical hucksters who know perfectly well that what they're saying isn't true and are only spreading the message to bamboozle the masses.  (Although I have to admit that the "why are there still monkeys?" and "the Big Bang means that nothing exploded and made everything" arguments are beginning to seem themselves like they're one step lower than stupidity, given how many times these objections have been answered.)

"Ignorance is not just the not-yet-known, it’s also a political ploy, a deliberate creation by powerful agents who want you 'not to know'," Proctor says.  "We live in a world of radical ignorance, and the marvel is that any kind of truth cuts through the noise.  Even though knowledge is accessible, it does not mean it is accessed."

David Dunning of Cornell University, who gave his name to the Dunning-Kruger effect (the idea that people systematically overestimate their own knowledge), agrees with Proctor.  "While some smart people will profit from all the information now just a click away, many will be misled into a false sense of expertise," Dunning says.  "My worry is not that we are losing the ability to make up our own minds, but that it’s becoming too easy to do so.  We should consult with others much more than we imagine.  Other people may be imperfect as well, but often their opinions go a long way toward correcting our own imperfections, as our own imperfect expertise helps to correct their errors."

All of which, it must be said, is fairly depressing.  That we can have more information at our fingertips than ever before in history, and still be making the same damned misjudgments, is a dismal conclusion.  It is worse still that there are people who are taking advantage of this willful ignorance to push popular opinion around for their own gain.

So my dad is right; ignorance is curable, stupidity reaches the bone.  And what Proctor and Dunning study, I think, goes past the bone, all the way to the heart.