The eminent physicist Stephen Hawking said, "The greatest enemy of knowledge is not ignorance; it is the illusion of knowledge."
Somewhat more prosaically, my dad once said, "Ignorance can be cured. We're all ignorant about some things. Stupid, on the other hand, goes all the way to the bone."
Both of these sayings capture an unsettling idea: that it's often more dangerous to think you understand something than it is to admit you don't. This idea was illustrated -- albeit with an innocuous example -- in a 2002 paper called "The Illusion of Explanatory Depth" by Leonid Rozenblit and Frank Keil of Yale University. They asked people to rate their level of understanding of a simple, everyday object (for example, how a zipper works) on a scale of zero to ten. Then they asked each participant to write down an explanation of how zippers work in as much detail as they could. Afterward, they asked the volunteers to re-rate their level of understanding.
Across the board, people rated themselves lower the second time, after a single question -- "Okay, then explain it to me" -- shone a spotlight on how little they actually knew.
The problem is, unless you're in school, usually no one asks the question. You can claim you understand something -- you can even have a firmly held opinion about it -- and there's no guarantee that your stance is even within hailing distance of reality.
And very rarely does anyone challenge you to explain yourself in detail.
As if that weren't bad enough, a recent paper by Adrian Ward (of the University of Texas at Austin) showed that not only do we understand far less than we think we do, we also fold what we learn from other sources into our own experiential knowledge, regardless of where that information came from. Worse still, the incorporation is so rapid and smooth that afterward, we aren't even aware of where our information (right or wrong) originated.
Here's how the abstract of Ward's paper puts it: "People frequently search the internet for information. Eight experiments provide evidence that when people 'Google' for online information, they fail to accurately distinguish between knowledge stored internally -- in their own memories -- and knowledge stored externally -- on the internet. Relative to those using only their own knowledge, people who use Google to answer general knowledge questions are not only more confident in their ability to access external information; they are also more confident in their own ability to think and remember. Moreover, those who use Google predict that they will know more in the future without the help of the internet, an erroneous belief that both indicates misattribution of prior knowledge and highlights a practically important consequence of this misattribution: overconfidence when the internet is no longer available. Although humans have long relied on external knowledge, the misattribution of online knowledge to the self may be facilitated by the swift and seamless interface between internal thought and external information that characterizes online search. Online search is often faster than internal memory search, preventing people from fully recognizing the limitations of their own knowledge. The internet delivers information seamlessly, dovetailing with internal cognitive processes and offering minimal physical cues that might draw attention to its contributions. As a result, people may lose sight of where their own knowledge ends and where the internet's knowledge begins. Thinking with Google may cause people to mistake the internet's knowledge for their own."
I recall vividly trying, with minimal success, to fight this in the classroom. Presented with a question, many students don't stop to try to work it out themselves; they immediately jump to looking it up on their phones. (This is one of many reasons I had a rule against having phones out during class -- another exercise in frustration, given how clever teenagers are at hiding what they're doing.) I tried to make the point over and over that there's a huge difference between looking up a fact (such as the average number of cells in the human body) and looking up an explanation (such as how RNA works). I use Google and/or Wikipedia for the former all the time. The latter, on the other hand, makes it all too easy simply to copy down what you find online, letting you fill in the blank with an answer irrespective of whether you have the least idea what any of it means.
Even Albert Einstein, pre-internet though he was, saw the difference, and the potential problem therein. Once asked how many feet were in a mile, the great physicist replied, "I don't know. Why should I fill my brain with facts I can find in two minutes in any standard reference book?"
In the decades since Einstein said this, those two minutes have shrunk to about ten seconds, as long as you have internet access. And unlike the standard reference books he mentioned, you have little assurance that the information you find online is even close to right.
Don't get me wrong; I think that our rapid, and virtually unlimited, access to human knowledge is a good thing. But like most good things, it comes at a cost, and that cost is that we have to be doubly cautious to keep our brains engaged. Not only is there information out there that is simply wrong, there are people who are (for various reasons) very eager to convince you they're telling the truth when they're not. This has always been true, of course; it's just that now, there are few barriers to having that erroneous information bombard us all day long -- and Ward's paper shows just how quickly we can fall for it.
The cure is to keep our rational faculties online. Find out whether the information comes from somewhere reputable and reliable. Compare what you're being told with what you know to be true from your own experience. Listen to or read multiple sources of information -- not only the ones you're inclined to agree with automatically. It might be reassuring to live in an echo chamber of people and media that always concur with our own preconceived notions, but it also means that if something is wrong, you probably won't realize it.
Like I said in Saturday's post, finding out you're wrong is no fun. More than once I've posted stuff here at Skeptophilia and gotten pulled up by the short hairs when someone who knows better tells me I've gotten it dead wrong. Embarrassing as it is, I've always posted retractions, and often taken the original post down. (There's enough bullshit out on the internet without my adding to it.)
So we all need to be on our guard whenever we're surfing the web or listening to the news or reading a magazine. Our tendency to absorb information without question, regardless of its provenance -- especially when it seems to confirm what we want to believe -- is a trap we can all fall into, and Ward's paper shows that once inside, it can be remarkably difficult to extricate ourselves.