This was the genesis of the Critical Thinking class that it is my privilege to teach. I was given the green light to develop the curriculum, and (if I can indulge in a moment of self-congratulation here) it has become one of the most popular electives in the school.
Critical thinking is a skill, and like every skill, it (1) doesn't necessarily come naturally, but (2) becomes easier the more you do it. As humans, we come pre-programmed with a whole host of cognitive biases we have to learn to work around -- dart-thrower's bias (the tendency of people to pay more attention to outliers), a natural bent for magical thinking, the unfortunate likelihood of our memories being malleable, inaccurate, or outright false. But with time and effort, you can learn some strategies for sifting fact from fiction, for detecting when you're being hoodwinked or misled.
In other words, a skeptical approach can be taught.
I'm delighted to say that great strides are being taken in this area outside of my little rural school district. Right now, a pilot program in Uganda, led by Sir Iain Chalmers of the Cochrane Collaboration, is testing a new curriculum for critical thinking with respect to health and medicine with 15,000 grade-school children. Chalmers is unequivocal about the program's intent; what he wants, he says, is for kids to be able to "detect bullshit when bullshit is being presented to them."
Yes, I know that in the industrialized world we enjoy the highest life expectancy in human history, and we've virtually eradicated dozens of infectious diseases using exactly the sort of "allopathic" medicine that Oz and his cronies rail against. This isn't about fact; it's about being swung around by your fears and emotions.
But we're not the only place in the world that has this problem. Central Africa, where Chalmers's trial is being run, is a hotbed of superstition, with people rejecting vaccines and antibiotics out of fear, in favor of "herbal remedies." Quack cures are common -- putting cow dung on burns, for example. Allen Nsangi, a researcher in Uganda who is working with Chalmers on the project, said that this practice is "almost the best-known treatment."
The Uganda project was the brainchild of Andy Oxman, research director at the Norwegian Institute of Public Health. "Working with policymakers made it clear most adults don’t have time to learn, and they have to unlearn a lot of stuff," Oxman said. "I’m looking to the future. I think it’s too late for my generation... My hope is that these resources get used in curricula in schools around the world, and that we end up with the children ... who become science-literate citizens and who can participate in sensible discussion about policy and our health."
All of which I find tremendously encouraging. (Not the part about my generation being a lost cause, because I don't really think that's true, honestly.) If we can equip children with a good skeptical toolkit, they'll be much less likely to get taken advantage of -- not only in the realm of health, but in every other way. These skills aren't limited to one discipline. Once you've adopted a skeptical outlook, you'll find that you apply it to everything.
At least that's my hope. It's certainly what I've seen in my own classes. As one of my students told me not long ago, "I thought at first that it was impossible to do what you were asking us to do -- to read and listen in order to evaluate, not just to memorize and regurgitate. But now I can't help myself. When I read something, I think, 'Okay, how do I know this is true? What's the evidence? Could there be another explanation?'"
Which is exactly it. Skepticism isn't cynicism; disbelieving everything out of hand is as lazy as gullibility. But it's essential that we learn to consider what we're hearing rather than simply trusting that we're being told the truth. As Satoshi Kanazawa put it: "There are only two legitimate criteria by which you may evaluate scientific ideas: logic and evidence."