It's a necessary skill. Kids who have not taken the time to understand the topic being studied are notorious for bullshitting answers on essay questions, often padding their writing with vague but sciency-sounding words. An example is the following, which is verbatim (near as I can recall) from an essay on how photosynthesis is, and is not, the reverse of aerobic cellular respiration:
From analyzing photosynthesis and the process of aerobic cellular respiration, you can see that certain features are reversed between the two reactions and certain things are not. Aerobic respiration has the Krebs Cycle and photosynthesis has the Calvin Cycle, which are also opposites in some senses and not in others. Therefore, the steps are not the same. So if you ran them in reverse, those would not be the same, either.

I returned this essay with one comment: "What does this even mean?" The student in question at least had the gumption to admit he'd gotten caught. He grinned sheepishly and said, "You figured out that I had no idea what I was talking about, then?" I said, "Yup." He said, "Guess I better study next time."
I said, "Yup."
Developing a sensitive nose for bullshit is critical, and not only for teachers, because there's a lot of it out there, and not just in academic circles. Writer Scott Berkun addressed this in his wonderful piece, "How to Detect Bullshit," which gives some concrete suggestions for figuring out what is USDA grade-A prime beef, and what is the cow's other, less pleasant output. Some of the best are simply to ask the questions, "How do you know that?", "Who else has this opinion?", and "What is the counter-argument?"
You say your research will revolutionize the field?
Says who? Based on what evidence?
He also says to be very careful whenever anyone says, "Studies show," because if studies really did show what the writer claims, (s)he'd usually be specific about what those studies were. A vague "studies show" is often a red flag that the claim doesn't have much in its favor.
Using ten-dollar buzzwords is also a good way to cover up the fact that you're sailing pretty close to the wind. Berkun recommends asking, "Can you explain this in simpler terms?" If the speaker can't give you a good idea of what (s)he's talking about without resorting to jargon, the fancy verbiage is fairly likely to be there to mislead.
This is the idea behind BlaBlaMeter, a website I found out about from a student of mine, into which you can cut-and-paste text and get a score (from 0 to 1.0) for how much bullshit it contains. I'm not sure what the algorithm does besides detecting vague filler words, but it's a clever idea. It'd certainly be nice to have a rigorous way to detect it when you're being bamboozled with words.
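BlaBlaMeter's actual algorithm isn't public, so this is purely a guess at the flavor of the thing: a crude scorer that measures the density of vague filler words in a passage, with an entirely made-up word list.

```python
import re

# Hypothetical list of vague filler words -- illustrative only, not
# BlaBlaMeter's real vocabulary.
FILLER_WORDS = {
    "certain", "things", "various", "aspects", "basically",
    "clearly", "obviously", "therefore", "essentially", "arguably",
}

def bullshit_score(text: str) -> float:
    """Return a 0.0-1.0 score: the fraction of words that are vague filler."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    filler = sum(1 for w in words if w in FILLER_WORDS)
    return round(min(1.0, filler / len(words)), 2)
```

Run against a sentence in the spirit of my student's essay ("Certain features are reversed and certain things are not."), a scorer like this flags a healthy chunk of the words as filler; a real tool would presumably weigh sentence structure, repetition, and jargon as well, not just a word list.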
The importance of being able to detect fancy-sounding nonsense was highlighted just this week by the acceptance of a paper for the International Conference on Atomic and Nuclear Physics -- when it turned out that the paper had been created by hitting iOS Autocomplete over and over. The paper, written (sort of) by Christoph Bartneck, associate professor at the Human Interface Technology laboratory at the University of Canterbury in New Zealand, was titled "Atomic Energy Will Have Been Made Available to a Single Source" (the title was also generated by autocomplete), and contained passages such as:
The atoms of a better universe will have the right for the same as you are the way we shall have to be a great place for a great time to enjoy the day you are a wonderful person to your great time to take the fun and take a great time and enjoy the great day you will be a wonderful time for your parents and kids.

Which, of course, makes no sense at all. In this case, I wonder if the reviewers simply didn't bother to read the paper -- or read a few sample sentences, found that they (unlike the above) made reasonable sense, and said, "Looks fine to me."
I'd like to think, though, that even given my lack of expert status on atomic and nuclear physics, I'd have figured out that what I was looking at was ridiculous.
On a more serious note, there's a much more pressing reason we all need to arm ourselves against bullshit: so much of what's on the internet is outright false. Buzzfeed News hired a team of political fact-checkers to sift through claims on politically partisan Facebook pages, and they found that, on average, a third of the claims made by partisan sites were outright false. And lest you think one side was better than the other, the study found that both right and left were making a great many unsubstantiated, misleading, or wrong claims. And we're not talking about fringe-y wingnut sites here; these were sites whose reposts, if you're on Facebook, you see on a daily basis -- Occupy Democrats, Eagle Rising, Freedom Daily, The Other 98%, Addicting Info, Right Wing News, and U.S. Uncut.
What this means is that when you see posts from these sites, there is (overall) about a 2/3 chance that what you're seeing is true. So if you frequent those pages -- or, more importantly, if you're in the habit of clicking "share" on every story that you find mildly appealing -- you damn well better be able to figure out which third is wrong.
The upshot is, we all need better bullshit filters. Given that we are bombarded daily by hundreds of claims, from the well-substantiated to the outrageous, it behooves us to find a way to determine which is which.
And, if you're curious, a 275-word passage from this Skeptophilia post was rated by BlaBlaMeter as having a bullshit rating of 0.13, which I find reassuring. Not bad, considering the topic I was discussing.