If there's one phrase that torques the absolute hell out of me -- and just about every actual scientist out there -- it's, "Well, I did my research."
Oh, you did, did you? What lab did you do your research in? Or was it field work? Let's see your data! Which peer-reviewed journal published your research? How many times has it been cited in other scientific journals?
Part of the problem, of course, is that like a lot of words in the English language -- "theory" and "proof" are two examples that come to mind -- the word "research" is used one way by actual researchers and a different way by most other people. We were taught the alternate definition of "research" in grade school, when we were assigned "research papers," which meant "go out and look up stuff other people have found out on the topic, and summarize it in your own words." There's value in doing this; it's a good starting place for understanding a subject, and is honestly where we all began with scholarship.
The problem is -- and it exists even at the grade-school level of inquiry -- this kind of "research" is only as good as the sources you choose. When I was a teacher, one of the hardest things to get students to understand was that not all sources are created equal. A paper in Science, or even the layperson's version of it in Scientific American or Discover, is head-and-shoulders above the meanderings of Some Random Guy in his blog. (And yes, I'm well aware that this pronouncement is being made by Some Random Guy in his blog.)
That doesn't mean those less-reputable sources are necessarily wrong, of course. It's more that they can't be relied upon. While papers in Science (and other comparable journals) are occasionally retracted for errors or inaccuracies, there is a vetting process that makes their likelihood of being correct vastly higher. After all, any oddball with a computer can create a website, and post whatever they want on it, be it brilliant posts about cutting-edge science or the looniest of wingnuttery.
The confusion between the two definitions of the word research has the effect of increasing people's confidence in the kind we were all doing in middle school, and giving that low-level snooping about an undeserved gloss of reputability. This was the upshot of a paper in Nature (peer-reviewed science, that), by Kevin Aslett of the University of Central Florida et al., entitled, "Online Searches to Evaluate Misinformation Can Increase Its Perceived Veracity." Their results are kind of terrifying, if not unexpected given the "post-truth society" we've somehow slid into. The authors write:
Although conventional wisdom suggests that searching online when evaluating misinformation would reduce belief in it... across five experiments, we present consistent evidence that online search to evaluate the truthfulness of false news articles actually increases the probability of believing them... We find that the search effect is concentrated among individuals for whom search engines return lower-quality information. Our results indicate that those who search online to evaluate misinformation risk falling into data voids, or informational spaces in which there is corroborating evidence from low-quality sources.
The tendency appears to be that when someone is "doing their research" on a controversial subject, what they do is an online search, pursued until they find two or three hits on sources that corroborate what they already believed, and that strengthens their conviction that they were right in the first place. The study found that very little attention was usually given to the quality of those sources, or where those sources got the information themselves. If it makes the "researcher" nod sagely and say, "Yeah, that's what I thought," it doesn't matter if the information came from NASA -- or from QAnon.
The problem is, a lot of those bogus sources can look convincing.
People see data in some online source, and rarely consider (1) who collected the data and why, (2) how it was analyzed, (3) what information wasn't included in the analysis, and (4) whether it was verified, and if so how and by whom. I first ran into the old joke about "73.4% of all statistics are made up on the spot" years ago, and it's still funny, even if our laughs are rather wry these days. Sites like Natural News, Food Babe, Before It's News, Breitbart.com, Mercola.com, InfoWars, One America News, and even a few with scholarly-sounding names -- The Society for Scientific Exploration, Evolution News, and The American College of Pediatricians are three examples -- are clearinghouses for fringe-y and discredited ideas, often backed up by data that's either cherry-picked and misrepresented, or from sources even further down the ladder of sketchy credibility.
Given how much bullshit is out there, a lot of it well-hidden behind facts, figures, and fancy writing, it can be a challenge for laypeople (and I very much count myself among their number) to discern truth from fiction. It's also an uphill struggle to fight against the very natural human tendency of confirmation bias; we all would love it if our cherished notions of how the world works were one hundred percent correct. But if we want to make smart decisions, we all need to stop saying "I did my research" when all that "research" involved was a twenty-minute Google search to find the website of some random crank who confirmed what we already believed.
Remember, as the brilliant journalist Kathryn Schulz points out, that one of the most mind-expanding and liberating things we can say is, "I don't know. Maybe I'm wrong." Then we can start from that open-minded perspective and find out what the facts really are -- from the actual researchers.