The study's principal author, Philip Fernbach of the University of Colorado, explained that the study grew out of an observation: people who loudly expressed views on politics often seemed to have little factual knowledge about the topics on which they were expounding.
"We wanted to know how it's possible that people can maintain such strong positions on issues that are so complex -- such as macroeconomics, health care, foreign relations -- and yet seem to be so ill-informed about those issues," Fernbach said.
The study asked a group of test subjects to rate how well they understood six political issues, including instituting merit pay for teachers, raising the Social Security retirement age, and enacting a flat tax. The subjects were then asked to explain two of the policies -- including their own position and why they held it -- and were questioned by the researchers on their understanding of the facts of each policy. Afterwards, they were asked to re-rate their level of comprehension.
Across the board, self-assessment scores went down on the issues the subjects had been asked to explain. More importantly, their positions shifted -- there was a distinct movement toward the center that occurred regardless of the participant's political affiliation. Further, the worse a person's explanation had been -- i.e., the more their ignorance of the facts had been exposed -- the further toward the center they shifted.
This seems to be further evidence of the Dunning-Kruger effect -- a cognitive bias in which the people with the least knowledge or skill in a domain tend to overestimate their competence the most. (It also brings to mind Dave Barry's comment, "Everyone thinks they're an above-average driver.")
I'm also reminded of Philip Tetlock's brilliant book Expert Political Judgment, which is summarized here but which anyone who is a student of politics or sociology should read in its entirety. In the research for the book, Tetlock analyzed the political pronouncements of hundreds of individuals, comparing experts' predictions in a variety of fields to actual outcomes in the real world, and used this information to draw some fascinating conclusions about human social behavior. The relevant part of his argument, for our purposes here, is that humans exhibit two basic "cognitive styles," which he calls "the fox and the hedgehog" (the metaphor comes from a fragment by the ancient Greek poet Archilochus, famously elaborated by Isaiah Berlin).
Foxes, Tetlock says, tend to be able to see multiple viewpoints and have a high tolerance for ambiguity (in the interest of conciseness, the quotes below are taken from the summary, not from the original book):
Experts who think in the 'Fox' cognitive style are suspicious of a commitment to any one way of seeing the issue, and prefer a loose insight that is nonetheless calibrated from many different perspectives. They use quantification of uncertain events more as calibration, as a metaphor, than as a prediction. They are tolerant of dissonance within a model - for example, that an 'enemy' regime might have redeeming qualities - and relatively ready to recalibrate their view when unexpected events cast doubt on what they had previously believed to be true.
In contrast to this, Hedgehogs work hard to exclude dissonance from their models. They prefer to treat events which contradict their expectations as exceptions, and to re-interpret events in such a way as to allocate exceptions to external events. For example, positive aspects of an enemy regime may be assigned to propaganda, either on the part of the regime or through its sympathizers... Hedgehogs tend to flourish and excel in environments in which uncertainty and ambiguity have been excluded, either by actual or artificial means. The mantra of "targets and accountability" was made by and for Hedgehogs.

The differences, Tetlock said, are irrespective of political leaning; there are conservative and liberal foxes, and conservative and liberal hedgehogs. Most importantly, though, the foxes' tolerance of many viewpoints, and their awareness of their own ignorance, make them appear to know less than they actually do, which lessens their influence on policy and society; the hedgehogs' certainty, and their clear, concise answers to complex problems, make them appear to know more than they actually do, which increases their influence.
Hedgehogs, Tetlock found, were more often wrong in their assessment of political situations, but their views achieved wide impact. Foxes were more often right -- but no one listened.
So, anyway, I read all of this with a vague sense of unease. Having a blog, after all, implies some level of arrogance -- a belief that your views are important, intelligent, and interesting enough that people, many of them total strangers, will want to read what you have to say. Given Fernbach's study, not to mention the Dunning-Kruger effect and the conclusions of Tetlock's research, it does leave me with a bit of a chill. Would my views on topics become less extreme if I were forced to reconsider the facts of the situation? Do I really think I'm more knowledgeable than I actually am? Worst of all (for a blogger), am I a simplistic thinker who is often wrong but whose views have wide social impact, or a complex thinker no one pays attention to?
Oy. I'm not sure I, um, want to reevaluate all this. I think I'll just go have breakfast. That sounds like a definitive solution to the problem, right?
Of course right.