Most of us are pretty certain of our own moral compasses. Circumstances might change, the attitudes of those around us might waver, but at least we know right from wrong -- and why we believe what we do.
A new study, headed by Lars Hall at Lund University in Sweden, seems to indicate that we're not as rock-solid in our beliefs as we think we are. [Source]
The researchers asked 160 volunteers to fill out a two-page survey rating how strongly they agreed with each of twelve statements on morally charged issues, from the ongoing struggles in the Middle East to prostitution to covert government surveillance of citizens. The researchers, however, employed a magic trick: each survey was actually composed of two sheets lightly stuck together, and the clipboard had a dab of adhesive on it. When the subject turned the sheet over to answer the back, the top sheet stuck to the clipboard and pulled away, revealing the second sheet. The participants' answers were also recorded there (presumably via some sort of carbon paper that impressed the answers onto the page as the subject wrote) -- but the second page had two questions that were different.
For example, sheet one (the questions originally answered by the subject) might have the statement, "Large-scale governmental surveillance of e-mail and Internet traffic ought to be forbidden as a means to combat international crime and terrorism." The subject then rated how strongly they agreed or disagreed with that statement. The second sheet had the same statement, with the word "permitted" substituted for "forbidden." So, if the subject strongly agreed with the first statement, you'd think they would strongly disagree with the second, and would recognize that the statement had been altered.
That's not what happened.
When shown their original responses attached to statements with the opposite meaning, half of the subjects did not notice any changes -- even when they were asked to read the statements, and their answers, out loud. Only 31% noticed every change that had been made. A full 53% were willing to argue in support of their answers on one or both of the statements that had been altered to mean the opposite of what they had actually responded to.
Hall and his team call this "choice blindness." Once we are confronted with evidence that we made a certain choice, many of us internalize that information even if it's in conflict with what we really believed at the time the choice was made. And after that, we are perfectly willing to argue in favor of our new opinions. Memory, of course, is plastic and unreliable, as a multitude of experiments have shown, and any good book on optical illusions can illustrate how easy it is to baffle our sensory apparatus. Now Hall's clever little experiment shows that our moral sense might be as easy to fool as our memory centers and sensory organs.
I find all of this simultaneously creepy and reassuring. I've always felt that on those topics I have strong moral opinions about, I wouldn't flex just because I'm around someone with different attitudes. I've always had the impression that my ethical sense is rooted in concrete. Okay, there are gray areas; but some things are simply wrong, and they'll always be wrong. It's kind of scary to think that if a sneaky researcher convinced me I'd answered a question the opposite way to how I actually feel, I could be tricked into arguing for something that (minutes ago) I'd disagreed with.
On the other hand, maybe it is a good thing that the human mind is as open as it is. After all, it's raging dogmatism that is now ripping the Middle East to pieces, after a moronic filmmaker made a banal fourteen-minute film insulting the Prophet Muhammad, and devout Muslims worldwide have responded by blowing themselves up and setting stuff on fire. Maybe the fact that we have such play in our moral compasses is a hopeful sign: if we're somehow forced into considering opposing viewpoints, we might actually be capable of seeing the other side of the argument.