Now, to be fair, they work well enough. It'd be a pretty significant evolutionary disadvantage if our sensory processing organs and memory storage were as likely to be wrong as right. But a system that is built to work along the lines of "meh, it's good enough to get by, at least by comparison to everyone else" -- and let's face it, that's kind of how evolution works -- is inevitably going to miss a lot. "Our experience of reality," said neuroscientist David Eagleman, "is constrained by our biology." He talks about the umwelt -- the world as experienced by a particular organism -- and points out that each species picks up a different tiny slice of all the potential sensory inputs that are out there, and effectively misses everything else.
It also means that even among the inputs in our particular umwelt, the brain makes an executive decision about which bits deserve attention. People with normal hearing (for example) are constantly bombarded by background sounds, which most of us, most of the time, ignore as irrelevant. In my intro to neuroscience classes, I used to point this out by asking students how many of them were aware (prior to my asking the question) of the sound of the fan running in the heater. Afterward, of course, they were; beforehand, the sound waves were striking their ears and triggering nerve signals to the brain just like any other noise, but the brain was basically saying "that's not important." (Once it's pointed out, of course, you can't not hear it; one of my students came into my room four days later, scowled at me, and said, "I'm still hearing the heater. Thanks a lot.")
The point here is that we are about as far away from precision reality-recording equipment as you can get. What we perceive and recall is a small fraction of what's actually out there, and is remembered only incompletely and inaccurately.
[Image: Doors of Perception by Alan Levine (cogdogblog), Doors of Perception (15354754466), licensed under the Creative Commons CC BY 2.0]
Worst of all, what we do perceive and recall is also modified by what we think we should be perceiving and recalling. This point was underscored by some cool new research done by a team led by Hernán Anlló at the Université Paris Sciences et Lettres, which showed that all it takes is a simple (false) suggestion about what we're seeing to foul up our perception completely.
The experiment was simple and elegant. Subjects were shown a screen with an image of a hundred dots colored either blue or yellow. Some of the screens had exactly fifty of each; others were sixty/forty (one way or the other). The volunteers were then asked to estimate the proportions of the colors on a sequence of different screens, and to give an assessment of how confident they were in their guess.
The twist is that half of the group was given a "hint" -- a statement that in some of the screens, one of the colors was twice as frequent as the other. (This, of course, was never true.) And this "hint" caused the subjects not only to misestimate the color frequencies, but to be more confident in their wrong guesses -- especially the volunteers for whom a post-test showed a high inclination toward social suggestibility.
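For the curious, the setup lends itself to a quick back-of-the-envelope simulation. The sketch below is my own toy model, not the researchers' code; the perceptual noise level, the suggestibility weight, and the way the false hint pulls estimates toward a 2:1 ratio are all illustrative assumptions.

```python
import random

# Toy model of the dot-estimation task (illustrative only). A "screen"
# has 100 dots; an observer's estimate of the blue proportion is the true
# proportion plus Gaussian perceptual noise, and -- if a hint was given --
# a pull toward the hinted ratio ("one color is twice as frequent", i.e.
# ~2/3), scaled by suggestibility (0 = immune, 1 = fully swayed).

HINTED_PROPORTION = 2 / 3  # the false "twice as frequent" hint

def make_screen(rng, true_blue):
    """Return a list of 100 dot colors with the given blue proportion."""
    n_blue = round(100 * true_blue)
    dots = ["blue"] * n_blue + ["yellow"] * (100 - n_blue)
    rng.shuffle(dots)
    return dots

def estimate(rng, dots, suggestibility=0.0, hinted=False):
    """Simulated estimate of the blue proportion, optionally hint-biased."""
    true_blue = dots.count("blue") / len(dots)
    noisy = true_blue + rng.gauss(0, 0.03)        # perceptual noise
    if hinted:                                     # pull toward the false hint
        noisy += suggestibility * (HINTED_PROPORTION - noisy)
    return min(max(noisy, 0.0), 1.0)

rng = random.Random(42)
screen = make_screen(rng, true_blue=0.5)  # an exactly fifty/fifty screen

no_hint = [estimate(rng, screen) for _ in range(1000)]
hinted = [estimate(rng, screen, suggestibility=0.5, hinted=True)
          for _ in range(1000)]

print(f"mean estimate, no hint: {sum(no_hint) / len(no_hint):.3f}")
print(f"mean estimate, hinted:  {sum(hinted) / len(hinted):.3f}")
```

Even in this crude version, the hinted estimates drift away from the true fifty/fifty split toward the (false) 2:1 ratio, and the drift grows with the suggestibility parameter -- the same qualitative pattern the study reports.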
Easy to understand as the experiment is, it has some profound implications. "Information is circulating with unprecedented speed, and it even finds its way into our social feeds against our will sometimes," Anlló said. "It’s becoming increasingly difficult to observe events without having to go through some level of information on those events beforehand (e.g. buying a shirt, but not before reading its reviews online). What we are looking at in our research here is how much the information you receive is going to contribute to the construction of your perceptual reality, and fundamentally, what are the individual psychological features that condition the impact that that information will have in shaping what you see and think, whether you like it or not. Of course, we are not talking about enormous effects that can completely distort the world around you (e.g., no amount of false/imprecise information can make you misperceive a small bird as a 3-ton truck), but what our study shows is that, provided you are permeable enough to social influence (which we all are, the key here being how much), then false information can slightly shift your perception in whatever direction the information points."
What this means, of course, is that we have to be constantly aware of our built-in capacity for being fooled. And although we clearly vary in that capacity, we shouldn't fall into believing "I'm seeing reality; it's everyone else who is wrong." The truth is, we're all prone to inaccurate perception and recall, and all capable of having the power of suggestion alter what we see. "Perception is a complex construction, and information is never an innocent bystander in this process," Anlló said. "Always be informed, but make sure that your sources are of high quality, and trustworthy. Importantly, when I say high-quality I do not mean a source that you may trust because of emotional reasons or social links, but rather by the accuracy of the information they provide and the soundness of the evidence. Indeed, our experiment shows that your level of suggestibility to your social environment (how much you dress like your friends, or feel influenced by their taste in music) will also predict your permeability to perceptual changes triggered by false information. This, much like many other cognitive biases, is part of the human experience, and essentially nothing to worry about. Being susceptible to your social environment is actually a great thing that makes us humans thrive as a species, we just need to be aware of it and try our best to limit our exposure to bad information."
The most alarming thing of all, of course, is that the people who run today's news media are well aware of this capacity, and use it to reinforce the perception by their consumers that only they are providing accurate information. "Listen to us," they tell us, "because everyone else is lying to you." The truth is, there is no unbiased media; given that their profits are driven by telling viewers the bit of the news that supports what they think the viewers already want to believe, they have exactly zero incentive to provide anything like balance. The only cure is to stay as aware as we can of our own capacity for being fooled, and to stick as close to the actual facts as possible (and, conversely, as far away as possible from the talking heads and spin-meisters who dominate the nightly news on pretty much whichever channel you choose).
If our perceptions of something as simple and concrete as the number of colored dots on a screen can be strongly influenced by a quick and inaccurate "hint," how much easier is it to alter our perception of the world with respect to complex and emotionally-laden issues -- especially when there's a powerful profit motive on the part of the people giving us the hints?