I've learned through the years that my feelings are an unreliable guide to evaluating reality.
Part of this, I suppose, comes from having fought depression for forty years. I know that what I'm thinking is influenced by my neurotransmitters, and given the fact that they spend a lot of the time out of whack, my sense that five different mutually-exclusive worst-case scenarios can all happen simultaneously is probably not accurate. It could be that this was in part what drove me to skepticism, and to my understanding that my best bet for making good decisions is to rely not on feelings, but on evidence.
It surprises me how many people don't get that. I saw two really good examples of this in the news last week, both of them centered around embattled President Donald Trump. In the first, he was questioned about why he was putting so much emphasis on securing the border with Mexico -- to the extent of sending in the National Guard -- when in fact, illegal border crossings are at a 46-year low. (You could argue that current levels are still too high; but the fact is, attempted border crossings have steadily dropped from a high of 1.8 million all the way back in 2000; the level now is about a quarter of that.)
I'm not here to discuss immigration policy per se. It's a complex issue and one on which I am hardly qualified to weigh in. What strikes me about this is that the powers-that-be are saying, "I don't care about the data, facts, and figures; the number of illegal migrants is increasing because I feel like it is."
An even more blatant example of trust-your-feelings-not-the-facts came from presidential spokesperson Sarah Huckabee Sanders, who has the unenviable and overwhelming job of doing damage control every time Trump lies about something. This time, it was at a roundtable discussion on taxes in West Virginia, where he veered off script and started railing about voter fraud. "In many places, like California, the same person votes many times — you've probably heard about that," he said. "They always like to say 'oh, that's a conspiracy theory' — not a conspiracy theory, folks. Millions and millions of people."
Of course, the states he likes to claim were sites of rampant voter fraud are always states in which he lost, because the fact that Hillary Clinton won the popular vote still keeps him up at night. But the fact is, he's simply wrong. A fourteen-year study by Loyola law professor Justin Levitt found only 31 instances, out of more than a billion votes analyzed, in which a "specific, credible allegation existed that someone pretended to be someone else at the polls."
To make it clear: 31 does not equal "millions and millions." And a fraud rate of 0.0000031% does not constitute "many times."
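That percentage is easy to verify for yourself. Here's a quick back-of-the-envelope check, using Levitt's figure of 31 credible instances against the roughly one billion ballots he examined:

```python
# Levitt's figures: 31 credible impersonation allegations
# out of roughly one billion ballots examined
instances = 31
ballots = 1_000_000_000

rate_percent = instances / ballots * 100
print(f"fraud rate: {rate_percent:.7f}%")  # fraud rate: 0.0000031%
```

Even a single million fraudulent votes would be more than thirty thousand times Levitt's figure.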
So, Trump lied. At this point, that's hardly news. It'd be more surprising if you turned on the news and found out Trump had told the truth about something. But when asked about this actual data, in juxtaposition to what Trump said, Sarah Sanders said, "The president still feels there was a large amount of voter fraud."
Wait, what?
What Trump or Sanders, or (for that matter) you or I, "feel" about something is completely irrelevant. If there's hard data available -- which there is, both on the border crossings and on allegations of voter fraud -- that is what should be listened to. And when you say something, and are confronted by someone who has facts demonstrating the opposite, the appropriate response is, "Whoa, okay. I guess I was wrong."
But that's if you're not Donald Trump. Trump never admits to being wrong. He doesn't have to, because he's surrounded himself with a Greek chorus of people like Sanders (and his sounding boards over at Fox News) who, no matter what Trump says or does, respond, "Exactly right, sir. You're amazing. A genius. Your brain is YUUUGE."
Hell, he said a couple of years ago that he could kill someone in full view on 5th Avenue and not lose a single supporter, and we had a rather alarming proof of that this week when a fire broke out at Trump Tower on, actually, 5th Avenue -- which, contrary to the law, had no fire alarms or sprinkler system installed -- killing one man and injuring six.
The response? One Trump supporter said that the man who died had deliberately set the fire to make Trump look bad, and then didn't get out in time.
Facts don't matter. "I feel like Trump is a great leader and a staunch Christian" wins over "take a look at the hard data" every time.
I'd like to say I have a solution to this, but this kind of fact-resistance is so self-insulating that there's no way in. It's like living inside a circular argument. "Trump is brilliant because I feel like he's brilliant, so anything to the contrary must be a lie." And when you have Fox News pushing this attitude hard -- ignoring any information to the contrary -- you can't escape.
If you doubt that, take a look at what Tucker Carlson was talking about while every other news agency in the world was covering the raid on Trump lawyer Michael Cohen's office: a piece on how "pandas are aggressive and sex-crazed." (No, I'm not making this up. An actual quote: "You know the official story about pandas — they’re cute but adorably helpless, which is why they are almost extinct. But like a lot of what we hear, that is a lie... The real panda is a secret stud with a thirst for flesh and a fearsome bite.")
That's some cutting-edge reporting, right there. No wonder Fox News viewers were found in a 2012 study to be the worst-informed of all thirty media sources studied -- even people who didn't watch the news at all answered more questions correctly.
So sorry to end on a rather dismal note, but it seems like until people decide to start valuing facts above feelings, we're kind of stuck. Honestly, the only answer I can come up with is educating children to be critical thinkers, but in the current environment of attacking teachers and public schools, I'm not sure that's feasible either.
In the interim, though, I'm gonna avoid pandas. Because they sound a lot sketchier than I'd realized.
Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.
Wednesday, April 11, 2018
Tuesday, September 26, 2017
Right in the gut
I know I've said it before, but it bears saying again: the strength of science lies in its reliance on hard evidence as the sine qua non of understanding.
I've tried to embrace this outlook myself, insofar as a fallible and biased human can do so. Okay, so every day I poke fun at all sorts of odd beliefs, sometimes pissing people off. But you know what? You want to convince me, show me some reliable evidence. For any of the claims I've scoffed at. Bigfoot. Ghosts. ESP. Astrology. Tarot divination. Homeopathy.
Even the existence of god.
I'm convinceable. All you have to do is show me one piece of irrefutable, incontrovertible evidence, and I'm sold.
The problem is, to my unending frustration and complete bafflement, most people don't approach the world that way. Instead, they rely on their gut -- which seems to me to be a really good way to get fooled. I'm a pretty emotional guy, and I know my gut is unreliable.
Plus, science just doesn't seem to obey common sense at times. As an example, consider the Theory of Relativity. Among its predictions:
- The speed of light is the ultimate universal speed limit.
- Light moves at the same speed in every reference frame (i.e., your own speed relative to the beam of light doesn't matter; you'll still measure it as traveling at roughly 300,000,000 meters per second).
- When you move, time slows down. The faster you move, the slower time goes. So if you took a round trip to Alpha Centauri (about 4.4 light-years away) in a rocket ship traveling at 95% of the speed of light, a little over nine years would pass on Earth, but only about three years would pass for you.
- When you move, to a stationary person your mass increases and your length in the direction of motion contracts. The faster you move, the more pronounced this effect becomes.
None of which we would know now if people relied solely on their gut to tell them how things work.
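The time-dilation prediction, at least, is easy to sanity-check with a little arithmetic. Here's a minimal sketch, assuming Alpha Centauri sits about 4.37 light-years away and the ship holds a constant cruising speed (ignoring acceleration and deceleration):

```python
import math

DISTANCE_LY = 4.37   # Earth-to-Alpha-Centauri distance in light-years (assumed)
V = 0.95             # cruising speed as a fraction of the speed of light

# Lorentz factor: gamma = 1 / sqrt(1 - v^2/c^2)
gamma = 1.0 / math.sqrt(1.0 - V ** 2)

# Round-trip duration measured on Earth, in years
# (distance in light-years divided by speed in units of c gives years)
earth_years = 2 * DISTANCE_LY / V

# Proper time experienced aboard the ship
traveler_years = earth_years / gamma

print(f"Lorentz factor: {gamma:.2f}")              # ~3.20
print(f"Earth clock:    {earth_years:.1f} years")  # ~9.2 years
print(f"Ship clock:     {traveler_years:.1f} years")  # ~2.9 years
```

At 95% of light speed the Lorentz factor is only about 3.2, so the traveler's clock runs roughly a third as fast as Earth's -- a dramatic effect, though not hundreds-of-years dramatic.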
Despite all this, there are people who still rely on impulse and intuition to tell them what's true and what's not. And now a study jointly conducted by researchers at Ohio State University and the University of Michigan has found that if you do this, you are more prone to being wrong.
[image courtesy of the Wikimedia Commons]

Kelly Garrett and Brian Weeks decided to look into the connection between how people view evidence, and their likelihood of falling for incorrect information. They looked at survey data from almost 3,000 people, in particular focusing on whether or not the respondents agreed with the following statements:

- I trust my gut to tell me what’s true and what’s not.
- Evidence is more important than whether something feels true.
- Facts are dictated by those in power.
They then correlated the responses with the participants' likelihood of believing a variety of conspiracy theories. Unsurprisingly, they found that the people who relied on gut feelings and emotions to determine the truth were far more likely to fall for conspiracies and outright untruths.
"Misperceptions don’t always arise because people are blinded by what their party or favorite news outlet is telling them," Weeks said. "While trusting your gut may be beneficial in some situations, it turns out that putting faith in intuition over evidence leaves us susceptible to misinformation."
"People sometimes say that it’s too hard to know what’s true anymore," Garrett said. "That’s just not true. These results suggest that if you pay attention to evidence you’re less likely to hold beliefs that aren’t correct... This isn’t a panacea – there will always be people who believe conspiracies and unsubstantiated claims – but it can make a difference."
I'd say it makes all the difference. And in the current political environment -- where accusations of "fake news" are thrown around right and left, and what people consider to be the truth depends more on political affiliation than on rational fact -- it's more essential than ever.