Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Tuesday, August 14, 2018

Giving bad news to Pollyanna

Two months ago my younger son moved to Houston, Texas, for a new job, and although he's 27, this of course elicited all the usual parental worries from Carol and me.  But we submerged our nervousness, not to mention our awareness that this meant we'd only see him once or twice a year at best, and helped him pack up and get on his way.

He spent his last night in New York at our house, and left on a sunny Sunday morning with hugs and good lucks and farewells.  Five hours later I got the kind of telephone call from him that no parent ever wants to get.

"Dad?  I need some help.  I was in an accident.  The wheel fell off my truck."

After I got past the say-whats and what-the-fucks, and returned my heart rate to as near normal as I could manage, I asked him for details.  The bare facts are as follows.

He was heading down I-90 at the obligatory seventy miles per hour, out in the hinterlands of Ohio, when there was a loud bang and his truck skidded to the right.  What had apparently happened was that three weeks earlier, when he was having the tires replaced, the mechanic had overtightened one of the bolts and cracked it.  At some point the pressure made it give way, and the torque sheared off all four of the other bolts.

As luck would have it -- and believe me, there's a lot to credit luck with in this story -- the wheel went under his truck and got lodged, so he was skidding with the wheel and tire as padding.  He maneuvered his truck to the shoulder, miraculously without hitting anything or anyone, and without putting a scratch on his truck -- or himself.  But there he was, alongside the freeway ten miles from Ashtabula, wondering what the hell he was going to do.

The story ends happily enough; I called a tow truck and had him towed to a place where they botched a second repair job, but he figured that out before he'd gotten very far (believe me, now he's aware of every stray shudder or wobble), and we had him towed a second time to the Mazda dealership in Erie, Pennsylvania, where they completed the repair the right way.  The remainder of his journey to Houston was uneventful.

[Image licensed under the Creative Commons: Dual Freq, I-72 North of Seymour Illinois, CC BY-SA 3.0]

This all comes up because there's been a new study from University College London about our reactions to bad news, and how those reactions change when we're under stress.  The research team was made up of experimental psychologists Neil Garrett, Ana María González-Garzón, Lucy Foulkes, Liat Levita, and Tali Sharot (regular readers of Skeptophilia may recognize Sharot's name; she was part of a team that investigated why people find lying progressively less shame-inducing the more they do it, a study that I wrote about last year).

The Garrett et al. team's paper, "Updating Beliefs Under Perceived Threat," looked at why we are better at accepting positive news than negative.  It isn't, apparently, just wishful thinking, or resisting believing bad news.  The authors write:
Humans are better at integrating desirable information into their beliefs than undesirable.  This asymmetry poses an evolutionary puzzle, as it can lead to an underestimation of risk and thus failure to take precautionary action.  Here, we suggest a mechanism that can speak to this conundrum.  In particular, we show that the bias vanishes in response to perceived threat in the environment.  We report that an improvement in participants' tendency to incorporate bad news into their beliefs is associated with physiological arousal in response to threat indexed by galvanic skin response and self-reported anxiety.  This pattern of results was observed in a controlled laboratory setting (Experiment I), where perceived threat was manipulated, and in firefighters on duty (Experiment II), where it naturally varied.  Such flexibility in how individuals integrate information may enhance the likelihood of responding to warnings with caution in environments rife with threat, while maintaining a positivity bias otherwise, a strategy that can increase well-being.
In practice, they induced anxiety in one group of their test subjects by telling them that, as part of the experiment, they would have to give a public speech to a room full of listeners, and then asked them to estimate their risk of falling victim to a variety of dangers -- automobile accident, heart attack, homicide, and so on.  A second group (as the paragraph above explains) was exposed to anxiety-inducing situations naturally, because of their jobs as firefighters, and then given the same questions.  Each of those two groups was again split in two; one half was given bad news (that the chance of their experiencing the negative events was higher than they thought), and the other half good news (that the chance was lower than they thought).

The volunteers were then asked to re-estimate their odds of each of the occurrences.

And what they found was that the subjects who had experienced anxiety showed no Pollyanna bias -- they were much more realistic about estimating their odds, and revised their estimates either upward or downward, depending on which kind of news they'd been given.

More interesting were the people in the control group, who had not experienced anxiety.  The ones who were given good news readily revised their estimates of bad outcomes downward, but the ones given bad news barely budged.  It's as if they thought, "Hey, I'm feeling pretty good; I can't believe I was really that far off in estimating my risk."

My question is whether this might be the origin of anxiety disorders, which are a little hard to explain evolutionarily otherwise.  They're terribly common, and can be debilitating.  Could this be some kind of evolutionary misfire -- that in the risk-filled environments our ancestors inhabited, keeping some background level of anxiety made us more realistic about our likelihood of harm?  And now that the world is a far safer place for many of us, that anxiety loses its benefit, and spirals out of control?

All of that is just speculation, of course.  But as far as what happened to my son, you'd be correct in surmising that it was not easy for me to hear.  My anxiety blew a hole through the roof, even though (1) he was fine, (2) his truck was fine, and (3) once we got him towed and the truck repaired, everything was likely to be fine.

I swear, I spent the next three days shaking.

Which, I guess, constitutes "integrating undesirable information."

In any case, the research by Garrett et al. gives us an interesting window into how induced anxiety alters our ability to modify our worldviews.  Myself, I'm just glad my son is settled in Houston and loves his new job.  It's not like this means I won't be anxious any more, but having one less thing to fret about is definitely a good thing.

*****************************

I picked this week's Skeptophilia book recommendation because of the devastating, and record-breaking, fires currently sweeping across the American west.  Tim Flannery's The Weather Makers is one of the most cogent arguments I've ever seen for the reality of climate change and what it might ultimately mean for the long-term habitability of planet Earth.  Flannery analyzes all the evidence available, building what would be an airtight case -- if it weren't for the fact that the economic implications have mobilized the corporate world to mount a disinformation campaign that, so far, seems to be working.  It's an eye-opening -- and essential -- read.

[If you purchase the book from Amazon using the image/link below, part of the proceeds goes to supporting Skeptophilia!]





Wednesday, June 7, 2017

Liar liar

In my youth, I was quite an accomplished liar.

I say "accomplished" more to mean "I did it a lot" rather than "I did it well."  I honestly don't know how well I lied -- it might be that people in general didn't believe what I said and were simply too polite to call me out on it.  On the other hand, I did get away with a lot of stuff.  So apparently I was at least marginally successful.

What I lied about tended to be exaggerations about my past -- I rarely if ever lied out of malice.  But I felt my own circumstances to be boring and bland, a sense compounded by the fact that I've always suffered from serious social anxiety, so I think I felt as if building up a fictional persona who was interesting and adventurous might assuage my fear of being judged by the people I met.  Eventually, though, I realized that all I was doing was sabotaging the relationships I had, because once people found out I wasn't who I said I was, they'd be understandably pissed that I hadn't been straight with them.  So I dedicated myself to honesty, a commitment I've tried my hardest to keep ever since then.

On the other hand, I became a fiction writer, which means now I make up elaborate lies, write them down, and people pay me to read them.  So maybe I haven't progressed as far as I'd thought.

Kang Lee and Victoria Talwar of the University of Toronto have been studying lying for some time, and they've found that children's propensity to lie increases as they age.  Presumably, once they develop a sense of shame and better impulse control, they find themselves sheepish when they transgress, and lie to cover up their feelings or escape the consequences.  In a study in the International Journal of Behavioral Development, Lee and Talwar gave children of varying ages a task while a toy played music behind them, and told them not to peek at the toy:
When the experimenter asked them whether they had peeked, about half of the 3-year-olds confessed to their transgression, whereas most older children lied.  Naive adult evaluators (undergraduate students and parents) who watched video clips of the children’s responses could not discriminate lie-tellers from nonliars on the basis of their nonverbal expressive behaviours.  However, the children were poor at semantic leakage control and adults could correctly identify most of the lie-tellers based on their verbal statements made in the same context as the lie.  The combined results regarding children’s verbal and nonverbal leakage control suggest that children under 8 years of age are not fully skilled lie-tellers.
Lee considers this behavior a completely normal part of social development, and in fact, says he worries about the 10% of older children in his study who could not be induced to lie -- because telling the truth 100% of the time, without regard for others' feelings or the consequences thereof, might not be the best thing, either.

But the tendency to lie doesn't vanish with adulthood.  A study by Robert Feldman, of the University of Massachusetts-Amherst, found that 60% of adults lied at least once during a ten-minute conversation.

"People tell a considerable number of lies in everyday conversation," Feldman said about his study.  "It was a very surprising result.  We didn't expect lying to be such a common part of daily life...  When they were watching themselves on videotape, people found themselves lying much more than they thought they had... It's so easy to lie.  We teach our children that honesty is the best policy, but we also tell them it's polite to pretend they like a birthday gift they've been given.  Kids get a very mixed message regarding the practical aspects of lying, and it has an impact on how they behave as adults."

Of course, not all lies are equally blameworthy.  Telling Aunt Bertha that the knitted sweater she made for your Christmas gift is lovely is probably better than saying, "Wow, that is one ugly-ass sweater, and I'm bringing it down to the Salvation Army as soon as I get a chance."

[image courtesy of Aunt Bertha and the Wikimedia Commons]

As for the kind of thing I did as a kid -- saying that I'd spent my summer vacation riding musk oxen in the Aleutian Islands -- it's kind of ridiculous and pointless, but other than distancing one from one's friends (as I described before) probably isn't really very high on the culpability scale, either.

But lying to hurt, lying for personal gain, lying to gain or retain power (I'm lookin' at you, Donald Trump) -- those are serious issues.

Unfortunately, however, even the less serious lies can cause problems, because small lies tend to lead to bigger ones.  A study by Tali Sharot of University College London found that our amygdala -- the structure in the brain that appears to mediate fear, shame, and anxiety -- actually fires less the more we lie.  The first lies we tell elicit a strong response, but we become habituated quickly.

The more we lie, the easier it gets.

So the old adage of "honesty is the best policy" really does seem to apply in most circumstances.

Unless, of course, you're a fiction writer.  Then the rules don't apply at all.  Now you'll have to excuse me, as I've got a herd of musk oxen to attend to.

Thursday, September 8, 2016

The political teeter-totter

During election seasons, you often find out far more than you wanted about your friends' political leanings, pet issues, biases, and blind spots.  We all have them, of course; but the natural tendency is to feel like everyone else is falling for fallacious thinking, whereas we are (in Kathryn Schulz's words) "looking out at the world through a completely crystal-clear window, seeing everything as it actually is."

The problem is, it's amazingly difficult to root out errors in thinking.  People are prone to the backfire effect -- being presented with facts supporting an opposing point of view often makes them double down and believe what they already believed even more strongly.  But it goes deeper than that.  A paper by Tali Sharot, Cass Sunstein, Sebastian Bobadilla-Suarez, and Stephanie Lazzaro, released this week on the Social Science Research Network, showed that presenting people with the facts not only often causes them to veer back into their previous thinking, it increases polarization in general.

The research team used three hundred test subjects, first giving them questionnaires designed to determine their attitudes about anthropogenic climate change.  From their answers, they divided the subjects into three groups -- strong believers, moderate believers, and weak believers.  Each group was asked to estimate the increase in global average temperature by the year 2100.  Unsurprisingly, the strong believers had the highest estimate (6.3 degrees on average), the weak believers the lowest (3.6 degrees), and the moderate believers were in the middle (5.9 degrees).

Things got interesting when the researchers presented half of each group with data that was good news for the planet (global warming isn't going to be as bad as predicted) and the other half with bad news (global warming is going to be far worse than predicted).  Afterwards, the subjects were asked to reassess their opinions and revise their temperature estimates.  The strong believers presented with bad news revised their estimates upward; those presented with good news revised their estimates downward, but only a little (0.9 degrees on average).  The weak believers were highly responsive to the good news -- lowering their estimates by a degree on average -- but didn't respond to the bad news at all!

What this shows is rather frightening.  Presented with facts, both the believers and the doubters will change -- but always in such a way as to increase the overall polarization of the group.  This sort of backfire effect will result in a society where the degree of separation between two opposing factions will inevitably increase.

Sobering stuff.  But not as much as a different study, which shows how easily our political beliefs can be changed...

... without our realizing it.

According to a study in Frontiers in Human Neuroscience, all scientists had to do was stimulate one part of the brain -- the dorsolateral prefrontal cortex -- and it caused test subjects' views to tilt to the right.

The paper, entitled "Alteration of Belief by Non-invasive Brain Stimulation," describes research by Caroline Chawke and Ryota Kanai of the School of Psychology at the University of Sussex in Brighton.  They begin with the sentence, "People generally have imperfect introspective access to the mechanisms underlying their political beliefs, yet can confidently communicate the reasoning that goes into their decision making process" -- which sums up in only a few words how little real faith we should have in the stuff our brains come up with.

Previous research had suggested that the dorsolateral prefrontal cortex was involved in political decision-making (via its role in resolving cognitive conflict).  Specifically, it was observed that DLPFC activity was higher when people were challenged on their preconceived political opinions.  So what Chawke and Kanai did was to stimulate that area of the brain while showing participants a campaign video from either the Labour (liberal) or Conservative party.  The expectation was that when the DLPFC was activated, it would push cognitive conflict resolution by moving both left- and right-leaning individuals toward more centrist beliefs.

That's not what happened.  The people shown a Labour video showed a movement toward the right -- but so did the people shown a Conservative video.  In other words, you stimulate the DLPFC, and everyone becomes more conservative.


Ready for the scariest part?  Let me give it to you in their own words:
It is also interesting to note that none of the participants in the current study reported any awareness of changes to their political beliefs... conclusively disagreeing with the possibility that political thoughts and values had been altered in any way.  Therefore, during the conscious deliberation of political statements, it appears as though implicit cognitive control processes may have biased subsequent belief formation in the absence of conscious awareness.  Although research has argued that rationalization and reappraisal must require some degree of conscious deliberation, the findings of the current study would provide reason to speculate an unconscious role of the DLPFC in changing political orientation.
The authors suggest the explanation that the DLPFC may have evolved as a structure whose activity is involved in perceptions of security, certainty, and social dominance, all characteristics that are associated with conservative ideology.  But wherever it comes from, the most bizarre part of all of this is how little we seem to be choosing our political leanings based on anything logical -- or even conscious.

So, there you are.  More reason to distrust the whole political process, as if this year you needed another one.  Myself, I think I'm being forced to the opinion that, as Alexis de Tocqueville said in Book II of Democracy in America: "In the United States, the majority undertakes to supply a multitude of ready-made opinions for the use of individuals, who are thus relieved from the necessity of forming opinions of their own."  Little did he know how accurate that statement was -- not only about Americans, but about everyone.