Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Saturday, August 9, 2025

The cost of regret

One of the most tragicomic moments in my life happened at my twentieth high school reunion.

I was painfully shy when I was young.  I brought the concept of "awkward teenager" to its absolute apex.  I made some passing attempts to fit in, but those were by and large failures.  I did have a few friends -- some of whom I am still in touch with, and whose friendship I treasure -- but to say I had no social life back then is an odds-on favorite for Understatement of the Year.

Anyhow, I was at the evening dance/party for my reunion, and did what I usually do at parties: got a drink and then stood around looking uncomfortable.  While I was standing there, I was approached by a woman on whom, when we were in high school, I had a crush of life-threatening proportions.  She came up and started chatting with me, and I relaxed a little, especially after I silently reassured myself that we weren't teenagers any more, and that I was indeed twenty years older than I had been when I graduated.

The conversation went here and there, and after a while she blushed a little and said, "I have a confession to make.  When we were in high school, I had a terrible crush on you, but I was too nervous to ask you out."

I goggled at her for a moment, and said, "Well, that's a little ironic..." and told her I'd felt the same way, and didn't ask her out for the same reason.

We had a good laugh over it, but really, it's kind of sad, isn't it?  We're so wrapped up in our neuroses and insecurities that we become our own worst enemies -- passing up opportunities that could have been rewarding purely out of fear.

Later that evening, in my hotel room, I spent an inordinate amount of time beating myself up over having been such a coward, and avoiding emotional risks whenever it was possible.  I spent a lot of time on that most fruitless of pursuits -- trying to map out what would have happened had I been braver.  I was put in mind of the poignant passage from C. S. Lewis's novel Prince Caspian.  Do you know it?
"But what would have been the good?" Lucy asked.

Aslan said nothing.

"You mean," said Lucy rather faintly, "that it would have turned out all right – somehow?  But how?  Please, Aslan!  Am I not to know?"

"To know what would have happened, child?" said Aslan.  "No. Nobody is ever told that."

"Oh dear," said Lucy.

"But anyone can find out what will happen," said Aslan.  "If you go back to the others now, and wake them up; and tell them you have seen me again; and that you must all get up at once and follow me – what will happen?  There is only one way of finding out."
When I first read this -- at around age fifteen -- I remember being thunderstruck, because on some level I'd already recognized that one of the most consistent themes of my life was regret at not having made different decisions.  People I dearly wish I had not hurt.  Opportunities I passed up because of my shyness and risk-aversion.  And sadly, I didn't learn from these experiences, but allowed them to drive me further into avoidance.  I seemed to spend most of the following years planting my feet, mule-like, in a desperate attempt not to misstep, not recognizing that refusing to choose was itself a choice.

So of course it kept happening.  My (all things considered) terrible choice not to fight against my parents' decision that I should live at home while going to college.  My (at the time) barely-acknowledged choice to keep my bisexuality hidden for decades.

It's not, mind you, that I'm unhappy with my life as it is. I have a wonderful wife, two sons I'm proud of, and spent 32 years in a rewarding career that I discovered quite by accident, as a consequence of other seemingly unrelated decisions I made.  I have twenty-four books in print, something I have dreamed about since elementary school.  I live in a wonderful part of the world, and have had the good fortune to travel and see dozens of other amazing places.

And I'm well aware of the fact that things could have turned out far worse. Whatever else you can say about the decision, my choice to live at home during college, with conservative, strait-laced parents who kept close tabs on me, kept me out of all sorts of trouble I might otherwise have gotten into.  If I'd come out as bisexual in college, it would have been in around 1980 -- and this was right at the beginning of the AIDS epidemic, when the disease was still poorly understood, and a diagnosis was tantamount to a death sentence.

There's any number of ways the course of my life could have been deflected into an alternate path, and led me to somewhere very different, for better or worse.  Big decisions -- where to go to college, who to marry, what career to pursue.  Tiny actions with big effects, such as Donna Noble's choice of which direction to turn at an intersection in the mind-blowing Doctor Who episode "Turn Left" -- and of which in my own case I'm almost certainly unaware because looking back, they seem entirely insignificant.


As I said, I like my life just fine.  Even so, I've never been able to shuck the regret, and more than that, the fact that, like Lucy Pevensie in Prince Caspian, I'll never know what would have happened had I done otherwise.

The topic comes up because of a fascinating paper in the journal Psychological Science called "The Lure of Counterfactual Curiosity: People Incur a Cost to Experience Regret," by Lily FitzGibbon and Kou Murayama (of the University of Reading), and Asuka Komiya (of Hiroshima University).  They did a risk/choice/reward assessment task with 150 adults, and after the task was completed, the volunteers were allowed to pay for information about how they would have fared had they chosen differently.

It turns out, people are willing to pay a lot, even when they find out that they chose poorly (i.e. they would have had a greater reward had they made a different choice), and even though knowledge of their poor decision causes regret, self-doubt, and worse performance on subsequent tasks.  The authors write:
After one makes a decision, it is common to reflect not only on the outcome that was achieved but also on what might have been.  For example, one might consider whether going to a party would have been more fun than staying home to work on a manuscript.  These counterfactual comparisons can have negative emotional consequences; they can lead to the experience of regret. In the current study, we examined a commonly observed yet understudied aspect of counterfactual comparisons: the motivational lure of counterfactual information—counterfactual curiosity.  Specifically, we found that people are so strongly seduced to know counterfactual information that they are willing to incur costs for information about how much they could have won, even if the information is likely to trigger negative emotions (regret) and is noninstrumental to obtaining rewards.
Why would people seek out information when they know ahead of time it is likely to make them feel bad?  The authors write:
One explanation for seeking negative information is that people may also find it interesting to test their emotional responses—a mechanism that might also underlie so-called morbid curiosity.  Counterfactual information of the kind sought in the current experiments may be desirable because it has high personal relevance—it relates to decisions that one has made in the recent past.  People’s desire for information about their own performance is known to be strong enough to overcome cognitive biases such as inequality aversion.  Thus, opportunities to learn about oneself and the actual and counterfactual consequences of one’s decisions may have powerful motivational status.
Chances are, if I was able to do what Donna did in "Turn Left" and see the outcome had I chosen differently, I'd find the results for my life's path would be better in some aspects and worse in others.  Like everything, it's a mixed bag.  Given the opportunity to go back in time and actually change something -- well, tempting as it would be, I would be mighty hesitant to take that step and risk everything I currently have and have accomplished.

But still -- I'd like to know.  Even if in some cases, I'd have done far better making a different choice, and then would add the certainty of having made a bad decision on top of the more diffuse regret I already have.  The temptation to find out would be almost irresistible.

Maybe it's better, honestly, that we don't see the long-term consequences of our actions.  Fortunate, to put it in Aslan's words, that "Nobody is ever told that."  It's hard enough living with knowing you fell short or behaved badly; how much worse would it be if we saw that things could have been far better if we'd only chosen differently?

****************************************


Saturday, June 11, 2022

Locked into error

Back in 2011, author Kathryn Schulz did a phenomenal TED Talk called "On Being Wrong."  She looks at how easy it is to slip into error, and how hard it is not only to correct it, but (often) even to recognize that it's happened.  At the end, she urges us to try to find our way out of the "tiny, terrified space of rightness" that virtually all of us live in.

Unfortunately, that's one thing that she herself gets wrong.  Because for a lot of people, their belief in their rightness about everything isn't terrified; it's proudly, often belligerently, defiant.

I'm thinking of one person in particular, here, who regularly posts stuff on social media that is objectively wrong -- I mean, hard evidence, no question about it -- and does so in a combative way that comes across as, "I dare you to contradict me."  I've thus far refrained from saying anything.  One of my faults is that I'm a conflict avoider, but I also try to be cognizant of the cost/benefit ratio.  Maybe I'm misjudging, but I think the likelihood of my eliciting a "Holy smoke, I was wrong" -- about anything -- is as close to zero as you could get.

Now, allow me to say up front that I'm not trying to imply here that I'm right about everything, nor that I don't come across as cocky or snarky at times.  Kathryn Schulz's contention (and I think she's spot-on about this one) is that we all fall into the much-too-comfortable trap of believing that our view of the world perfectly reflects reality.  One of the most startling bullseyes Schulz makes in her talk is about how it feels to be wrong:

So why do we get stuck in this feeling of being right?  One reason, actually, has to do with the feeling of being wrong.  So let me ask you guys something...  How does it feel -- emotionally -- how does it feel to be wrong?  Dreadful.  Thumbs down.  Embarrassing...  Thank you, these are great answers, but they're answers to a different question.  You guys are answering the question: How does it feel to realize you're wrong?  Realizing you're wrong can feel like all of that and a lot of other things, right?  I mean, it can be devastating, it can be revelatory, it can actually be quite funny...  But just being wrong doesn't feel like anything.

I'll give you an analogy.  Do you remember that Looney Tunes cartoon where there's this pathetic coyote who's always chasing and never catching a roadrunner?  In pretty much every episode of this cartoon, there's a moment where the coyote is chasing the roadrunner and the roadrunner runs off a cliff, which is fine -- he's a bird, he can fly.  But the thing is, the coyote runs off the cliff right after him.  And what's funny -- at least if you're six years old -- is that the coyote's totally fine too.  He just keeps running -- right up until the moment that he looks down and realizes that he's in mid-air.  That's when he falls.  When we're wrong about something -- not when we realize it, but before that -- we're like that coyote after he's gone off the cliff and before he looks down.  You know, we're already wrong, we're already in trouble, but we feel like we're on solid ground.  So I should actually correct something I said a moment ago.  It does feel like something to be wrong; it feels like being right.
What brought this talk to mind -- and you should take fifteen minutes and watch the whole thing, because it's just that good -- is some research out of the University of California, Los Angeles, published a couple of weeks ago in Psychological Review, that looked at the neuroscience of these quick -- and once made, almost impossible to undo -- judgments about the world.


The study used a technique called electrocorticography to see what was going on in a part of the brain called the gestalt cortex, which is known to be involved in sensory interpretation.  In particular, the team analyzed the activity of the gestalt cortex when subjects were presented with the views of other people -- some of which they agreed with, some of which they disagreed with, and others about which they had yet to form an opinion.

The most interesting result had to do with the strength of the response.  The reaction of the gestalt cortex is most pronounced when we're confronted with views opposing our own, and with statements about which we've not yet decided.  In the former case, the response is to suppress the evaluative parts of the brain -- i.e., to dismiss immediately what we've read because it disagrees with what we already thought.  In the latter case, it amplifies evaluation, allowing us to make a quick judgment about what's going on, but once that's happened any subsequent evidence to the contrary elicits an immediate dismissal.  Once we've made our minds up -- and it happens fast -- we're pretty much locked in.

"We tend to have irrational confidence in our own experiences of the world, and to see others as misinformed, lazy, unreasonable or biased when they fail to see the world the way we do," said study lead author Matthew Lieberman, in an interview with Science Daily.  "We believe we have merely witnessed things as they are, which makes it more difficult to appreciate, or even consider, other perspectives.  The mind accentuates its best answer and discards the rival solutions.  The mind may initially process the world like a democracy where every alternative interpretation gets a vote, but it quickly ends up like an authoritarian regime where one interpretation rules with an iron fist and dissent is crushed.  In selecting one interpretation, the gestalt cortex literally inhibits others."

Evolutionarily, you can see how this makes perfect sense.  For a proto-hominid out on the African savanna, it was pretty critical to look at and listen to what's around you and make a quick judgment about its safety.  Stopping to ponder could be a good way to become a lion's breakfast.  The cost of making a wrong snap judgment and overestimating the danger is far lower than blithely going on your way and assuming everything is fine.  But now?  This hardwired tendency to squelch opposing ideas without consideration means we're unlikely to correct -- or even recognize -- that we've made a mistake.

I'm not sure what's to be done about this.  If anything can be done.  Perhaps it's enough to remind people -- including myself -- that our worldviews aren't flawless mirrors of reality, they're the result of our quick evaluation of what we see and hear.  And, most importantly, that we never lose by reconsidering our opinions and beliefs, weighing them against the evidence, and always keeping in mind the possibility that we might be wrong.  I'll end with another quote from Kathryn Schulz:
This attachment to our own rightness keeps us from preventing mistakes when we absolutely need to, and causes us to treat each other terribly.  But to me, what's most baffling and most tragic about this is that it misses the whole point of being human.  It's like we want to imagine that our minds are these perfectly translucent windows, and we just gaze out of them and describe the world as it unfolds.  And we want everybody else to gaze out of the same window and see the exact same thing.  That is not true, and if it were, life would be incredibly boring.  The miracle of your mind isn't that you can see the world as it is, it's that you can see the world as it isn't.  We can remember the past, and we can think about the future, and we can imagine what it's like to be some other person in some other place.  And we all do this a little differently...  And yeah, it is also why we get things wrong.

Twelve hundred years before Descartes said his famous thing about "I think therefore I am," this guy, St. Augustine, sat down and wrote "Fallor ergo sum" -- "I err, therefore I am."  Augustine understood that our capacity to screw up, it's not some kind of embarrassing defect in the human system, something we can eradicate or overcome.  It's totally fundamental to who we are.  Because, unlike God, we don't really know what's going on out there.  And unlike all of the other animals, we are obsessed with trying to figure it out.  To me, this obsession is the source and root of all of our productivity and creativity.

**************************************

Saturday, March 26, 2022

Siding with the tribe

Springboarding off yesterday's post, about our unfortunate tendency to believe false claims if we hear them repeated often enough, today we have another discouraging bit of psychological research: our behavior is strongly influenced by group membership -- even if we know from the start that the group we're in is arbitrary, randomly chosen, and entirely meaningless.

Psychologists Marcel Montrey and Thomas Shultz of McGill University set up a fascinating experiment in which volunteers were assigned at random to one of two groups, then instructed to play a simple computer game called "Where's the Rabbit?" in which a simulated rabbit is choosing between two different nest sites.  The participant gets five points if (s)he correctly guesses where the rabbit is going.  In each subsequent round, the rabbit has a 90% chance of picking the same nest again, and a 10% chance of switching to the other.

The twist comes when in mid-game, the participants are offered the option of seeing the guesses of three members from either group (or a mix of the two).  They can also pay two points to use a "rabbit-finding machine" which is set up to be unreliable -- it has a two-thirds chance of getting it right, and a one-third chance of getting it wrong (and the participants know this).  Given that this is (1) expensive, points-wise, and (2) already a lower likelihood of success than simply working on your own and basing your guess on what the rabbit did in the previous round, you'd think no one would choose this option, right?
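To see why paying for the machine is such a bad deal, it helps to run the numbers.  The sketch below is a minimal simulation assuming the setup as described above (a 90% chance the rabbit stays put, a five-point reward for a correct guess, and a machine that's right two-thirds of the time at a cost of two points); the function and variable names are my own, not from the study.

```python
import random

random.seed(42)

def simulate(rounds=10_000):
    """Compare two strategies for the 'Where's the Rabbit?' game:
    (a) guess the nest the rabbit used last round (free), versus
    (b) pay 2 points for a machine that's right 2/3 of the time."""
    nest = 0                          # rabbit's current nest (0 or 1)
    stay_score = machine_score = 0
    for _ in range(rounds):
        prev = nest
        # The rabbit stays with 90% probability, switches with 10%.
        if random.random() < 0.10:
            nest = 1 - nest
        # Strategy (a): guess the previous nest; 5 points if correct.
        if prev == nest:
            stay_score += 5
        # Strategy (b): the machine guesses correctly 2/3 of the time,
        # and costs 2 points every round regardless of outcome.
        machine_guess = nest if random.random() < 2 / 3 else 1 - nest
        machine_score += (5 if machine_guess == nest else 0) - 2
    return stay_score / rounds, machine_score / rounds

stay, machine = simulate()
print(f"guess-previous-nest: {stay:.2f} points/round")    # ~4.5
print(f"pay-for-machine:     {machine:.2f} points/round") # ~1.3
```

The expected values work out to 0.9 × 5 = 4.5 points per round for simply copying the rabbit's last choice, versus (2/3) × 5 − 2 ≈ 1.3 for the machine -- which is what makes the participants' willingness to pay for it so striking.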

Wrong.  It turns out that when you looked at how people chose, they were way more likely to do the same thing as the people who belonged to their own group.  Next in likelihood is the wonky, inaccurate rabbit-finding machine.  Dead last was copying what was done by members of the other group.

[Image licensed under the Creative Commons Sara 506, Group people icon, CC BY-SA 3.0]

Remember what I started with -- these groups were entirely arbitrary.  Group affiliation was assigned at the beginning of the experiment by the researchers, and had nothing to do with the participants' intelligence, or even with their previous success at the game.  But the volunteers were still more likely to side with the members of their own tribe.  In fact, when choosing whose decisions to observe, the test subjects decided by a two-to-one margin to consult in-group members and not even consider the decisions made by the out-group.

How much more powerful would this effect be if the group membership wasn't arbitrary, but involved an identity that we're deeply invested in?

"Researchers have known for some time that people prefer to copy members of their own social group (e.g., political affiliation, race, religion, etc.), but have often assumed that this is because group members are more familiar with or similar to each other," said study co-author Marcel Montrey, in an interview in PsyPost.  "However, our research suggests that people are more likely to copy members of their own group even when they have nothing in common.  Simply belonging to the same random group seems to be enough.  Surprisingly, we found that even people who rated their own group as less competent still preferred to copy its members."

It's easy to see how this tendency can be exploited by advertisers and politicians.  "Human social learning is a complex and multifaceted phenomenon, where many factors other than group membership play a role," Montrey said.  "For example, we know that people also prefer to copy successful, popular, or prestigious individuals, which is why companies advertise through endorsements.  How do people’s various learning biases interact, and which ones are most important?  Because these questions have only recently begun to be explored, the real-world relevance of our findings is still up in the air."

This also undoubtedly plays a role in the echo-chamber effect, about which I've written here more than once -- and which is routinely amplified by social media platforms.  "By offering such fine-grained control over whom users observe," Montrey said, "these platforms may spur the creation of homogeneous social networks, in which individuals are more inclined to copy others because they belong to the same social group."

We like to think of ourselves as modern and knowledgeable and savvy, but the truth is that we still retain a core of tribalism that it's awfully hard to overcome.  Consider how often you hear people say things like, "I'll only vote for a person if they belong to the _____ Party."  I've sometimes asked, in some bewilderment, "Even if the person in question is known to be dishonest and corrupt, and their opponent isn't?"  Appallingly, the response is often, "Yes.  I just don't trust people of the other party."

And of course, a great many of the politicians themselves encourage this kind of thinking.  If you can get a voter to eliminate out of hand half of the candidates for no other reason than party affiliation, it raises the likelihood you'll be the one who gets elected.  So the benefits are obvious.

Unfortunately, once you look at the Montrey and Shultz study, the downsides of this sort of thinking should also be frighteningly obvious.

**************************************

Tuesday, January 26, 2021

The cost of regret

"But what would have been the good?"

Aslan said nothing.

"You mean," said Lucy rather faintly, "that it would have turned out all right – somehow?  But how?  Please, Aslan!  Am I not to know?"

"To know what would have happened, child?" said Aslan.  "No.  Nobody is ever told that."

"Oh dear," said Lucy.

"But anyone can find out what will happen," said Aslan.  "If you go back to the others now, and wake them up; and tell them you have seen me again; and that you must all get up at once and follow me – what will happen?  There is only one way of finding out."
This passage, from C. S. Lewis's novel Prince Caspian, has always struck me with particular poignancy, because one of the most consistent themes of my life has been regret at not having made different decisions.  People I dearly wish I had not hurt.  Opportunities I passed up because of my shyness and risk-aversion.  More specific ones, like my (all things considered) terrible decision to live at home while going to college.  My (at the time) barely-acknowledged choice to keep my bisexuality hidden for decades.

It's not, mind you, that I'm unhappy with my life as it is.  I have a wonderful wife, two sons I'm proud of, and spent 32 years in a rewarding career that I discovered quite by accident,  as a consequence of other seemingly unrelated decisions I made.  I have seventeen books in print, something I have dreamed about since elementary school.  I live in a wonderful part of the world, and have had the good fortune to travel and see dozens of other wonderful places.

And I'm aware of the fact that things could have turned out far worse.  Whatever else you can say about the decision, my choice to live at home during college, with conservative, strait-laced parents who kept close tabs on me, kept me out of all sorts of trouble I might otherwise have gotten into.  If I'd come out as bisexual in college, it would have been in around 1980 -- and this was right at the beginning of the AIDS epidemic, when the disease was still poorly understood, and a diagnosis was tantamount to a death sentence.

There's any number of ways the course of my life could have been deflected into an alternate path, and led me to somewhere very different.  Big decisions -- where to go to college, who to marry, what career to pursue.  Tiny actions with big effects, such as Donna Noble's choice of which direction to turn at an intersection in the mind-blowing Doctor Who episode "Turn Left" -- and of which in my own case I'm almost certainly unaware because looking back, they seem entirely insignificant.  


As I said, I like my life just fine.  Even so, I've never been able to shuck the regret, and more than that, the fact that, like Lucy Pevensie in Prince Caspian, I'll never know what would have happened had I done otherwise.

The topic comes up because of a fascinating paper in the journal Psychological Science called "The Lure of Counterfactual Curiosity: People Incur a Cost to Experience Regret," by Lily FitzGibbon and Kou Murayama (of the University of Reading), and Asuka Komiya (of Hiroshima University).  They did a risk/choice/reward assessment task with 150 adults, and after the task was completed, the volunteers were allowed to pay for information about how they would have fared had they chosen differently.

It turns out, people are willing to pay a lot, even when they find out that they chose poorly (i.e. they would have had a greater reward had they made a different choice), and even though knowledge of their poor decision causes regret, self-doubt, and worse performance on subsequent tasks.  The authors write:
After one makes a decision, it is common to reflect not only on the outcome that was achieved but also on what might have been.  For example, one might consider whether going to a party would have been more fun than staying home to work on a manuscript.  These counterfactual comparisons can have negative emotional consequences; they can lead to the experience of regret.  In the current study, we examined a commonly observed yet understudied aspect of counterfactual comparisons: the motivational lure of counterfactual information—counterfactual curiosity.  Specifically, we found that people are so strongly seduced to know counterfactual information that they are willing to incur costs for information about how much they could have won, even if the information is likely to trigger negative emotions (regret) and is noninstrumental to obtaining rewards.
Why would people seek out information when they know ahead of time it is likely to make them feel bad?  The authors write:
One explanation for seeking negative information is that people may also find it interesting to test their emotional responses—a mechanism that might also underlie so-called morbid curiosity.  Counterfactual information of the kind sought in the current experiments may be desirable because it has high personal relevance—it relates to decisions that one has made in the recent past.  People’s desire for information about their own performance is known to be strong enough to overcome cognitive biases such as inequality aversion.  Thus, opportunities to learn about oneself and the actual and counterfactual consequences of one’s decisions may have powerful motivational status.
Chances are, if I was able to do what Donna did in "Turn Left" and see the outcome had I chosen differently, I'd find the results for my life's path would be better in some aspects and worse in others.  Like everything, it's a mixed bag.  Given the opportunity to go back in time and actually change something -- well, tempting as it would be, I would be mighty hesitant to take that step and risk everything I currently have and have accomplished.

But still -- I'd like to know.  Even if in some cases, I'd have done far better making a different choice, and then would add the certainty of having made a bad decision on top of the more diffuse regret I already have.  The temptation to find out would be almost irresistible.

Maybe it's better, honestly, that we don't see the long-term consequences of our actions.  Fortunate, to put it in Aslan's words, that "Nobody is ever told that."  It's hard enough living with knowing you fell short or behaved badly; how much worse would it be if we saw that things could have been far better if we'd only chosen differently?

****************************************

Just last week, I wrote about the internal voice most of us live with, babbling at us constantly -- sometimes with novel or creative ideas, but most of the time (at least in my experience) with inane nonsense.  The fact that this internal voice is nearly ubiquitous, and what purpose it may serve, is the subject of psychologist Ethan Kross's wonderful book Chatter: The Voice in Our Head, Why It Matters, and How to Harness It, released this month and already winning accolades from all over.

Chatter not only analyzes the inner voice in general terms, but looks at specific case studies where the internal chatter brought spectacular insight -- or short-circuited the individual's ability to function entirely.  It's a brilliant analysis of something we all experience, and gives some guidance not only into how to quiet it when it gets out of hand, but to harness it for boosting our creativity and mental agility.

If you're a student of your own inner mental workings, Chatter is a must-read!




Thursday, August 29, 2019

Social media and bad decisions

In his famous dialogue Phaedrus, Plato puts the following words in Socrates's mouth:
If men learn [writing], it will implant forgetfulness in their souls.  They will cease to exercise memory because they rely on that which is written, calling things to remembrance no longer from within themselves, but by means of external marks. 
What you have discovered is a recipe not for memory, but for reminder.  And it is no true wisdom that you offer your disciples, but only the semblance of wisdom, for by telling them of many things without teaching them you will make them seem to know much while for the most part they know nothing.  And as men filled not with wisdom but with the conceit of wisdom they will be a burden to their fellows...
You know, Phaedrus, that is the strange thing about writing, which makes it truly correspond to painting.  The painter’s products stand before us as though they were alive.  But if you question them, they maintain a most majestic silence.  It is the same with written words.  They seem to talk to you as though they were intelligent, but if you ask them anything about what they say from a desire to be instructed they go on telling just the same thing forever.
I'm reminded of this every time I hear the "kids these days" schtick from People Of A Certain Age, about how young adults are constantly hunched over their phones and rely on Google and don't know anything because they can look it up on Wikipedia.   Back In Our Day, we had to go to the library if we wanted to look something up.  On foot, uphill, and in the snow.  And once we got there, find what we were looking for in a card catalog.

That was printed in freakin' cuneiform on clay tablets.

And we appreciated it, dammit.

You hear this kind of thing aimed most often at social media -- that the use of Snapchat, Instagram, Facebook, and so on, not to mention text messaging, takes people away from face-to-face social interactions they would otherwise have had, and that the current ubiquity of this technology is correlated with depression, poor relationship outcomes, and even teen suicide.  The evidence, however, is far from rock solid; the correlations are tenuous at best, and even where they hold, it's a long way from proven that social media use caused the negative trends.

My (admittedly purely anecdotal) observations of teenagers lead me to the conclusion that the number of truly internet-addicted kids is small, and that social, well-adjusted kids are social and well-adjusted with or without their cellphones.  And I can say from my own socially-isolated childhood that having a cellphone would probably not have affected it one way or the other -- even if I'd magically had Facebook when I was sixteen, I probably would still have been the shy, lonely kid who spent most of his free time in his room.

[Image is in the Public Domain]

That's not to say there aren't some interesting, if troubling, correlations.  A study published recently in The Journal of Behavioral Addictions looked at the connection between social media use and performance on the "Iowa Gambling Task," a simulation that is used to pinpoint impaired decision-making in situations like heroin addiction.  The authors write:
Our results demonstrate that more severe, excessive SNS [social networking site] use is associated with more deficient value-based decision making.  In particular, our results indicate that excessive SNS users may make more risky decisions during the IGT task...  This result further supports a parallel between individuals with problematic, excessive SNS use, and individuals with substance use and behavioral addictive disorders.
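For readers unfamiliar with the task: the IGT has participants draw cards from four decks, two of which pay well per draw but lose money over the long run, and two of which pay modestly but come out ahead.  Healthy decision-makers learn to avoid the risky decks.  Here's a minimal sketch of that payoff logic in Python (the payoff numbers are a simplified version of the classic schedule, not the exact values used in this study):

```python
import random

random.seed(3)

def draw(deck):
    """One card from a deck: immediate gain minus an occasional loss."""
    if deck in "AB":                       # "bad" decks: big payouts...
        gain = 100
        loss = 1250 if random.random() < 0.1 else 0   # ...rare huge losses
    else:                                  # "good" decks: modest payouts...
        gain = 50
        loss = 250 if random.random() < 0.1 else 0    # ...rare small losses
    return gain - loss

def expected_value(deck, trials=100_000):
    """Estimate a deck's average net payoff per draw."""
    return sum(draw(deck) for _ in range(trials)) / trials

print("risky deck, net per draw:", round(expected_value("A")))  # around -25
print("safe deck, net per draw: ", round(expected_value("C")))  # around +25
```

Participants who keep drawing from the risky decks despite their negative long-run payoff are the ones flagged as showing impaired value-based decision-making.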
The trouble with the study -- which, to be fair, the researchers are up front about -- is that it's a small sample size (71 individuals) and relied on self-reporting for measurement of the daily duration of social media use for each participant.  Self-reporting is notoriously inaccurate -- there have been dozens of studies showing that (for example) self-reporting of diet consistently results in underestimates of the number of calories consumed, and participants have even reported calorie intakes that are "insufficient to support life" without any apparent awareness that they were giving the researchers wildly incorrect information.

So self-reporting of the number of hours spent on social media?  Especially given the negative press social media has gotten recently?  I'm a little suspicious.  The researchers say that their experiment should be repeated with a larger sample size and up-front monitoring of social media use -- which, honestly, should have been done in the first place, prior to publishing the study.

But even so, it's a curious result, and if it bears out, it'll be interesting to parse why Facebook use should be correlated with poor decision-making.  These sorts of correlations often lead to deeper understanding of our own behavior, and that's all to the good.

But now that I'm done writing this, y'all'll have to excuse me so I can post links to today's Skeptophilia on Facebook and Twitter.  You know how it goes.

********************************

This week's Skeptophilia book recommendation is about a subject near and dear to my heart: the possibility of intelligent extraterrestrial life.  In The Three-Body Problem, Chinese science fiction writer Cixin Liu takes an interesting angle on this question: if intelligent life were discovered in the universe -- maybe if it even gave us a visit -- how would humans react?

Liu examines the impact of finding we're not alone in the cosmos from political, social, and religious perspectives, and doesn't engage in any pollyanna-ish assumptions that we'll all be hunky-dory and ascend to the next plane of existence.  What he does think might happen, though, makes for fascinating reading, and leaves you pondering our place in the universe for days after you turn over the last page.

[Note: if you purchase this book from the image/link below, part of the proceeds goes to support Skeptophilia!]





Monday, July 16, 2018

Mice, rats, and sunk costs

One of the most difficult-to-fight biases in human nature is the sunk-cost fallacy.

The idea is that the more time, effort, and/or money we've put into a decision, the less likely we are to abandon it -- even after it has been proven a bad choice.  It's what makes people stick with cars that are lemons, investments that are financial disasters, marriages that are horrible, and politicians who have proven themselves to be unethical and self-serving, long after cutting and running would, all things considered, be the most logical course of action.

The tendency is so ubiquitous that it's often taken for granted.  You even see it in far less logical scenarios than the ones I mentioned above, where there could be at least some rational reason for sticking with the original choice.  A good example is games of pure chance, where gamblers will keep on wasting money because they are certain that a losing streak is bound to end.  "I'm already a thousand dollars in the hole," they'll say.  "I can risk five hundred more."  Here, sunk-cost reasoning makes no sense whatsoever; the lost thousand is not an investment that could pay off in any sense of the word, and losing streaks in games of pure chance are not bound to do anything.

That's why they're called "games of pure chance."
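You can verify the independence of successive plays with a quick simulation (a sketch in Python; the fifty-percent win probability and the streak length are arbitrary choices for illustration): the win rate immediately after a losing streak is no different from the win rate overall.

```python
import random

random.seed(1)

STREAK = 5          # length of the losing streak to condition on
N = 1_000_000       # number of plays

flips = [random.random() < 0.5 for _ in range(N)]  # True = win

# Win rate over all plays
overall = sum(flips) / N

# Win rate on plays that immediately follow five straight losses
after_streak = [flips[i] for i in range(STREAK, N)
                if not any(flips[i - STREAK:i])]
conditional = sum(after_streak) / len(after_streak)

print(f"overall win rate:            {overall:.3f}")
print(f"win rate after losing streak: {conditional:.3f}")
```

Both numbers hover around 0.5.  No matter how long a streak you condition on, the next play's odds don't budge; that's what independence means.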

So the ubiquity of the sunk-cost fallacy is undeniable, but what's less obvious is why we do it.  Sticking with a bad choice is rarely advantageous.  But despite its dubious benefits to survival, it seems certain to be a very old behavior, evolutionarily speaking.  Because researchers at the University of Minnesota have just shown that sunk-cost decision making occurs not only in humans, but in...

... mice and rats.

[Image licensed under the Creative Commons Rasbak, Apodemus sylvaticus bosmuis, CC BY-SA 3.0]

In a paper entitled "Sensitivity to 'Sunk Costs' in Mice, Rats, and Humans" that appeared last week in the journal Science, neuropsychologists Brian M. Sweis, Samantha V. Abram, Brandy J. Schmidt, Kelsey D. Seeland, Angus W. MacDonald III, Mark J. Thomas, and A. David Redish showed that even our very distant relatives engage in sunk-cost errors.  The authors write:
Sunk costs are irrecoverable investments that should not influence decisions, because decisions should be made on the basis of expected future consequences.  Both human and nonhuman animals can show sensitivity to sunk costs, but reports from across species are inconsistent.  In a temporal context, a sensitivity to sunk costs arises when an individual resists ending an activity, even if it seems unproductive, because of the time already invested.  In two parallel foraging tasks that we designed, we found that mice, rats, and humans show similar sensitivities to sunk costs in their decision-making.  Unexpectedly, sensitivity to time invested accrued only after an initial decision had been made.  These findings suggest that sensitivity to temporal sunk costs lies in a vulnerability distinct from deliberation processes and that this distinction is present across species.
In both the human and the rodent experiments, the setup was the same: the subject navigates a maze looking for rewards scattered randomly throughout it (a food pellet for the rodents, and, hilariously, a video of kittens playing for the humans, showing that cat videos really are an incentive for us).  Each time a reward is encountered, the subject is told how long the reward will take to be delivered (a tone the mice and rats are trained to associate with wait time, and a countdown timer for the humans).  Because rewards are plentiful and some of the waits are long, the logical strategy is to abandon any reward with a long wait time, so more time can be spent searching for rewards with short ones.

But that's not what happened.  Both the rodents and the humans would often stick with rewards with very long wait times -- and the ones who said, "Screw it, this is too long to sit here twiddling my thumbs" all gave up early on.  The longer the test subject stuck with the wait, the more likely they were to hang on to the very end, even at the cost of a considerable amount of time that could have been spent foraging more productively.
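The arithmetic behind "abandoning long waits is the better strategy" is easy to check with a toy model (a Python sketch; the travel time, wait distribution, session length, and threshold are all made-up numbers, not the study's actual parameters):

```python
import random

random.seed(2)

TRAVEL = 5.0          # hypothetical seconds to find the next offer
MAX_WAIT = 30.0       # offered wait times are uniform on [0, 30] seconds
SESSION = 100_000.0   # total foraging time in seconds

def rewards_collected(threshold):
    """Forage for SESSION seconds, abandoning offers above `threshold`."""
    t, rewards = 0.0, 0
    while t < SESSION:
        t += TRAVEL                        # travel to the next offer
        wait = random.uniform(0, MAX_WAIT)
        if wait <= threshold:              # acceptable: wait it out
            t += wait
            rewards += 1
        # otherwise walk away immediately and keep searching
    return rewards

patient = rewards_collected(MAX_WAIT)   # never abandons anything
choosy = rewards_collected(17.0)        # skips waits longer than 17 s

print(f"always waits:     {patient} rewards")
print(f"skips long waits: {choosy} rewards")
```

The choosy forager comes out well ahead, because every abandoned long wait is converted into searches for cheaper rewards.  The test subjects in the study consistently failed to exploit exactly this tradeoff once they'd committed to a wait.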

"Obviously, the best thing is as quick as possible to get into the wait zone," said David Redish, who co-authored the study.  "But nobody does that.  Somehow, all three species know that if you get into the wait zone, you’re going to pay this sunk cost, and they actually spend extra time deliberating in the offer zone so that they don’t end up getting stuck."

What this research doesn't indicate, though, is why we all do this.  Behaviors that are common throughout groups of related species -- what are called evolutionarily-conserved behaviors -- are thought to have some kind of significant survival advantage.  (Just as evolutionarily-conserved genes are thought to be essential, even if we don't know for certain what they do.)  "Evolution by natural selection would not promote any behavior unless it had some — perhaps obscure — net overall benefit," said Alex Kacelnik, a professor of behavioral ecology at Oxford, who was not part of the study, but praised its design and rigor.  "If everybody does it, the reasoning goes, there must be a reason."

But what that reason is remains unclear.  We have to leave it at "we're not as logical as we like to think, and our motivations for decision-making aren't as grounded in fact as you might expect," however unsatisfying that might be.

But it is something to consider next time we're weighing the benefits of sticking with a decision we already made -- whether it's the wait time for downloading a kitten video, or continuing our support for a politician.

***********************************

This week's Skeptophilia book recommendation is a must-read for anyone concerned about the current state of the world's environment.  The Sixth Extinction, by Elizabeth Kolbert, is a retrospective of the five great extinction events the Earth has experienced -- the largest of which, the Permian-Triassic extinction of 252 million years ago, wiped out 95% of the species on Earth.  Kolbert makes a persuasive, if devastating, argument: that we are currently in the middle of a sixth mass extinction -- this one caused exclusively by the activities of humans.  It's a fascinating, alarming, and absolutely essential read.





Thursday, October 8, 2015

Fear talk and bad decisions

I've always been curious about why politicians spend so much time trying to make their constituencies afraid.

A scared person, you would think, is likely to behave unpredictably.  Faced with a raging tiger, some of us would run, some fight back, some piss their pants and faint.  (I suspect I'd be in the last-mentioned group.)  But the point is, you'd think that as a political strategy, making people fearful would backfire as often as not.

But it seems to be all you hear these days.  "Obama is coming for your guns, to leave you defenseless."  "The illegal immigrants are stealing our jobs."  "The economy is going to crash."  "Public schools are failing."  "The terrorists are winning."

A study released last week in the Proceedings of the National Academy of Sciences may give us a perspective on why that is.  In a paper called "Power Decreases Trust in Social Exchanges," by Oliver Schilke, Martin Reimann, and Karen S. Cook (the first two from the University of Arizona, the last from Stanford University), we find out that being low in the power structure makes people more willing to trust authority:
How does lacking vs. possessing power in a social exchange affect people’s trust in their exchange partner?  An answer to this question has broad implications for a number of exchange settings in which dependence plays an important role...  Over a variety of different experimental paradigms and measures, we find that more powerful actors place less trust in others than less powerful actors do.  Our results contradict predictions by rational actor models, which assume that low-power individuals are able to anticipate that a more powerful exchange partner will place little value on the relationship with them, thus tends to behave opportunistically, and consequently cannot be trusted.  Conversely, our results support predictions by motivated cognition theory, which posits that low-power individuals want their exchange partner to be trustworthy and then act according to that desire.  Mediation analyses show that, consistent with the motivated cognition account, having low power increases individuals’ hope and, in turn, their perceptions of their exchange partners’ benevolence, which ultimately leads them to trust.
Scary result, isn't it?  Politicians have a vested interest in making us fearful not only to push a particular political agenda; fear also makes us more likely to trust blindly whoever is saying, "... and I have a solution."

And look what it does to our ability to process facts.  We are told that social programs (read: welfare cheats) are bankrupting the United States, and the way to balance the budget is to end what opponents like to call "entitlements," when the actual situation looks like this:


I'd like someone to explain to me how we can balance the budget by eliminating social services, when social services account for only around 13% of overall expenditures.  In fact, you could argue that our disproportionate military spending -- 718 billion dollars, 20% of the federal budget, accounting for 41% of the military spending worldwide, and four times higher than the country in second place (China) -- is also motivated by fear and a perception of being in a precarious position in the power structure.

It's amazing how blind you become to reality when you're motivated by fear and anxiety.  Remember the idiotic thing that was going around last year, about how we should balance the budget by eliminating salaries for the president, vice president, and members of congress?   Apparently scared people also really don't understand math, because I fail to see how stopping the paychecks of 537 people is going to offset a $426 billion budget deficit.
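The arithmetic takes about five lines to check.  Using ballpark public salary figures (roughly $400,000 for the president, $230,000 for the vice president, and $174,000 for a rank-and-file member of Congress; treat these as illustrative approximations, not exact payroll numbers):

```python
# Ballpark salaries (illustrative approximations, not exact figures)
president = 400_000
vice_president = 230_000
congress = 535 * 174_000          # 535 members at the rank-and-file rate

total_salaries = president + vice_president + congress
deficit = 426_000_000_000         # the $426 billion deficit from the post

print(f"total salaries eliminated: ${total_salaries:,}")
print(f"share of the deficit:      {total_salaries / deficit:.4%}")
```

Zeroing out all 537 paychecks covers roughly two hundredths of one percent of the deficit.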

Fear accomplishes one other thing besides making you trust whoever you believe to be an authority: it makes you ignore all the evidence to the contrary.  Consider the conviction with which the pro-gun faction believes that gun ownership makes you safer -- while a Boston University study found two years ago that there is a "robust correlation" between the rate of (legal) gun ownership in a state and the rate of violence.

But instead of reasoned debate on the topic, just about all we see is inflammatory rhetoric.  Since when is "Passing laws restricting gun ownership is stupid, because criminals don't obey laws" a logical argument?  No one is suggesting that we make rape and murder legal, because after all, "rapists and murderers don't obey laws."  These sorts of statements aren't meant to engage your brain; they're meant to grab you by the fear centers and swing you around.  "I'm being left defenseless against the criminals" is a powerful motivator.

And as the study by Schilke et al. shows, once we're in a state of fear, we're more likely to trust whoever it is that claims to have a solution.

Look, it's not like I have all the answers myself.  My difficulty with politics is that I find most of the problems politicians wrestle with so complicated and multi-faceted that I can't imagine how anyone could find a solution that works.  But succumbing to fear certainly doesn't make you more likely to make good decisions, either about what to do or about who should lead us.

As Dave Barry said, "When trouble arises and things look bad, there is always one individual who perceives a solution and is willing to take command.  Very often, that individual is crazy."