Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.
Showing posts with label conflict. Show all posts

Saturday, January 25, 2020

Arguing to learn

That we live in contentious times is so blatantly obvious it's kind of silly even to point it out.

Politics.  Immigration.  LGBTQ rights.  Climate change.  Religion, and its influence on public policy.  Discrimination.  Gun laws.  Poverty and homelessness.  The list of topics to argue about seems endless, as does the vitriol they engender.  A number of people I know look upon the holidays with dread, because of the potential to bring together family members with diametrically opposite opinions -- making the conditions right for turning Thanksgiving dinner into a conflagration.

Avoiding the topics, of course, strikes a lot of people as cowardly, especially when the topics themselves are of such critical importance.  We're not talking goofy arguments about trivia, like the time a friend of mine and I got in a heated debate over which Olympic sport was silliest, short-track speed skating or curling.  (The answer, obviously, is curling.  Not that I could get him to admit that.  Unfortunately, the most our argument accomplished was getting long-suffering eyerolls from both of our wives, and some sotto voce discussion between the two of them about how they ended up paired with the likes of us.)

But here, the stakes are much higher.  We argue because we feel strongly that the outcome is vitally important.  And because of this, we often adopt an argue-to-win stance -- feet planted firmly, unwilling to give an inch or accede to any suggestion that our opponents might have some points on their side.

The problem is -- and I think most people could affirm that anecdotally from their own experience -- this approach has a very poor success rate.  How often has the fight on Christmas Eve between liberal, tree-hugging Cousin Sally and conservative, Trump-supporting, Fox-News-watching Uncle Jake actually resulted in either of them changing their views?  About anything?  Yes, I think it's important to stand your ground and fight for what you believe in, but maybe it's time to admit that the approach most people take isn't working.

[Image licensed under the Creative Commons Hector Alejandro, Two boys engaged in arm wrestling, CC BY 2.0]

That's the gist of some research published this week in Cognitive Science.  In "The Influence of Social Interaction on Intuitions of Objectivity and Subjectivity," by Matthew Fisher, Joshua Knobe, and Frank C. Keil (of Yale University), and Brent Strickland (of the École Normale Supérieure), we learn that a lot more could be accomplished if instead of an argue-to-win strategy, we adopted an argue-to-learn approach.

The researchers paired people up based on their having opposing opinions about a variety of controversial topics.  The members of one group of pairs were instructed to interact with their partners with an argue-to-win approach -- to justify their own position and to try to argue their points as objectively and convincingly as they could, and were told they would be graded on their success at convincing their partners of the error of their ways.  The members of the other group of pairs were instructed to use an argue-to-learn approach -- they were told they'd be paired with people of opposing viewpoints, but the task was to learn as much as they could about the reasoning behind their partner's opinions, and to be able to then articulate it coherently afterward.

Neither group of participants knew that the mode of argument was going to be different between different pairs -- they only knew what their own task was.  And in order to make sure that the experiment was controlled, the researchers took the extra step of having independent raters evaluate the pairs on how well they'd fulfilled the requirements of their tasks, to make certain that by random chance the argue-to-win group wasn't populated with combative types and the argue-to-learn group with peaceniks.

Afterward, test subjects were given a set of questions to determine their attitudes about whether there is an objectively correct answer to every dilemma.  The questions included, "Given that most issues have more than one side, it is inevitable that one must be correct and the others wrong," and "Even if someone disagrees with me, (s)he has rational reasons for doing so."

Interestingly, just the task of being forced for four minutes into adopting either a no-quarter-given approach or a let's-listen-to-each-other approach created a marked difference in how participants answered the questionnaire.  The argue-to-win group was much more likely to agree with statements suggesting there was one objectively correct answer, and that anyone not believing that answer was simply wrong; the argue-to-learn group was more likely to agree with statements implying that truth is nuanced, and that people of opposing opinions aren't necessarily ignorant or irrational.

So if you want to sway people, the way to do it is not through verbal fisticuffs; it's through listening to the other side's reasons and making it clear you want to learn more about their arguments, not batter them down with your own.

Now, understand that I'm not trying to say -- and neither were the researchers -- that there aren't objective truths and moral absolutes out there, or that everything is on that mushy ground of subjectivity.  Climate change deniers and young-Earth creationists are simply factually wrong.  People who support discrimination on the basis of race or sexual orientation are espousing an inherently immoral stance.  But there are a lot of ways even to approach these topics without a knock-down-drag-out fight.  Even if climate change itself is undeniable, what (if anything) we can or should do about it is certainly up for discussion.  And perhaps it might be more successful to tackle the issues of racism and misogyny and homophobia not by wavering in our own conviction to do what is right and moral, but to find out why our opponents believe what they do.  When someone shouts, "All liberals are America-haters who want to destroy our country," it might be better to say, "I'm a liberal and I don't think that.  Why do you say all liberals think that way?" rather than "Oh, yeah?  Well, you're an ignorant hate-monger!"

Not that the latter is necessarily false, just that pointing it out doesn't accomplish anything.

So that's the latest on how to keep arguments from going thermonuclear, and maybe even convince some folks to rethink their views at the same time.  Heaven knows, with the increasing polarization in the world, and the news media and pundits feeding into it every chance they get, we need to stop and listen to each other more often.

And even if the two of you leave with your views substantially unchanged, who knows?  Maybe both of you will have learned something.

*********************************

I don't often recommend historical books here at Skeptophilia, not because of a lack of interest but a lack of expertise in identifying what's good research and what's wild speculation.  My background in history simply isn't enough to be a fair judge.  But last week I read a book so brilliantly and comprehensively researched that I feel confident in recommending it -- and it's not only thorough, detailed, and accurate, it's absolutely gripping.

On May 7, 1915, the passenger ship Lusitania was sunk by a German U-boat as it neared its destination of Liverpool, an action that was instrumental in leading the United States to join the war effort two years later.  The events leading up to that incident -- some due to planning, others to unfortunate chance -- are chronicled in Erik Larson's book Dead Wake, in which we find out about the cast of characters involved, and how they ended up in the midst of a disaster that took 1,198 lives.

Larson's prose is crystal-clear, giving information in such a straightforward way that it doesn't devolve into the "history textbook" feeling that so many true-history books have.  It's fascinating and horrifying -- and absolutely un-put-downable.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]





Saturday, May 4, 2019

An exercise in futility

I'm going to ask a question, not because I'm trying to lead my readers toward a particular answer, but because I honestly don't know the answer myself.

To what extent are we ethically obligated to confront strangers on social media who post immoral or offensive claims?

I ask this because this morning I saw a post by a friend of a distant relative on Facebook stating that "the origin of homosexuality is in pedophilia."  First of all, this is factually wrong; there probably are some homosexuals who are pedophiles, but they're no more common among the LGBTQ population than they are among cisgender heterosexuals.  But worse, this is vile homophobia, implying that there is an equivalence between a loving, committed relationship between two adults of the same sex and a person of either sex harming or abusing a child.

So I wrote, "this is bullshit."

The response came back almost immediately: "Typical libtard excuses for the immorality that is destroying America."

I answered, "You want research showing that there's no connection between homosexuality and pedophilia?  I can provide it."

The response: "Why would I be convinced by pro-gay atheistic scientists?  They are hand-in-glove with the queers anyhow."

At that point, I gave up.

This is troubling from a plethora of angles.  Not only does this person espouse ugly bigotry, she has decided that anything contrary to her views must be a "libtard" opinion motivated by a desire to destroy America's moral fiber.  She's successfully insulated herself from ever discovering she's wrong.  About anything.  Further, this enables her to write off anyone who disagrees with her as a dupe at best and actively evil at worst.

So the argument I got into was an exercise in futility, which I knew it would be from the outset.  Someone who would post what she did isn't going to have her views changed by a nasty exchange with a total stranger.  All it did was raise both of our blood pressures and leave us more firmly entrenched in what we already believed.

But does that mean we shouldn't try?

[Image licensed under the Creative Commons David Shankbone, Anger during a protest, CC BY-SA 3.0]

That doesn't sit well with me, either.  If you don't challenge evil when you see it, what good are your moral convictions?  It also bears consideration that my antagonist was not the only person who saw the back-and-forth.  Presumably a lot of people read what we wrote -- and interestingly, not a single one of them, including my (very conservative and religious) cousin, decided to weigh in.  It may be that one of them was on the fence, and seeing his or her unexamined views expressed in such a blatantly vicious fashion caused some level of reconsideration.

But I don't know.  I detest conflict, and am the last person who would seek out a battle just for the hell of it.  I can also say that when I've engaged in this kind of exchange with a stranger, it has had a success rate of exactly zero percent at changing the mind of the person who posted the initial comment.  So was it worth the unpleasantness?

I honestly don't know.  It felt a great deal like tilting at windmills to me.  But like I said, with some things staying silent really isn't an option.

If anyone has any better perspective on this, I'd love to hear it, either privately or in the comments section.  Because right now, I'm feeling pretty despondent about ever convincing anyone of anything -- even when their views are immoral, unfair, bigoted, or demonstrably false.

**********************************

This week's Skeptophilia book recommendation is for any of my readers who, like me, grew up on Star Trek in any of its iterations -- The Physics of Star Trek by Lawrence Krauss.  In this delightful book, Krauss, a physicist at Arizona State University, looks into the feasibility of the canonical Star Trek technology, from the possible (the holodeck, phasers, cloaking devices) to the much less feasible (photon torpedoes, tricorders) to the probably impossible (transporters, replicators, and -- sadly -- warp drive).

Along the way you'll learn some physics, and have a lot of fun revisiting some of your favorite tropes from one of the most successful science fiction franchises ever invented, one that went far beyond the dreams of its creator, Gene Roddenberry -- one that truly went places where no one had gone before.






Saturday, July 26, 2014

Arguing by agreement

My job would be easier, as a skeptic, if humans were basically rational beings.

The fact is, though, we're not controlled solely by the higher-cognitive parts of our brains.  We are also at the mercy of our emotions and biases, not to mention a perceptual apparatus that works well enough most of the time, but is hardly without its own faults and (sometimes literal) blind spots.

This is why the backfire effect occurs.  A pair of psychologists, Brendan Nyhan and Jason Reifler, found that most people, after being confronted with evidence against their prior beliefs, will espouse those beliefs more strongly:
Nyhan and Reifler found a backfire effect in a study of conservatives. The Bush administration claimed that tax cuts would increase federal revenue (the cuts didn't have the promised effect). One group was offered a refutation of this claim by prominent economists that included current and former Bush administration officials. About 35 percent of conservatives told about the Bush claim believed it. The percentage of believers jumped to 67 when the conservatives were provided with the refutation of the idea that tax cuts increase revenue.  (from The Skeptic's Dictionary)
As a blogger, this makes it hard to know how to approach controversial topics.  By calmly and dispassionately citing evidence against silly claims, am I having the effect of making the True Believers double down on their position?  If so, how could I approach things differently?

A study published this week in The Proceedings of the National Academy of Sciences provides the answer.  To convince people of the error of their ways, agree with them, strenuously, following their beliefs to whatever absurd end they drive you, and without once uttering a contrary word.

Psychologists Eran Halperin, Boaz Hameiri, and Roni Porat of the Interdisciplinary Center Herzliya in Israel were looking at a way to alter attitudes between Israelis and Palestinians -- a goal as monumental as it is laudable.  Given the decades that have been spent in futile negotiations between these two groups, always approached from a standpoint of logic, rationality, and compromise, Halperin, Hameiri, and Porat decided to try a different tack.

150 Israeli volunteers were split into two groups -- one was shown video clips of neutral commercials, the other video clips that related the Israeli/Palestinian conflict back to the values that form the foundation of the Israeli self-identity.  In particular, the clips were based on the idea that Israel has a god-given right to exist, and is the most deeply moral society in the world.  But instead of taking the obvious approach that attacks against Palestinians (including innocent civilians) called into question the morality of the Israeli stance, the videos followed these concepts to their logical conclusion -- that the conflict should continue, even if innocent Palestinians died, because of Israel's inherent moral rectitude.

And attitudes changed.  The authors of the study report that members of the experimental group showed a 30% higher willingness to reevaluate their positions on the issue, as compared to the control group.  They showed a greater openness to discussion of the opposing side's narrative, and a greater likelihood of voting for moderate political candidates.  And the attitude change didn't wear off -- the subjects still showed the same alteration in their beliefs a year later.  Hameiri writes:
The premise of most interventions that aim to promote peacemaking is that information that is inconsistent with held beliefs causes tension, which may motivate alternative information seeking.  However, individuals—especially during conflict—use different defenses to preserve their societal beliefs.  Therefore, we developed a new paradoxical thinking intervention that provides consistent—though extreme—information, with the intention of raising a sense of absurdity but not defenses.
So apparently, Stephen Colbert is on the right track.


I find the whole thing fascinating, if a little frustrating.  Being a science-geek-type, I have always lived in hope that rational argument and hard data would eventually win.

It appears, however, that it doesn't, always.  It may be that for the deepest, most lasting changes in attitude, we have to take those beliefs we are trying to change, and force them to their logical ends, and hope that after that, the absurdity will speak for itself.