Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Saturday, January 25, 2020

Arguing to learn

That we live in contentious times is so blatantly obvious it's kind of silly even to point it out.

Politics.  Immigration.  LGBTQ rights.  Climate change.  Religion, and its influence on public policy.  Discrimination.  Gun laws.  Poverty and homelessness.  The list of topics to argue about seems endless, as does the vitriol they engender.  A number of people I know look upon the holidays with dread, because of the potential to bring together family members with diametrically opposite opinions -- making the conditions right for turning Thanksgiving dinner into a conflagration.

Avoiding the topics, of course, strikes a lot of people as cowardly, especially when the topics themselves are of such critical importance.  We're not talking goofy arguments about trivia, like the time a friend of mine and I got in a heated debate over which Olympic sport was silliest, short-track speed skating or curling.  (The answer, obviously, is curling.  Not that I could get him to admit that.  Unfortunately, the most our argument accomplished was getting long-suffering eyerolls from both of our wives, and some sotto voce discussion between the two of them about how they ended up paired with the likes of us.)

But here, the stakes are much higher.  We argue because we feel strongly that the outcome is vitally important.  And because of this, we often adopt an argue-to-win stance -- feet planted firmly, unwilling to give an inch or accede to any suggestion that our opponents might have some points on their side.

The problem is -- and I think most people could affirm that anecdotally from their own experience -- this approach has a very poor success rate.  How often has the fight on Christmas Eve between liberal, tree-hugging Cousin Sally and conservative, Trump-supporting, Fox-News-watching Uncle Jake actually resulted in either of them changing their views?  About anything?  Yes, I think it's important to stand your ground and fight for what you believe in, but maybe it's time to admit that the approach most people take isn't working.

[Image licensed under the Creative Commons Hector Alejandro, Two boys engaged in arm wrestling, CC BY 2.0]

That's the gist of some research published this week in Cognitive Science.  In "The Influence of Social Interaction on Intuitions of Objectivity and Subjectivity," by Matthew Fisher, Joshua Knobe, and Frank C. Keil (of Yale University), and Brent Strickland (of the École Normale Supérieure), we learn that a lot more could be accomplished if instead of an argue-to-win strategy, we adopted an argue-to-learn approach.

The researchers paired people up based on their having opposing opinions about a variety of controversial topics.  The members of one group of pairs were instructed to interact with their partners with an argue-to-win approach -- to justify their own position and to try to argue their points as objectively and convincingly as they could, and were told they would be graded on their success at convincing their partners of the error of their ways.  The members of the other group of pairs were instructed to use an argue-to-learn approach -- they were told they'd be paired with people of opposing viewpoints, but the task was to learn as much as they could about the reasoning behind their partner's opinions, and to be able to then articulate it coherently afterward.

Neither group of participants knew that the mode of argument was going to be different between different pairs -- they only knew what their own task was.  And in order to make sure that the experiment was controlled, the researchers took the extra step of having independent raters evaluate the pairs on how well they'd fulfilled the requirements of their tasks, to make certain that by random chance the argue-to-win group wasn't populated with combative types and the argue-to-learn group with peaceniks.

Afterward, test subjects were given a set of questions to determine their attitudes about there being an objective correct answer to every dilemma.  The questions included, "Given that most issues have more than one side, it is inevitable that one must be correct and the others wrong," and "Even if someone disagrees with me, (s)he has rational reasons for doing so."

Interestingly, just the task of being forced for four minutes into adopting either a no-quarter-given approach or a let's-listen-to-each-other approach created a marked difference in how participants answered the questionnaire.  The argue-to-win group were much more likely to agree with statements suggesting there was one objective correct answer, and that anyone not believing that answer was simply wrong; the argue-to-learn group were more likely to agree with statements implying that truth was nuanced, and that people of opposing opinions aren't necessarily ignorant or irrational.

So if you want to sway people, the way to do it is not through verbal fisticuffs; it's through listening to the other side's reasons and making it clear you want to learn more about their arguments, not batter them down with your own.

Now, understand that I'm not trying to say -- and neither were the researchers -- that there aren't objective truths and moral absolutes out there, or that everything is on that mushy ground of subjectivity.  Climate change deniers and young-Earth creationists are simply factually wrong.  People who support discrimination on the basis of race or sexual orientation are espousing an inherently immoral stance.  But there are a lot of ways even to approach these topics without a knock-down-drag-out fight.  Even if climate change itself is undeniable, what (if anything) we can or should do about it is certainly up for discussion.  And perhaps it might be more successful to tackle the issues of racism and misogyny and homophobia not by wavering in our own conviction to do what is right and moral, but by finding out why our opponents believe what they do.  When someone shouts, "All liberals are America-haters who want to destroy our country," it might be better to say, "I'm a liberal and I don't think that.  Why do you say all liberals think that way?" rather than "Oh, yeah?  Well, you're an ignorant hate-monger!"

Not that the latter is necessarily false, just that pointing it out doesn't accomplish anything.

So that's the latest on how to keep arguments from going thermonuclear, and maybe even convincing some folks to rethink their views at the same time.  Heaven knows with the increasing polarization in the world, and the news media and pundits feeding into that every chance they get, we need to stop and listen to each other more often.

And even if the two of you leave with your views substantially unchanged, who knows?  Maybe both of you will have learned something.

*********************************

I don't often recommend historical books here at Skeptophilia, not because of a lack of interest but a lack of expertise in identifying what's good research and what's wild speculation.  My background in history simply isn't enough to be a fair judge.  But last week I read a book so brilliantly and comprehensively researched that I feel confident in recommending it -- and it's not only thorough, detailed, and accurate, it's absolutely gripping.

On May 7, 1915, the passenger ship Lusitania was sunk by a German U-boat as it neared its destination of Liverpool, an action that was instrumental in leading the United States to join the war effort two years later.  The events leading up to that incident -- some due to planning, others to unfortunate chance -- are chronicled in Erik Larson's book Dead Wake, in which we find out about the cast of characters involved, and how they ended up in the midst of a disaster that took 1,198 lives.

Larson's prose is crystal-clear, giving information in such a straightforward way that it doesn't devolve into the "history textbook" feeling that so many true-history books have.  It's fascinating and horrifying -- and absolutely un-put-downable.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]





Friday, November 17, 2017

Motivated reasoning

Last week there was a paper released in the Journal of Personality and Individual Differences called, "Epistemic Rationality: Skepticism Toward Unfounded Beliefs Requires Sufficient Cognitive Ability and Motivation to be Rational."  Understandably enough, the title made me sit up and take notice, as this topic has been my bread and butter for years.  The authors, Tomas Ståhl (of the University of Illinois) and Jan-Willem van Prooijen (of the Vrije Universiteit Amsterdam), describe their work thus:
Why does belief in the paranormal, conspiracy theories, and various other phenomena that are not backed up by evidence remain widespread in modern society?  In the present research we adopt an individual difference approach, as we seek to identify psychological precursors of skepticism toward unfounded beliefs.  We propose that part of the reason why unfounded beliefs are so widespread is because skepticism requires both sufficient analytic skills, and the motivation to form beliefs on rational grounds...  [W]e show that analytic thinking is associated with a lower inclination to believe various conspiracy theories, and paranormal phenomena, but only among individuals who strongly value epistemic rationality...  We also provide evidence suggesting that general cognitive ability, rather than analytic cognitive style, is the underlying facet of analytic thinking that is responsible for these effects.
The first bit is hardly a surprise, and is the entire raison d'être of my Critical Thinking class.  Skepticism is not only a way of looking at the world, it's a skill; and like any skill, it takes practice.  Adopting a rational approach to understanding the universe means learning some of the ways in which irrationality occurs, and figuring out how to avoid them.

The second part, though, is more interesting, but also more insidious: in order to be a skeptic, you have to be motivated toward rational thought -- and value it.

Aristotle Teaching Alexander the Great (Charles Laplante, 1866) [image courtesy of the Wikimedia Commons]

This explains the interaction I had with one of my AP Biology students many years ago.  Young-Earth creationists don't, by and large, take my AP class.  My background is in evolutionary genetics, so most of them steer clear, sensing that they're in hostile territory.  (I will say in my own defense that I never treat students in a hostile manner; and the few times I have had a creationist take my class, it was a positive experience, and kept me on my toes to present my arguments as cogently as possible.)

This young lady, however, stood out.  She was absolutely brilliant, acing damn near every quiz I gave.  She had a knack for understanding science that was nothing short of extraordinary.  So we went through the unit on genetics, and I presented the introduction to the unit on evolution, in which I laid out the argument supporting the theory of evolution, explaining how it fits every bit of hard evidence we've got.

That day, she asked if she could talk to me after class.  I said, "Sure," and had no guess about what she might have wanted to talk to me about.

I was absolutely flabbergasted when she said, "I just want you to know that I'm a creationist."

I must have goggled at her for a moment -- after (at that point) two decades as a teacher, I had pretty good control over my facial expressions, but not that good.  She hastily added, "I'm not saying I'm going to argue with you, or that I'm refusing to learn the material, or anything.  I just wanted you to know where I was coming from."

I said, "Okay.  That's fine, and thanks for being up front with me.  But do you mind if I ask you a couple of questions?"

She said, "Not at all."

So I asked her where the argument I'd presented in class fell apart for her.  What part of the evidence or logical chain didn't work?

She said, "None of it.  It's all logical and makes perfect sense."

I must have goggled again, because she continued, "I understand your argument, and it's logically sound.  I don't disbelieve in the evidence you told us about.  But I still don't believe in evolution."

The upshot of it was that for her, belief and rationality did not intersect.  She believed what she believed, and if rational argument contradicted it, that was that.  She didn't argue, she didn't look for counterevidence; she simply dismissed it.  Done.

The research by Ståhl and van Prooijen suggests that the issue was that she had no motivation to apply rationality to this situation.  She certainly wasn't short of cognitive ability; she outperformed most of the students in the class (including, I might add, on the test on evolutionary theory).  But there was no motive for her to apply logic to a situation that, for her, was beyond the reach of logic.  You got there by faith, or not at all.

To this day, of all the students I've taught, this young lady remains one of the abiding puzzles.  Her ability to compartmentalize her brain that way -- I'll apply logic here, and it gives me the right answers, but not here, because it'll give me the wrong answers -- is so foreign to my way of thinking that it borders on the incomprehensible.  For me, if science, logic, and rationality work as a way of teasing out fact from falsehood, then -- they work.  You can't use the same basic principles and have them alternate between giving you true and false conclusions, unless the method itself is invalid.

Which, interestingly, is not what she was claiming.

And this is a difficulty that I have a hard time seeing any way to surmount.  Anyone can be taught some basic critical thinking skills; but if they have no motivation to apply them, or (worse) if pre-existing religious or political beliefs actually give them a motivation not to apply them, the argument is already lost.

So that's a little depressing.  Sorry.  I'm still all for teaching cognitive skills (hell, if I weren't, I'd be seriously in the wrong profession).  But what to do about motivation is a puzzle.  It once again seems to me that, like my student's attitude toward faith-based belief, being motivated to use logic to understand your world is something you have to choose deliberately.

You get there because you choose to accept rational argument, or you don't get there at all.

Thursday, November 9, 2017

A self-portrait drawn by others

As you might imagine, I get hate mail pretty frequently.

Most of it has to do with my targeting somebody's sacred cow, be it homeopathy, fundamentalist religion, astrology, climate change denial, or actual sacred cows.  And it seems to fall into three general categories:
  • Insults, some of which never get beyond the "you stupid poopyhead fuckface" level.  These usually have the worst grammar and spelling.
  • Arguments that are meant to be stinging rebuttals.  They seldom are, at least not from the standpoint of adding anything of scientific merit to the conversation, although their authors inevitably think they've skewered me with the sharp rapier of their superior knowledge.  (Sometimes I get honest, thoughtful comments or criticisms on what I've written; I have always, and will always, welcome those.)
  • Diatribes that tell me what I actually believe, as if I'm somehow unaware of it.
I've gotten several of the latter in the last few weeks, and it's these that I want to address in this post, because they're the ones I find the most curious.  I've got a bit of a temper myself, so I can certainly understand the desire to strike back with an insult at someone who's angered you; and it's unsurprising that a person who is convinced of something will want to rebut anyone who says different.  But the idea that I'd tell someone I was arguing with what they believed, as if I knew it better than they did, is just plain weird.

Here are a handful of examples, from recent fan mail, to illustrate what I'm talking about:
  • In response to a post I did on the vitriolic nonsense spouted by televangelist Jim Bakker: "Atheists make me want to puke.  You have the nerve to attack a holy man like Jim Bakker.  You want to tear down the foundation of this country, which is it's [sic] churches and pastors, and tell Christian Americans they have no right to be here."
  • In response to my post on a group of alt-med wingnuts who are proposing drinking turpentine to cure damn near everything: "You like to make fun of people who believe nature knows best for curing us and promoting good health.  You pro-Monsanto, pro-chemical types think that the more processed something is, the better it is for you.  I bet you put weed killer on your cereal in the morning."
  • In response to the post in which I described a push by EPA chief Scott Pruitt to remove scientists from the EPA advisory board and replace them with corporate representatives: "Keep reading us your fairy tales about 'climate change' and 'rising sea levels.'  Your motives are clear, to destroy America's economy and hand over the reigns [sic] to the wacko vegetarian enviro nuts.  Now that we've got people in government who are actually looking out for AMERICAN interests, people like you are crapping your pants because you know your [sic] not in control any more."
  • And finally, in response to a post I did on the fact that the concept of race has little biological meaning: "You really don't get it, do you?  From your picture you're as white as I am, and you're gonna stand there and tell me that you have no problem being overrun by people who have different customs and don't speak English?  Let's see how you feel when your kid's teacher requires them to learn Arabic."
So, let's see.  That makes me a white English-only wacko vegetarian enviro nut (with crap in my pants) who eats weed killer for breakfast while writing checks to Monsanto and plotting how to tear down churches so I can destroy the United States.

Man, I've got a lot on my to-do list today.

I know it's a common tendency to want to attribute some set of horrible characteristics to the people we disagree with.  It engages all that tribal mentality stuff that's pretty deeply ingrained in our brains -- us = good, them = bad.  The problem is, reality is a hell of a lot more complex than that, and it's only seldom that you can find someone who is so bad that they have no admixture whatsoever of good, no justification for what they're doing, no explanation at all for how they got to be the way they are.  We're all mixed-up cauldrons of conflicting emotions.  It's hard to understand ourselves half the time; harder still to parse the motives of others.

So let me disabuse my detractors of a few notions.

While I'm not religious myself, I really have a live-and-let-live attitude toward religious folks, as long as they're not trying to impose their religion on others or using it as an excuse to deny others their rights as humans.  I have religious friends and non-religious friends and friends who don't care much about the topic one way or the other, and mostly we get along pretty well.

I have to admit, though, that being a card-carrying atheist, I do have to indulge every so often in the dietary requirements as set forth in the official Atheist Code of Conduct.


Speaking of diet, I'm pretty far from a vegetarian, even when I'm not dining on babies. In fact, I think that a medium-rare t-bone steak with a glass of good red wine is one of the most delicious things ever conceived by the human species.  But neither am I a chemical-lovin' pro-Monsanto corporate shill who drinks a nice steaming mug of RoundUp in the morning.  I'll stick with coffee, thanks.

Yes, I do accept climate change, because I am capable of reading and understanding a scientific paper and also do not think that because something is inconvenient to American economic expediency, it must not be true.  I'd rather that the US economy doesn't collapse, mainly because I live here, but I'd also like my grandchildren to be born on a planet that is habitable in the long term.

And finally: yes, I am white.  You got me there.  If I had any thought of denying it, it was put to rest when I did a 23andMe test and found out that I'm... white.  My ancestry is nearly all from western Europe, unsurprising given that three of my grandparents were of French descent and one of Scottish descent.  But my being white doesn't mean that I always have to place the concerns of other white people first, or fear people who aren't white, or pass laws making sure that America stays white.  For one thing, it'd be a little hypocritical if I demanded that everyone in the US speak English, given that my mother and three of my grandparents spoke French as their first language; and trust me when I say that I would have loved my kids to learn Arabic in school.  The more other cultures you learn about in school, the better, largely because it's hard to hate people when you realize that they're human, just like you are.

So anyway.  Nice try telling me who I am, but you got a good many of the details wrong.  Inevitable, I suppose, when it's a self-portrait drawn by someone else.  Next time, maybe you should try engaging the people you disagree with in dialogue, rather than ridiculing, demeaning, dismissing, or condescending to them.  It's in general a nicer way to live, and who knows?  Maybe you'll learn something.

And if you want to know anything about me, just ask rather than making assumptions.  It's not like I'm shy about telling people what I think.  Kind of hiding in plain sight, here.

Tuesday, September 26, 2017

Right in the gut

I know I've said it before, but it bears saying again: the strength of science lies in its reliance on hard evidence as the sine qua non of understanding.

I've tried to embrace this outlook myself, insofar as a fallible and biased human can do so.  Okay, so every day I poke fun at all sorts of odd beliefs, sometimes pissing people off.  But you know what?  You want to convince me, show me some reliable evidence.  For any of the claims I've scoffed at.  Bigfoot.  Ghosts.  ESP.  Astrology.  Tarot divination.  Homeopathy.

Even the existence of god.

I'm convinceable.  All you have to do is show me one piece of irrefutable, incontrovertible evidence, and I'm sold.

The problem is, to my unending frustration and complete bafflement, most people don't approach the world that way.  Instead, they rely on their gut -- which seems to me to be a really good way to get fooled.  I'm a pretty emotional guy, and I know my gut is unreliable.

Plus, science just doesn't seem to obey common sense at times.  As an example, consider the Theory of Relativity.  Among its predictions:
  • The speed of light is the ultimate universal speed limit.
  • Light moves at the same speed in every reference frame (i.e., your own speed relative to the beam of light doesn't matter; you'll still measure it as traveling at about 300,000,000 meters per second).
  • When you move, time slows down.  The faster you move, the slower time goes.  So if you took off in a rocket ship to Alpha Centauri at 95% of the speed of light, you'd find on your return that while only about three years had passed for you, more than nine years would have passed on Earth -- and the effect grows without limit as you approach the speed of light.
  • When you move, to a stationary person your mass increases and your length in the direction of motion contracts.  The faster you move, the more pronounced this effect becomes.
And so on.  But the kicker: all of these predictions of the Theory of Relativity have been experimentally verified.  As counterintuitive as this might be, that's how the world is.  (In fact, relativistic effects have to be taken into account to have accurate GPS.)
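If you want to check the kind of numbers involved in that rocket-ship example, the time-dilation arithmetic is simple enough to do yourself.  Here's a minimal sketch in Python (my own illustration; the only inputs are the speed as a fraction of c and the distance to Alpha Centauri, about 4.37 light-years):

```python
import math

def lorentz_gamma(v_frac_c):
    """Lorentz factor for a speed given as a fraction of the speed of light."""
    return 1.0 / math.sqrt(1.0 - v_frac_c ** 2)

def round_trip_times(distance_ly, v_frac_c):
    """Earth-frame and traveler-frame durations (in years) for a round trip."""
    earth_years = 2.0 * distance_ly / v_frac_c          # t = d / v, both legs
    traveler_years = earth_years / lorentz_gamma(v_frac_c)  # proper time t' = t / gamma
    return earth_years, traveler_years

# Alpha Centauri is roughly 4.37 light-years away.
earth_t, ship_t = round_trip_times(4.37, 0.95)
# At 0.95c, gamma is about 3.2: roughly 9.2 years pass on Earth
# while about 2.9 years pass aboard the ship.
```

Measurably slower clocks show up at everyday speeds too, just by vanishingly small amounts -- which is exactly why GPS satellites, moving fast and deep in Earth's gravity well, need relativistic corrections.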

None of which we would know now if people relied solely on their gut to tell them how things work.

Despite all this, there are people who still rely on impulse and intuition to tell them what's true and what's not.  And now a study jointly conducted by researchers at Ohio State University and the University of Michigan has shown that people who do this are more prone to being wrong.

[image courtesy of the Wikimedia Commons]

Kelly Garrett and Brian Weeks decided to look into the connection between how people view evidence, and their likelihood of falling for incorrect information.  They looked at survey data from almost 3,000 people, in particular focusing on whether or not the respondents agreed with the following statements:
  • I trust my gut to tell me what’s true and what’s not. 
  • Evidence is more important than whether something feels true.
  • Facts are dictated by those in power.
They then correlated the responses with the participants' likelihood of believing a variety of conspiracy theories.  Unsurprisingly, they found that the people who relied on gut feelings and emotions to determine the truth were far more likely to fall for conspiracies and outright untruths.

"Misperceptions don’t always arise because people are blinded by what their party or favorite news outlet is telling them," Weeks said.  "While trusting your gut may be beneficial in some situations, it turns out that putting faith in intuition over evidence leaves us susceptible to misinformation."

"People sometimes say that it’s too hard to know what’s true anymore," Garrett said.  "That’s just not true.  These results suggest that if you pay attention to evidence you’re less likely to hold beliefs that aren’t correct...  This isn’t a panacea – there will always be people who believe conspiracies and unsubstantiated claims – but it can make a difference."

I'd say it makes all the difference.  And in the current political environment -- where accusations of "fake news" are thrown around right and left, and what people consider to be the truth depends more on political affiliation than it does on rational fact -- it's more essential than ever.

Friday, September 1, 2017

Argue with me

In recent months, I've done several posts that reference the backfire effect -- the tendency of people to double down on their previous beliefs when challenged, even when shown hard evidence that their views are incorrect.  But of course, this brings up the question, if people tend to plant their feet when you offer counterarguments, how do you change someone's mind?

A quartet of researchers at Cornell University, Chenhao Tan, Vlad Niculae, Cristian Danescu-Niculescu-Mizil, and Lillian Lee, have studied this very question, and presented their findings in a paper called, "Winning Arguments: Interaction Dynamics and Persuasion Strategies in Good-faith Online Discussions."  My wife stumbled onto this study a couple of days ago, and knowing this was right up my alley, forwarded it to me.

What the researchers did was to study patterns on r/ChangeMyView, a subreddit where people post opinions and invite argument.  If someone does succeed in changing the original poster's view, the successful arguer is awarded a ∆ (the Greek letter delta, which in science is used to represent change).  By seeing who was awarded deltas, and analyzing their statements, the researchers were able to determine the characteristics of statements that were the most successful, and the ones that were generally unsuccessful.

Argument Irresistible, by Robert Macaire (from the magazine Le Charivari, May 1841) [image courtesy of the Wikimedia Commons]

And the results are a fascinating window into how we form, and hold on to, our opinions.  The authors write:
Changing someone's opinion is arguably one of the most important challenges of social interaction.  The underlying process proves difficult to study: it is hard to know how someone's opinions are formed and whether and how someone's views shift. Fortunately, ChangeMyView, an active community on Reddit, provides a platform where users present their own opinions and reasoning, invite others to contest them, and acknowledge when the ensuing discussions change their original views.  In this work, we study these interactions to understand the mechanisms behind persuasion. 
We find that persuasive arguments are characterized by interesting patterns of interaction dynamics, such as participant entry-order and degree of back-and-forth exchange.  Furthermore, by comparing similar counterarguments to the same opinion, we show that language factors play an essential role.  In particular, the interplay between the language of the opinion holder and that of the counterargument provides highly predictive cues of persuasiveness. Finally, since even in this favorable setting people may not be persuaded, we investigate the problem of determining whether someone's opinion is susceptible to being changed at all.  For this more difficult task, we show that stylistic choices in how the opinion is expressed carry predictive power.
More simply put, Tan et al. found that it wasn't the content of the argument that determined its success, it was how it was worded.  In particular, they found that the use of calmer words, statements that were serious (i.e. not joking or sarcasm), and arguments that were worded differently from the original statement (i.e. were not simply direct responses to what was said) were the most effective.  Quotes from sources were relatively ineffective, but if you can post a link to a corroborating site, it strengthens your argument.

Another thing that was more likely to increase your success at convincing others was appearing flexible yourself.  Starting out with "You're an idiot if you don't see that..." is a poor opening salvo.  Wording such as "It could be that..." or "It looks like the data might support that..." sounds as if it would be a signal of a weak argument, but in fact, such softer phrasing was much more likely to be persuasive than a full frontal attack.

Even more interesting were the characteristics of the original posts that signaled that the person was persuadable.  The people who were most likely to change their minds, the researchers found, wrote longer posts, included more information and data in the form of lists, included sources, and were more likely to use first-person singular pronouns (I, my) rather than first-person plural (we, our) or third-person impersonal (they, their).
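That pronoun pattern is easy to play with yourself.  Here's a toy sketch -- my own illustration, not the authors' actual feature pipeline, which is far richer -- of one such linguistic feature: the share of first-person-singular pronouns among a post's personal pronouns:

```python
import re

# First-person-singular pronouns versus first-person-plural and
# third-person ones, as in the study's persuadability signal.
SINGULAR = {"i", "me", "my", "mine"}
OTHER = {"we", "us", "our", "ours", "they", "them", "their", "theirs"}

def pronoun_ratio(text):
    """Fraction of matched pronouns that are first-person singular.

    Returns None if the text contains no pronouns from either set.
    """
    words = re.findall(r"[a-z']+", text.lower())
    singular = sum(w in SINGULAR for w in words)
    other = sum(w in OTHER for w in words)
    total = singular + other
    return singular / total if total else None

post = "I think my view here is based on what I have read."
ratio = pronoun_ratio(post)  # 1.0: every personal pronoun is singular
```

On the study's account, a post scoring high on a measure like this (lots of "I" and "my") is more likely to come from someone open to changing their mind than one full of "we" and "they."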

Unsurprising, really; if a person is basing his/her opinion on evidence, I'd expect (s)he would be easier to convince using different evidence.  And the "I" vs. "we" vs. "they" thing also makes some sense; as I posted a couple of weeks ago, despite our technological advances, we remain tribal creatures.  If you engage that in-group-identity module in the brain, it's no wonder that we are more likely to hang on to whatever views allow us to keep our sense of belonging to the tribe.

The Tan et al. research, however, does give us some ideas about how to frame arguments in order to give us the greatest likelihood of success.  Stay calm, don't rant or ridicule.  Give your reasoning, and expand on your own views rather than simply responding to what the other person said.  If you have links or sources, post them.  Especially, show your own willingness to be persuaded.  If the person you're arguing with sees you as reasonable yourself, you're much more likely to be listened to.

Most importantly, don't give up debate as a completely fruitless and frustrating endeavor.  Vlad Niculae, who co-authored the study, found their results to be encouraging.  "If you’ve never visited [ChangeMyView]," Niculae writes, "the concept that people can debate hot topics without getting into flame wars on the internet might be hard to believe.  Or, if you read a lot of YouTube comments, you might be inclined to believe the backfire effect, and doubt that graceful concession is even possible online.  But a quick trip to this great subreddit will undoubtedly make you a believer."