Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Thursday, August 26, 2021

The nasty bite of Poe's Law

I have a love-hate relationship with Poe's Law.

Poe's Law, you probably know, is a rule of thumb named after Nathan Poe, whose 2005 observation boils down to this: the better a parody is, the harder it is to tell from the truth.

I love Poe's Law because the targets of parody and satire are often so richly deserving of it.  Consider one of the most fantastic parody sites out there -- The Onion -- which combines absolute hilarity with acid-tipped social and political commentary.  (One particularly trenchant example is that every time there is yet another mass shooting in the United States, The Onion has an article with the headline, "'No Way to Prevent This,' Says Only Nation Where This Regularly Happens.")

On the other hand, I hate Poe's Law because there is enough misinformation out there without waggish satirists adding to it.  The Law itself states that good satire will take people in; the point is to get people to say, "No, really?", at least for a moment.  But for some folks, that moment gets stretched out way too far, and you have people believing satire is the truth.

My favorite example of this -- once again from The Onion -- is the pearl-clutching woman who wrote an outraged letter to the editor of Reader's Digest after they did an interview with J. K. Rowling.  "How can you give this woman more publicity?" the letter-writer said.  "This is supposed to be a magazine that supports conservative morals and values.  J. K. Rowling is an avowed practitioner of black magic.  She has overseen the baptism of thousands of children into the Church of Satan.  There was a major exposé of Rowling's evil activities a couple of months ago in The Onion."

The editor of Reader's Digest, showing admirable restraint, printed the letter, responding only with, "The Onion is a satirical news source, not meant to be taken as fact."

The "hate" side of the ledger got another entry yesterday, when a frequent reader and contributor to Skeptophilia sent me a message about Tuesday's post, which was about a scientific study showing that people are more likely to follow absurd directives than reasonable ones.  The message said, "Um, Gord... I think that site is satire.  Check the 'About' section."

He then pointed out that the lead researcher, Fiona Hayes-Verhorsihs, has a ridiculous name.  Say it out loud.

Yup.  "Hay's for horses."  Funny thing, given my background in linguistics, that this bit of the joke went past me so fast it didn't even ruffle my hair.  I figured the last part of her name was some obscure surname, perhaps Dutch or Afrikaans by the look of it, and didn't give it any further thought.

Suffice it to say that the fellow who sent me the comment is right.  I got bitten in the ass by Poe's Law.  Not the first time this has happened, nor (I suspect) will it be the last.  I didn't really dig too hard into the antecedents of the story; if I had, I'd have realized my error pretty quickly.  The problem is, the conclusion of the faux study -- that people can be pretty irrational at times -- was something I've written about many times before, and I have no real doubt that the general point is true.  So when the study by Professor Hay's-For-Horses popped up, I didn't even question it.

Meaning that I not only fell for Poe's Law, I fell for confirmation bias.

Of course, I'm in good company.  Pravda and Xinhua have both been hoodwinked by hoax stories that sounded plausible.

But so has Fox News.  So maybe "good company" isn't the best way to phrase it.

Anyhow, once this post is up, I'll take the old one down.  I'd rather not add to the morass of wacky stuff online, only to find out later that someone else has mentioned the absurdity study -- and cited Skeptophilia as the source.  All of which has me rededicating myself to being careful about my own research, as should we all.  Check your sources, look for corroboration, see if you can find out the credentials of the people cited -- all before you post, like, or retweet a link.

And that goes double if you're the author of a blog devoted to rational thinking.

*********************************************

I've been interested for a long while in creativity -- where it comes from, why different people choose different sorts of creative outlets, and where we find our inspiration.  Like a lot of people who are creative, I find my creative output -- and my confidence -- ebbs and flows.  I'll have periods where I'm writing every day and the ideas are coming hard and fast, and times when it seems like even opening up my work-in-progress is a depressing prospect.

Naturally, most of us would love to enhance the former and minimize the latter.  This is the topic of the wonderful book Think Like an Artist, by British author (and former director of Tate Media) Will Gompertz.  He draws his examples mostly from the visual arts -- his main area of expertise -- but overtly states that the same principles of creativity apply equally well to musicians, writers, dancers, and all of the other kinds of creative humans out there.

And he also makes a powerful point that all of us are creative humans, provided we can get out of our own way.  People who (for example) would love to be able to draw but say they can't do it, Gompertz claims, need not to change their goals but to change their approach.

It's an inspiring book, and one which I will certainly return to the next time I'm in one of those creative dry spells.  And I highly recommend it to all of you who aspire to express yourself creatively -- even if you feel like you don't know how.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]


Wednesday, August 25, 2021

The honesty researcher

One of the things I pride myself on is honesty.

I'm not trying to say I'm some kind of paragon of virtue, but I do try to tell the truth in a direct fashion.  I hope it's counterbalanced by kindness -- that I don't broadcast a hurtful opinion and excuse it by saying "I'm just being honest" -- but if someone wants to know what I think, I'll tell 'em.

As the wonderful poet and teacher Taylor Mali put it, "I have a policy about honesty and ass-kicking.  Which is: if you ask for it, I have to let you have it."  (And if you haven't heard his wonderful piece "What Teachers Make," from which that quote was taken -- sit for three minutes right now and watch it.)


I think it's that commitment to the truth that first attracted me to science.  I was well aware from quite a young age that there was no reason to equate an idea making me happy and an idea being the truth.  It was as hard for me to give up magical thinking as the next guy -- I spent a good percentage of my teenage years noodling around with Tarot cards and Ouija boards and the like -- but eventually I had to admit to myself that it was all a bunch of nonsense.

In science, honesty is absolutely paramount.  It's about data and evidence, not about what you'd dearly love to be true.  As the eminent science fiction author Philip K. Dick put it, "Reality is that which, when you stop believing in it, doesn't go away."

Or perhaps I should put it, "it should be about data and evidence."  Scientists are human, and are subject to the same temptations the rest of us are -- but they damn well better be above-average at resisting them.  Because once you've let go of that touchstone, it not only calls into question your own veracity, it casts a harsh light on the scientific enterprise as a whole.

And to me, that's damn near unforgivable.  Especially given the anti-science attitude that is currently so prevalent in the United States.  We don't need anyone or anything giving more ammunition to the people who think the scientists are lying to us for their own malign purposes -- the people who, to quote the great Isaac Asimov, think "my ignorance is as good as your knowledge."

Which brings me to Dan Ariely.

Ariely is a psychological researcher at Duke University, and made a name for himself studying the issue of honesty.  I was really impressed with him and his research, which looked at how our awareness of the honor of truth-telling affects our behavior, and the role of group identification and tribalism in how much we're willing to bend our own personal morality.  I used to show his TED Talk, "Our Buggy Moral Code," to my Critical Thinking classes at the beginning of the unit on ethics; his conclusions seemed to be a fascinating lens on the whole issue of honesty and when we decide to abandon it.

Which is more than a little ironic, because the data Ariely used to support these conclusions appear to have been faked -- possibly by Ariely himself.

[Image licensed under the Creative Commons Yael Zur, for Tel Aviv University Alumni Organization, Dan Ariely January 2019, CC BY-SA 4.0]

Ariely has not admitted any wrongdoing, but has agreed to retract the seminal paper on the topic, which appeared in the prestigious journal Proceedings of the National Academy of Sciences back in 2012.  "I can see why it is tempting to think that I had something to do with creating the data in a fraudulent way," Ariely said, in a statement to BuzzFeed News.  "I can see why it would be tempting to jump to that conclusion, but I didn’t...  If I knew that the data was fraudulent, I would have never posted it."

His contention is that the insurance company that provided the data, The Hartford, might have given him fabricated (or at least error-filled) data, although what their motivation could be for doing so is uncertain at best.  There's also the problem that the discrepancies in the 2012 paper led analysts to sift through his other publications, where they found a troubling pattern of sloppy data handling, failures to replicate results, misleading claims about sources, and more possible outright falsification.  (Check out the link I posted above for a detailed overview of the issues with Ariely's work.)

Seems like the one common thread running through all of these allegations is Ariely.

It can be very difficult to prove scientific fraud.  If a researcher deliberately fabricated data to support his/her claims, how can you prove that it was deliberate, and not either (1) an honest mistake, or (2) simply bad experimental design (which isn't anything to brag about, but is still in a separate class of sins from outright lying)?  Every once in a while, an accused scientist will actually admit it -- one example that jumps to mind is Korean stem-cell researcher Hwang Woo-Suk, whose spectacular fall from grace reads like a Shakespearean tragedy -- but like many politicians who are accused of malfeasance, a lot of times the accused scientist just decides to double down, deny everything, and soldier on, figuring that the storm will eventually blow over.

And, sadly, it usually does.  Even in Hwang's case -- not only did he admit fraud, he was fired by Seoul National University and tried and found guilty of embezzlement -- he's back doing stem-cell research, and since his conviction has published a number of papers, including ones indexed in PubMed.

I don't know what's going to come of Ariely's case.  Much is being made about the fact that a researcher in honesty and morality has been accused of being dishonest and immoral.  Ironic as this is, the larger problem is that this sort of thing scuffs the reputation of the scientific endeavor as a whole.  The specific results of Ariely's research aren't that important; what is much more critical is that this sort of thing makes laypeople cast a wry eye on the entire enterprise.

And that, to me, is absolutely inexcusable.



Monday, August 23, 2021

Studies show the author of Skeptophilia is brilliant!

I would love it if some psychologist who studies the effect of media on people's beliefs would do a specific experiment, and then let me know the results.

The experiment I'd like done is to have a series of fake news articles that test subjects would read.  There would be two different kinds of articles -- ones in which the headline basically summarized what the text of the article said (as it should be), and ones in which the headline made a statement that was at odds with what the text of the article actually claimed.  Then, subjects would answer some questions, and we'd see which had the greater impact on their memory -- the contents of the headline, or the contents of the article text.

I strongly suspect that when the text of an article and the headline conflict, it's the headline that will have the biggest effect on what readers remember.  It's the first thing they see; it's in bold print; and it gives a catchy, terse summary of what the story supposedly is about.  All of the details in the text, I think, are much more likely to be lost, misremembered, or ignored outright.  

An interesting twist would be to ask the people who got the second set of articles -- the ones where the headline contradicted the content -- whether they even noticed.  My guess is a lot of people wouldn't.  I can't tell you how many times I've had someone post a comment or response to my blog that said, "Yes, but have you considered _____?", and it turns out what they wanted me to consider is something that I explicitly addressed in paragraph two.

This comes up because of an article sent to me by a friend, which was entitled "New Studies: ‘Conspiracy Theorists’ Sane, While Government Dupes Are Crazy and Hostile."  The story, which appeared in 21st Century Wire, is making a pretty bold claim -- that what the conspiracy theorists have been claiming all along is correct.  All of us skeptics, who have scoffed at Stop the Steal and Pizzagate and chemtrails and Illuminati and mind control and RFID chip implants in the COVID vaccine and evil Satanic Masonic rituals, are not only wrong, we are the crazy ones.


Naturally, I was pretty interested to read about this.

The first paragraph basically mirrored the headline, stating that "those labeled 'conspiracy theorists' appear to be saner than those who accept the official version of contested events."  Then, we hear about the first study:
The most recent study was published on July 8th by psychologists Michael J. Wood and Karen M. Douglas of the University of Kent (UK). Entitled “What about Building 7?  A social psychological study of online discussion of 9/11 conspiracy theories,” the study compared “conspiracist” (pro-conspiracy theory) and “conventionalist” (anti-conspiracy) comments at news websites.

The authors were surprised to discover that it is now more conventional to leave so-called conspiracist comments than conventionalist ones: “Of the 2174 comments collected, 1459 were coded as conspiracist and 715 as conventionalist.”  In other words, among people who comment on news articles, those who disbelieve government accounts of such events as 9/11 and the JFK assassination outnumber believers by more than two to one.  That means it is the pro-conspiracy commenters who are expressing what is now the conventional wisdom, while the anti-conspiracy commenters are becoming a small, beleaguered minority.

By this time, I was already bouncing up and down in my chair, yelling at my computer, "Just hang on a moment!  That doesn't support what the headline said at all!"  So we have twice as many conspiracist comments as conventionalist ones posted on news websites -- we're supposed to conclude from this that the conspiracists are more likely to be right?  Or even sane?  All it means is that conspiracist comments are common, which is hardly the same thing.

I don't think that we can even conclude from this that the conspiracists themselves outnumber the "conventionalists."  For that, we'd need to make the further assumption that people of all beliefs are equally likely to post, which seems like a leap, considering what a rabid lot some of the conspiracy theorists seem to be.  Myself, I have a hard enough time bringing myself to read the comments section on controversial articles, much less post my own comments.
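To see why comment counts say so little about the underlying population, here's a back-of-the-envelope calculation -- with numbers I've invented purely for illustration -- showing how a minority that posts more often can dominate a comments section:

```python
# Invented numbers, purely to illustrate the sampling-bias point:
# even if only 30% of readers are conspiracists, they can produce
# roughly two-thirds of the comments if they post five times as often.

def comment_share(minority_fraction, posting_ratio):
    """Fraction of all comments written by the minority, assuming each
    minority member posts `posting_ratio` times as often as each
    majority member."""
    minority_comments = minority_fraction * posting_ratio
    majority_comments = (1 - minority_fraction) * 1.0
    return minority_comments / (minority_comments + majority_comments)

share = comment_share(0.30, 5)    # 30% of readers, posting 5x as often
print(f"{share:.0%} of comments")  # prints "68% of comments"
```

So a two-to-one comment majority is exactly what you'd expect from a vocal 30% minority -- no conclusion about who outnumbers whom, let alone who's sane, follows from it.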

Then, we hear about the second "study":
(T)hese findings are amplified in the new book Conspiracy Theory in America by political scientist Lance deHaven-Smith, published earlier this year by the University of Texas Press.  Professor deHaven-Smith explains why people don’t like being called “conspiracy theorists”:  The term was invented and put into wide circulation by the CIA to smear and defame people questioning the JFK assassination!  “The CIA’s campaign to popularize the term ‘conspiracy theory’ and make conspiracy belief a target of ridicule and hostility must be credited, unfortunately, with being one of the most successful propaganda initiatives of all time.”

In other words, people who use the terms “conspiracy theory” and “conspiracy theorist” as an insult are doing so as the result of a well-documented, undisputed, historically-real conspiracy by the CIA to cover up the JFK assassination.  That campaign, by the way, was completely illegal, and the CIA officers involved were criminals; the CIA is barred from all domestic activities, yet routinely breaks the law to conduct domestic operations ranging from propaganda to assassinations.

Ah.  So because (1) conspiracy theorists don't like being called conspiracy theorists, and (2) the CIA engaged in some nasty business surrounding the JFK assassination, the conspiracy theorists are actually sane when they babble about chemtrails and the Reptilians.  Got it.

Then, we have an alleged conclusion from psychologist Laurie Manwell, of the University of Guelph, summarized as follows:
Psychologist Laurie Manwell of the University of Guelph agrees that the CIA-designed “conspiracy theory” label impedes cognitive function.  She points out, in an article published in American Behavioral Scientist (2010), that anti-conspiracy people are unable to think clearly about such apparent state crimes against democracy as 9/11 due to their inability to process information that conflicts with pre-existing belief.

So, I did a little digging on Manwell -- and as you might already be anticipating, the author of the article in 21st Century Wire is misrepresenting her, too.  Turns out Manwell thinks that laypeople of all stripes tend to ignore factual information, and pay more attention to claims that support what they already believed.  Take a look at what she wrote in a June 2007 paper, "Faulty Towers of Belief":
Most laypersons would agree with research showing that attitudes influence a person's evaluation of a subject -- whether it be an idea or another person -- and that the stronger the attitude, the greater influence it will have in evoking a positive or a negative evaluation.  However, the types of reasoning processes that laypersons believe they use when evaluating information are not necessarily the processes that they actually use.  Research repeatedly shows that what people say they are doing, and what they are actually doing, are often two very different things...  Thus, in evaluating the events of 9/11, we need to keep in mind that there are many factors that influence our judgments, including previously formed attitudes and beliefs, many of which are resistant to change, and some of which we may not even be aware of at the time of evaluation.

So, the bottom line is that Manwell's contention is that we're all prone to confirmation bias, which is hardly the same thing as claiming that the conspiracy theorists are clear-eyed exponents of the truth, and the skeptics are dim-witted obstructionists.  And as far as who is entering the argument with more "previously formed attitudes and beliefs," might I just ask you to consider that question from the standpoint of contrasting David Icke and Alex Jones with, say, Sharon Hill, Rebecca Watson, and Simon Singh?

Oh, but don't let that stand in the way of your drawing the conclusion you'd already settled on.  Here's the last line of the article in 21st Century Wire:
No wonder the anti-conspiracy people are sounding more and more like a bunch of hostile, paranoid cranks.

Have you considered the possibility that we're cranky and hostile because we're getting really fucking tired of arguing with a bunch of people who appear to have spent way too much time playing on a pogo stick in a room with low ceilings?

Anyhow, there you have it.  Take some actual research, claim it supports the contentions you already had, then turn around and accuse your opponents of doing what you just did.  Craft a nice, inflammatory headline that basically says, "You Should Believe Me Because the People Who Disagree With Me Are Big Fat Liars," and call it good.

Chances are, the most your readers are going to remember of what you wrote is the headline, anyway, which gives me an idea.  Maybe I should start giving my posts headlines like "New Studies Show That You'll Have Good Luck If You Send Gordon Money."  It's worth a try, because attempting to become independently wealthy as a writer seems to be a losing proposition any other way.



Saturday, August 21, 2021

The evolution of Little Red Riding Hood

Every once in a while, I'll run across a piece of scientific research that is so creative and clever that it just warms my heart, and I felt this way yesterday when I stumbled onto a link to the article in PLoS ONE called "The Phylogeny of Little Red Riding Hood," by Jamshid Tehrani of Durham University.

The reason I was delighted by Tehrani's paper is that it combines two subjects I love -- evolutionary biology, and mythology and folklore.  The gist of what Tehrani did was to take a technique most commonly used to group species into "star diagrams" by relatedness -- cladistic bootstrap analysis -- and apply it to worldwide versions of the "Little Red Riding Hood" story, to see to what degree a version in (for example) Senegal is related to one in Germany.

Cladistic bootstrap analysis generates something called a "star diagram" -- not, generally, a pedigree or family tree, because we don't know the exact identity of the common ancestor to all of the members of the tree; all we can tell is how closely related current individuals are.  Think, for example, of what it would look like if you assembled the living members of your family group this way -- you'd see clusters of close relatives linked together (you, your siblings, and your first cousins, for example) -- and further away would be other clusters, made up of more distant relatives grouped with their near family members.

So Tehrani did this with the "Little Red Riding Hood" story, by looking at the similarities and differences, from subtle to major, between the way the tale is told in different locations.  Apparently there are versions of it all over the world -- not only the Grimm Brothers Fairy Tales variety (the one I know the best), but from Africa, the Middle East, India, China, Korea, and Japan.  Oral transmission of stories is much like biological evolution; there are mutations (people change the story by misremembering it, dropping some pieces, embellishment, and so on) and there is selection (the best versions, told by the best storytellers, are more likely to be passed on).  And thus, the whole thing unfolds like an evolutionary lineage.
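The basic clustering logic can be sketched in a few lines of Python.  To be clear, this is a toy illustration, not Tehrani's actual method (he used cladistic bootstrap analysis on carefully coded plot features), and the motif lists below are invented for the example:

```python
# Toy sketch: code each tale variant as a set of plot motifs, measure
# pairwise similarity, and merge the most similar variants first --
# the same "closest relatives cluster first" logic as a star diagram.
# Motif sets are INVENTED for illustration, not taken from Tehrani's data.

variants = {
    "RH (Europe)":    {"red hood", "wolf", "grandmother", "eaten", "rescue"},
    "GM (Europe)":    {"wolf", "grandmother", "eaten", "escape by trick"},
    "Catt (Europe)":  {"wolf", "grandmother", "basket", "escape by trick"},
    "TG (East Asia)": {"tiger", "grandmother", "siblings", "escape by trick"},
    "WK (Africa)":    {"wolf", "kids alone", "imitates mother", "eaten"},
}

def jaccard_distance(a, b):
    """1 - |A intersect B| / |A union B|: 0 = identical motif sets."""
    return 1 - len(a & b) / len(a | b)

def closest_pair(groups):
    """Find the two clusters with the most similar members (single linkage)."""
    best = None
    for i in range(len(groups)):
        for j in range(i + 1, len(groups)):
            d = min(jaccard_distance(variants[x], variants[y])
                    for x in groups[i] for y in groups[j])
            if best is None or d < best[0]:
                best = (d, i, j)
    return best

def cluster(names):
    """Agglomerative clustering; returns the merge order."""
    groups = [[n] for n in names]
    merges = []
    while len(groups) > 1:
        d, i, j = closest_pair(groups)
        merged = groups[i] + groups[j]
        merges.append((round(d, 2), merged))
        groups = [g for k, g in enumerate(groups) if k not in (i, j)]
        groups.append(merged)
    return merges

merges = cluster(list(variants))
for d, group in merges:
    print(d, group)
```

With these made-up motifs, the three European variants merge into a cluster before either the East Asian or African tale joins -- which is the flavor (though emphatically not the rigor) of the result Tehrani obtained.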

In Tehrani's analysis, he found three big branches -- the African branch (where the story is usually called "The Wolf and the Kids"), the East Asian branch ("Tiger Grandmother"), and the European/Middle Eastern Branch ("Little Red Riding Hood," "Catterinella," and "The Story of Grandmother").  (For the main differences in the different branches, which are fascinating but too long to be quoted here in full, check out the link to Tehrani's paper.)

Put all together, Tehrani came up with the following cladogram:




WK = "The Wolf and the Kids," TG = "Tiger Grandmother," Catt = "Catterinella," GM = "The Story of Grandmother," and RH = "Little Red Riding Hood"; the others are less common variations that Tehrani was able to place on his star diagram.

The whole thing just makes me very, very happy, and leaves me smiling with my big, sharp, wolflike teeth.

Pure research has been criticized by some as being pointless, and this is a stance that I absolutely abhor.  There is a completely practical reason to support, fund, and otherwise encourage pure research -- and that is, we have no idea yet what application some technique or discovery might have in the future.  A great deal of highly useful, human-centered science has been uncovered by scientists playing around in their labs with no other immediate goal than to study some small bit of the universe.  Further, the mere application of raw creativity to a problem -- using the tools of cladistics, say, to analyze a folk tale -- can act as an impetus to other minds, elsewhere, encouraging them to approach the problems we face in novel ways.

But I think it's more than that.  The fundamental truth here is that the human mind needs to be exercised.  The "what good is it?" attitude is not only anti-science, it is anti-intellectual.  It devalues inquiry, curiosity, and creativity.  It asks the question "how does this benefit humanity?" in such a way as to imply that the sheer joy of comprehending deeply the world around us is not a benefit in and of itself.

It may be that Tehrani's jewel of a paper will have no lasting impact on humanity as a whole.  I'm perfectly okay with that, and I suspect Tehrani would be, as well.  We need to make our brains buckle down to the "important stuff," yes; but we also need to let them out to play sometimes, a lesson that the men and women currently overseeing our educational system need to learn.  In a quote that seems unusually apt, considering the subject of Tehrani's research, Albert Einstein said: "I am enough of an artist to draw freely upon my imagination.  Imagination is more important than knowledge.  Knowledge is limited.  Imagination encircles the world." 

************************************

I was an undergraduate when the original Cosmos, with Carl Sagan, was launched, and being a physics major and an astronomy buff, I was absolutely transfixed.  My co-nerd buddies and I looked forward to the new episode each week and eagerly discussed it the following day between classes.  And one of the most famous lines from the show -- ask any Sagan devotee -- is, "If you want to make an apple pie from scratch, first you must invent the universe."

Sagan used this quip as a launching point into discussing the makeup of the universe on the atomic level, and where those atoms had come from -- some primordial, dating all the way back to the Big Bang (hydrogen and helium), and the rest formed in the interiors of stars.  (Giving rise to two of his other famous quotes: "We are made of star-stuff," and "We are a way for the universe to know itself.")

Since Sagan's tragic death in 1996 at the age of 62 from a rare blood cancer, astrophysics has continued to extend what we know about where everything comes from.  And now, experimental physicist Harry Cliff has put together that knowledge in a package accessible to the non-scientist, and titled it How to Make an Apple Pie from Scratch: In Search of the Recipe for our Universe, From the Origin of Atoms to the Big Bang.  It's a brilliant exposition of our latest understanding of the stuff that makes up apple pies, you, me, the planet, and the stars.  If you want to know where the atoms that form the universe originated, or just want to have your mind blown, this is the book for you.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]



Friday, August 20, 2021

Content warnings

Last week's Fiction Friday post -- about how (or if) we should continue to read writers whose work gives tacit acceptance to such repugnant views as racism or homophobia -- resulted in a few interesting responses and questions. 

The first one had to do with how I as a writer approach other sorts of sensitive topics, especially sexuality and violence.  It immediately made me wonder why here in the United States those are so often lumped together; we talk about "sex and violence" in one breath, especially with regard to movie content.  I have no idea why something that can be an expression of pleasure and loving intimacy is somehow put in the same category as harming someone, but that's just one of a gazillion things I don't understand about my own culture.

But accepting for now that they're frequently thrown into the same "this is taboo" category... in my own work, I've written both sex scenes and violent scenes.  To me, both of these in fiction ramp up the emotional intensity, and I have no hesitation including them if it seems appropriate for the plot and characters.  However, I've also seen way too many examples of gratuitous content, where such scenes are simply pasted in to titillate the reader/watcher, and that to me is no more excusable than any other action that leaves you wondering what the point was.

I'm reminded of how some of my students responded to seeing The Matrix Reloaded.  If you haven't watched this movie, there's a scene where Neo and Trinity are desperately horny and looking around for somewhere, anywhere, that they can get each other's clothes off.  They finally succeed, but other than giving us a chance to see Keanu Reeves and Carrie-Anne Moss buck naked, it really did zilch for the plot.  And you'd think a bunch of teenage guys would have thought that was awesome, but one and all they branded it as a "stupid scene."

As far as gratuitous violence goes, consider the amount of goriness in Kill Bill as compared to The Usual Suspects.  I've never taken a body count of either movie; suffice it to say it's high in both films.  But the amount of blood flying around doesn't even begin to compare.  The Usual Suspects, for all of its death and destruction, is a subtle movie, and leaves way more to the imagination than it shows you.  Kill Bill... isn't.

In my own work, I do sometimes include explicit sexuality or violence, but I hope none of it is unnecessary.  Also, there can be many reasons for including such content.  The sex scene in Sephirot is between the main character and a woman he will soon desperately regret getting friendly with; in Kári the Lucky, it's sweet and sad, between lovers who are headed for inevitable tragedy; in Whistling in the Dark, it's between two characters who have found love and healing in each other after suffering terrible emotional damage.  The same goes for violent scenes.  In Gears, which might be the most violent book I've written overall, one character gets her arm broken and is choked nearly to death, another is killed by being thrown against a wall, a third is shot in the middle of the chest, a fourth is crushed by a (psychically-generated) landslide, and yet another is murdered by a deliberately-loosed piece of falling masonry.  Even so, the violence isn't the point of the story.  If anything, the point of Gears is that goodness and courage and steadfastness will always win over greed and deception and ruthlessness.  The violence is there not only to advance the plot, but to set in stark relief the fact that a choice to be brave and moral isn't without risks -- but it's still what we should all aspire to.

Another question generated by last Friday's post had to do with "content warnings" or "trigger warnings."  Should they be present on a book's back cover?  A related question -- are there topics that are over the line for me, that are enough of an emotional trigger for me that I can't write them?

I've never included a content warning for my own work, although I did one time mention to someone I thought might be sensitive to it that Sephirot has a fairly explicit sex scene (as it turned out, the reader in question had no problem with it).  In my work, pretty much What You See Is What You Get; the back cover blurb will give you a pretty good idea of the content of the story, and readers can make the decision whether or not to read a particular book based on that, without needing a specific content warning.  I mostly write speculative/paranormal fiction, so you can expect lots of spooky atmosphere, but (I hope) nothing that really offends.  

However, since we're talking about the capacity for offending readers, it must be mentioned that some of my characters have the tendency to swear a lot.  This is partly because I swear a lot.  I try to make it appropriate for the scene and character, but Be Thou Forewarned.

Be Thou Even More Forewarned if we ever sit down and have a beer together.

As an amusing aside -- I recall being at a book signing event when a rather prim-looking woman came up to me and said she'd really enjoyed Lock & Key, but "the character of the Librarian sure does use the f-bomb a lot."

I responded, completely deadpan, "I know!  I tried talking to him about it, but he told me to fuck off."

Well, at least I thought it was funny.

In all seriousness, the problem is that different people have different sensitive points.  I gave up on the book The Third Eye (by T. Lobsang Rampa), and walked out of the movie Brazil, because of torture scenes; despite my fascination with Scottish history, I refuse to watch Braveheart because I know damn good and well what happens to William Wallace at the end.  However, I know people who had no problem with any of those -- the scenes in question might have been unpleasant, but not enough to cause them serious upset.

In my own work, there are three kinds of scenes that I can't stomach writing: rape, pedophilia, and animal abuse.  I just can't do it.  As for the last, I found out from another reader that I'm not the only one who can't deal with reading about harm to animals, even in a fictional setting.  In Kill Switch, the main character's dog, Baxter, is his constant companion.  I was stopped on the street by someone in my village who told me he was reading Kill Switch, and so far was enjoying it -- but then a frown crossed his face, and he said, "I know people are gonna die.  I'm okay with that.  It's a thriller, after all."  He brought his face near mine, and said in an intense voice, "But if you kill Baxter, I will never speak to you again."

The scene that for me danced the closest to the edge of that line is, once again, in my novel Sephirot (yeah, it's a pretty emotionally-fraught story, in case you hadn't already figured that out).  A character is the recipient of a brutal bare-back whipping -- it's absolutely necessary for the plot, but it was right at the boundary of "this is too awful for me to write about."

I guess everyone has their limits -- and we as writers need to be cognizant of that.

Anyhow, there are a few responses to the questions and comments generated by last Friday's post.  I love hearing what people think, and what thoughts my posts bring up for readers, so keep those cards and letters comin'.  As for me, I need to get to my work-in-progress, and see what diabolical plot twists I can think of for this novel.  As Stephen King put it, "In a good story, the author gets the readers to love the characters -- then releases the monsters."

So now I'm off to give the monsters some exercise.

************************************

I was an undergraduate when the original Cosmos, with Carl Sagan, was launched, and being a physics major and an astronomy buff, I was absolutely transfixed.  My co-nerd buddies and I looked forward to each new episode and eagerly discussed it the following day between classes.  And one of the most famous lines from the show -- ask any Sagan devotee -- is, "If you want to make an apple pie from scratch, first you must invent the universe."

Sagan used this quip as a launching point into discussing the makeup of the universe on the atomic level, and where those atoms had come from -- some primordial, all the way to the Big Bang (hydrogen and helium), and the rest formed in the interiors of stars.  (Giving rise to two of his other famous quotes: "We are made of star-stuff," and "We are a way for the universe to know itself.")




Thursday, August 19, 2021

The origins of Old Yeller

Since the news of the last few days (hell, the last few years) has been filled with one horrible thing after another, today I'm retreating into my happy place, namely: the cool scientific discovery of the week.

And puppies.  Lots o' puppies.

I don't know if it's ever occurred to the dog lovers in the studio audience how unusual dog coat coloration is.  I can't think of another animal species that has such striking variability -- from the jet black of black labs to the solid bronze of golden retrievers to the spots of Dalmatians to the particolored patches of collies, there is huge variation in fur color across the species.

One additional pattern that is especially curious is called agouti coloration -- when the base of the hair is yellow and the tip is black.  This is frequently seen in German shepherds, and was also the coat pattern of my beloved rescue dog Grendel:

If you're wondering, Grendel was not spoiled.  At all.

As you can see, Grendel also looked a bit like someone created a Frankendog by stitching together parts of about six different breeds.  He didn't have any other German-shepherd-like characteristics, but he definitely seemed to have pilfered his fur from one while it wasn't looking.

Well, a new piece of research that appeared in Nature Ecology & Evolution this week indicates that five very common coat color patterns in dogs come from the activity of a single gene.  Where and when this gene activates (and creates a gene product called the agouti signaling protein) determines the deposition of two pigments -- eumelanin (which is black) and pheomelanin (which is yellow).  The amount and placement of these two pigments create five different color patterns, as shown below:

[Image from Bannasch et al.]

One of these alleles, dominant yellow, is apparently of ancient origin; the researchers determined that it was present in an extinct canid species that branched off from wolves over two million years ago.

I'm a little curious about another dog coat feature, the white blaze, something my current non-spoiled dog Guinness has:


He also has white toes, which may or may not be related:


As you can see from the image from Bannasch et al., some of the dogs expressing each pattern have white blazes and some don't, so whatever genetic mechanism controls it must be independent of the agouti gene.

But if you have a dog with some yellow or agouti coloration, you now know that your pooch descends from a branch of the canine family tree that is two million years old.  As far as Guinness goes, I flatly refuse to believe he descends from wolves.  His level of fierceness is somewhere between "cream puff" and "cupcake."  He is basically a seventy-pound lap dog. 


In any case, that's the latest from the field of canine genetics and evolution.  Me, I wonder where another important dog feature comes from, and that's the cute head tilt.  There's no doubt that it's a significant selective advantage:
Guinness:  Play ball? 
Me:  Dude.  It's raining outside. 
Guinness:  Please play ball? 
Me:  Don't you want to wait?  I really don't want to go stand out in the... 
Guinness: *adorable head tilt* 
Me:  Dammit.
Speaking of which, I need to go get my dogs their breakfast because they're staring at me.  You'd think if they really are descended from wolves, they could go hunt down a squirrel or something, but I guess the decision to take advantage of sofas was made at the same time as they figured out it was easier to wait for someone to place a bowl full of dog food in front of them than to wear themselves out chasing some scrawny squirrel.

You gotta wonder who has trained whom, here.

************************************




Wednesday, August 18, 2021

Non-binary reality check

One of the claims I hear that infuriates me the most is that LGBTQ+ identification is becoming more common because our society is increasingly amoral, and this is somehow fostering a sense that "being gay will get me noticed."  This is really just the "LGBTQ+ is a choice" foolishness in slightly prettier packaging, along with the sense that queer people are doing it for attention, and my lord isn't that such an inconvenience for everyone else.  I just saw a meme a couple of days ago that encapsulated the idea; it went something like, "We no longer have to explain just the birds and the bees to kids, we have to explain the birds and the birds and the bees and the bees and the birds who think they're bees and the bees who think they're birds..."  And so on and so forth.  You get the idea.

The most insidious thing about this claim is that it delegitimizes queer identification, making it sound no more worthy of serious consideration than a teenager desperate to buy into the latest fashion trend.  It also ignores the actual explanation -- that there were just as many LGBTQ+ people around decades and centuries ago, but if there's a significant chance you will be harmed, jailed, discriminated against, ridiculed, or killed for publicly admitting who you are, you have a pretty powerful incentive not to tell anyone.  I can vouch for that in my own case; I not only had the threat of what could happen in the locker room hanging over my head had I admitted I was bisexual when I realized it (at age fifteen or so), but also the added filigree that my religious instructors had told us in no uncertain terms that any kind of sex outside of the traditional male + female marriage was a mortal sin that would result in eternal hellfire.

And that included masturbation.  Meaning that just about all of us received our tickets to hell when we were teenagers and validated them thereafter with great regularity.

This comes up because of two studies I ran into in the last couple of days.  The first, in The Sociological Review, is called "ROGD is a Scientific-sounding Veneer for Unsubstantiated Anti-trans View: A Peer-reviewed Analysis," by Florence Ashley of the University of Toronto.  ROGD is "rapid-onset gender dysphoria," and is the same thing I described above, in equally pretty packaging but with a nice psychobabble bow on top; the claim boils down to the choice of a trans person to come out being driven by "social contagion," and therefore being a variety of mental illness.  The whole thing hinges on the "suddenness" aspect of it, as if a person saying, "By the way, I'm trans" one day means that they'd only just figured it out that day.  You'd think anyone with even a modicum of logical faculties would realize that one doesn't imply the other.  I came out publicly as queer three years ago, but believe me, it was not a new realization for me personally.  I'd known for decades.  Society being what it is, it just took me that long to have the courage to say so.

Ashley's paper addresses this in no uncertain terms:

"Rapid-onset gender dysphoria" (ROGD) first appeared in 2016 on anti-trans websites as part of recruitment material for a study on an alleged epidemic of youth coming out as trans "out of the blue" due to social contagion and mental illness.  Since then, the concept of ROGD has spread like wildfire and become a mainstay of anti-trans arguments for restricting access to transition-related care...  [It is] evident that ROGD is not grounded in evidence but assumptions.  Reports by parents of their youth’s declining mental health and degrading familial relationships after coming out are best explained by the fact that the study recruited from highly transantagonistic websites.  Quite naturally, trans youth fare worse when their gender identity isn’t supported by their parents.  Other claims associated with ROGD can similarly be explained using what we already know about trans youth and offer no evidence for the claim that people are ‘becoming trans’ because of social contagion or mental illness.
The second, quite unrelated, paper appeared in The European Journal of Archaeology, and describes a thousand-year-old burial in southern Finland that strongly suggests the individual buried there was androgynous.  Genetic analysis of the bones showed that they'd belonged to someone with Klinefelter Syndrome, a disorder in which a chromosomally-male person has an extra X chromosome (i.e., XXY instead of XY).  This results in someone who is basically male but has some female physical features -- most often, the development of breasts.

Nondisjunction disorders like Klinefelter Syndrome are not uncommon, and finding bones from someone with an odd number of chromosomes is hardly surprising.  But what made this paper stand out to me -- and what it has to do with the previous one -- is that the individual in the grave in Finland was buried with honors, and with accoutrements of both males and females.  There was jewelry and clothing traditionally associated with women, but also two sword-hilts of the sort typically found in (male) warrior-burials.


Artist's depiction of the burial at Suontaka [Image from Moilanen et al., July 2021]

So apparently, not only was the person in the grave buried with honors, but (s)he/they were openly androgynous -- and that androgyny was accepted by the community to the extent that (s)he/they were buried with grave goods representing both gender roles.

"This burial [at Suontaka] has an unusual and strong mixture of feminine and masculine symbolism, and this might indicate that the individual was not strictly associated with either gender but instead with something else," said study leader Ulla Moilanen of  the University of Turku.  "Based on these analyses, we suggest... [that] the Suontaka grave possibly belonged to an individual with sex-chromosomal aneuploidy XXY.  The overall context of the grave indicates that it was a respected person whose gender identity may well have been non-binary."

If Moilanen and her group are correct in their conclusions, it gives us the sobering message that people in tenth-century C.E. Finland were doing better than we are at accepting that sexual identification and orientation aren't simple and binary.


What it comes back to for me is the astonishing gall it takes to tell someone, "No, you don't know your own sexuality; here, let me explain it to you."  Why it's apparently such a stressor for some people when a friend says, "I'm now identifying as ____, and this is my new name," I have no idea, especially given that nobody seems to have the least trouble switching from "Miss" to "Mrs." and calling a newly-married woman by her husband's last name when the couple makes that choice.  The harm done to people by telling them, "Who you are is wrong/a phase/a plea for attention/sinful" is incalculable; it's no wonder that the suicide rate amongst LGBTQ+ people is three times higher than it is for cis/het people.

All of which, you'd think, would be a tremendous impetus for outlawing the horrors of "conversion therapy" and "ex-gay ministries" worldwide.  But no.

More exasperating still, now there's apparently evidence that people in Finland a thousand years ago had figured this whole thing out better than we have, making it even more crystal-clear why so many of us sound exhausted when we ask, "why are we still having to fight these battles?" 

Of course, as tired as we are of saying the same thing over and over, we certainly can't stop now.  We have made some headway; my guess is that if I were a teenager now, I'd have few compunctions about admitting I'm queer, and that's even considering how ridiculously shy I am.  Contrast that with when I actually was a teenager back in the 1970s, when there was not a single out LGBTQ+ person in my entire graduating class (although several of us came out later; in my case, much later).

And allow me to state, if I hadn't already made the point stridently enough: none of us was "turned queer" between graduation and coming out.  We just finally made our way into a context where we were less likely to be ridiculed, discriminated against, or beaten up for admitting who we are.

I'll end with something else I found online that sums up the whole issue nicely -- although it does highlight how far we still have to go, despite the reality checks we're seeing increasingly often in scientific research.  Even with all that, I firmly believe it:


************************************
