Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Tuesday, August 31, 2021

Illuminating the Dark Ages

A writer friend of mine threw out a question on social media that I thought was fascinating: if you had a time machine and could visit (safely) any time in the past, when would you choose?

There are so many to choose from -- at least, if safety were guaranteed -- that it's hard to pick one.  If you add in pre-Homo-sapiens times, even more so.  There are the ones that would be fascinating just from their spectacular nature.  Think of watching, from a distance, the Chicxulub Meteorite striking the Earth, ending the Cretaceous Period and the hegemony of the dinosaurs in one fell swoop.  Or how about the eruption of the Siberian Traps nearly two hundred million years earlier -- thought to be the biggest series of volcanic eruptions ever, releasing four million cubic kilometers of lava -- one of the events that kicked off the Permian-Triassic extinction, the largest mass extinction known.

Even the eruption of Mount Vesuvius in 79 C.E. would have been something to see.

Then there are the ones of historical import.  Imagine witnessing the Mayan, Inca, or Aztec civilizations at their heights.  Consider visiting the Library of Alexandria before it was destroyed, and getting a chance to browse through works of literature for which we have no traces left.  So many classic works didn't survive -- we have only seven of the 120 plays written by Sophocles, eighteen of the 92 by Euripides, seven out of the eighty-odd written by Aeschylus -- and think of how many other amazing writers had their entire body of work obliterated, and whose names we might not even know.

So choosing only one would be fiendishly difficult.  But for me, there's one that jumps to mind whenever I consider the question.  I would love to visit Britain during the "Dark Ages," some time between the withdrawal of the Romans from the island in 410 C.E. and the consolidation of Anglo-Saxon rule starting with Penda of Mercia in the early part of the seventh century.

My main reason for choosing this time period is that it's such a historical mystery.  We have a number of accounts of events from that time, most famously the Arthurian legend cycle, but given how few records actually date from that period -- most were written centuries afterward, and based upon sources of dubious reliability -- no one knows where the myth ends and the history begins.

For example, consider the research I found out about through my friend and frequent Skeptophilia contributor, author Gil Miller, indicating that one of the sites associated with Arthur dates from much, much earlier.

The site in question is on the River Wye in rural Herefordshire, and is nicknamed "Arthur's Stone."  Depending on who you talk to, the Stone is said to be the place where Arthur defeated a giant, a place where he knelt to pray before going to battle against the Saxons, or a monument marking where he's buried.  Whether the Stone actually had anything to do with Arthur himself (presuming he even existed) is a matter of debate, but what this research has shown is that Arthur's Stone predates Arthur himself...

... by about 4,500 years.

This site is Neolithic in origin, and in fact was built before Stonehenge -- which is itself incorrectly associated with the Druids and the Celts.  Neither group was around at the time; that area of Britain was inhabited at that point by Neolithic farming peoples (originally of Aegean or Anatolian origin), and later by the Iberian-derived Bell Beaker People, so named because of the characteristic shape of their clay drinking vessels.

So even the sites that are associated with King Arthur have no certain historical connection.  And as far as Arthur himself, we don't even know for sure who he was.  Various theories include his being one of a number of different Welsh chieftains, a half-Celtic, half-Roman military man who stayed behind when Rome made their official withdrawal, or possibly just someone who had a sword bunged at him by a watery tart.


There's no doubt that a good many of the legends that have been spun about Arthur and the Round Table and Camelot and the Holy Grail are complete fairy stories, but once again, what's impossible at this point is to tease out which bits (if any) are actually historical, all the way down to which of the various characters even existed.  I mean, I have my doubts about the Lady of the Lake and the Green Knight and even the Killer Rabbit of Caer Bannog, but what about Guinevere and Galahad and Lancelot and the rest?  Do any of these even have a glancing connection with real historical people?

I would love to know the answer to that.

So those are my musings for this morning: going back and witnessing the Battle of Badon Hill.  I still would have to think long and hard to choose between that and the Library of Alexandria -- but I have the feeling I'd stick with Arthur, especially since I doubt the people who ran the Library of Alexandria would have been okay with my grabbing a few dozen manuscripts to take back with me.

*******************************

One of the most enduring mysteries of neuroscience is the origin of consciousness.  We are aware of a "self," but where does that awareness come from, and what does it mean?  Does it arise out of purely biological processes -- or is it an indication of the presence of a "soul" or "spirit," with all of its implications about the potential for an afterlife and the independence of the mind and body?

Neuroscientist Anil Seth has taken a crack at this question of long standing in his new book Being You: A New Science of Consciousness, in which he brings a rigorous scientific approach to how we perceive the world around us, how we reconcile our internal and external worlds, and how we understand this mysterious "sense of self."  It's a fascinating look at how our brains make us who we are.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]



Monday, August 30, 2021

The geological time slip

The geologists have lost about a billion years -- have you noticed?

This wee time slip has to do with the sedimentary strata best observed in the Grand Canyon, which showed a peculiarity noticed -- but not explained -- by explorer John Wesley Powell in 1869.  Powell knew about sedimentary rock formation and the Principle of Superposition -- the rather common-sensical idea that in undisturbed strata, the lower layers were formed first.  (It's like building a layer cake -- it'd be rather tricky to build it from the top down.)

In Powell's case, he noticed what looked like a jump in rock types and stratum structure, where because of fault lines and angles of tilt it seemed like there was a gap in the rock record.  What neither Powell nor anyone else knew at the time was how big this gap -- which Powell called "the Great Unconformity" -- was.

It represents the loss of about a billion years of accumulated rocks.

The Arroyo Penasco Formation showing the Great Unconformity, Montezuma, New Mexico [Image is in the Public Domain]

That by itself would be enough to suggest that whatever caused the Great Unconformity, it was not some local effect, and it didn't take much surveying to confirm this.  In fact, the same jump has been seen in just about every place there's rock of that age (right at the Precambrian-Cambrian boundary), most notably in the St. Francois Mountains of Missouri and at Siccar Point on the east coast of Scotland.

In each case, there is 1.4-billion-year-old rock (mostly granite, rhyolite, and schist), and the layer immediately above it dates to around 500 million years ago.

In between -- nothing.

It wasn't until 1910 that the magnitude of this bizarre gap was fully appreciated.  In Cambrian Geology and Paleontology, geologist Charles D. Walcott wrote:
I do not know of a case of proven conformity between Cambrian and pre-Cambrian Algonkian rocks on the North American continent.  In all localities where the contact is sufficiently extensive, or where fossils have been found in the basal Cambrian beds or above the basal conglomerate and coarser sandstones, an unconformity has been found to exist.  Stated in another way, the pre-Cambrian land surface was formed of sedimentary, eruptive, and crystalline rocks that did not in any known instance immediately precede in deposition or origin the Cambrian sediments.  Everywhere there is a stratigraphic and time break between the known pre-Cambrian rocks and Cambrian sediments of the North American continent.

But what on Earth could tear down a billion years' worth of strata -- all over the world, more or less simultaneously (if you can call anything that had a duration of a billion years "simultaneous")?  Scientists believe that these missing layers represent something on the order of six to eight vertical kilometers of rock.

Some new research has indicated a possible trigger -- and perhaps the mechanism involved.  Back around the beginning of the gap, all of the continents of the Earth had slammed together to form one huge supercontinent.  This wasn't Pangaea, the one most people will think of; this was its predecessor Rodinia, a colossal land mass that lasted from the late Precambrian to right about the beginning of the Cambrian, at which point rifting took over and the continents separated into a new configuration.

Here's what seems to have happened.  When Rodinia formed, the force of the collisions pushed a lot of rock skyward.  We're seeing exactly the same thing happen today in the Himalayas; Mount Everest is sedimentary rock that was once at the bottom of the ocean, but the collision between India and the main part of Asia scooped it up like a huge plow and raised enormous mountains.  This same process occurred during the formation of Rodinia, but on a global scale as all of the world's land masses collided.

But by the beginning of the Cambrian, a huge amount of that rock was gone, eroded away.  What could cause erosion on that scale?

It seems like the likeliest explanation is worldwide glaciation.  The late Precambrian has been called the "Cryogenian Period" -- from Greek roots meaning "ice-forming" -- and has earned the perhaps more vivid moniker of "Snowball Earth."  The shoving of the Precambrian rocks aloft created steep topography (again, just like in the Himalayas today), so any erosive force, whether ice or liquid water, had that much more gravitational energy available to grind the rock down.

As a biologist, what I find even cooler is that the breakup of Rodinia, which coincided with the thawing of the Snowball Earth, also marked the beginning of a huge diversification of life on Earth, something that has been nicknamed the Cambrian Explosion.  I don't think it's a reach to hypothesize that these two events were connected.

So do we owe the current biodiversity -- and, by extension, our own presence here -- to a process that erased every trace of a billion years of sedimentary rock layers?

I find it fascinating how everything is connected, and that even after a couple of centuries of intense study, there are still mysteries out there to solve.  The unconformities in our own knowledge are still huge, but unlike the one in the Grand Canyon, aren't immediately obvious.  Filling in these gaps inevitably opens up new questions, enough that scientists will never run out of new areas to explore.  As Socrates said, over two thousand years ago, "If I am accounted wise, it is only because I alone realize how little I know."

*******************************




Saturday, August 28, 2021

Globe-trotting with the gods

I rail fairly persistently against people who make outlandish, unverifiable claims, but I find it even more perplexing when people make outlandish, demonstrably false ones.  I mean, it's one thing to claim that last night you had a visitation from the ghost of your late Aunt Gertrude, and she told you the secret recipe for making her Extra-Zesty Bean Dip.  I couldn't disprove that even if I wanted to, which I don't, because I actually kind of like bean dip.

But when someone makes a statement that is (1) falsifiable, and (2) clearly incorrect, and yet (3) stands by it as if it made complete sense... that I find baffling.  "I'm sorry," they seem to be saying, "I know you've demonstrated that gravity pulls things toward the Earth, but I believe that in reality, it works the opposite way, so I'm going to wear velcro shoes so I don't fall upward."

And for once I am not talking about young-Earth creationism.

This all comes up because of an article from the site Unexplained Mysteries that a friend sent me yesterday.  Entitled "Easter Island Heads -- They Speak At Last," it was written by L. M. Leteane.  If that name sounds familiar to regular readers of this blog, it's because Leteane has appeared here before, most recently for claiming that the Central American god Quetzalcoatl and the Egyptian god Thoth were actually the same person, despite one being a feathered snake and the other being a shirtless dude with the head of an ibis, which, last I checked, look nothing alike.  From that premise, Leteane somehow concluded that the Earth is going to end when a comet hits it in the year 3369.

So I suppose that given his past attempts, we should not expect L. M. Leteane to exactly knock us dead in the logic department.

But even starting out with low expectations, I have to say that he has outdone himself this time.

Here's the basic outline of his most recent argument, if I can dignify it by calling it that.  Fasten your seatbelts, it's gonna be a bit of a bumpy ride.
  1. The Bantu people of south-central Africa came originally from Egypt, which in their language they called Khama-Roggo.  This name translates in Tswana as "Black-and-Red Land."
  2. Charles Berlitz, of The Mystery of Atlantis fame, says that Quetzalcoatl also comes from "Black-and-Red Land."  Berlitz, allow me to remind you, is the writer about whose credibility the skeptical researcher Larry Kusche said, "If Berlitz were to report that a ship was red, the chances of it being some other color is almost a certainty."
  3. The Olmecs were originally from Africa, but then they accompanied the god Thoth to Central America.  In a quote that I swear I am not making up, "That is evidently why their gigantic sculptured heads are always shown helmeted."
  4. The Babylonian goddess Ishtar was also a real person, who ruled in the Indus Valley for a while (yes, I know that India and Babylonia aren't the same place; just play along, okay?) until she got fed up and also moved to Central America.  She took some people with her called the Kassites.  This was because she was heavily interested in tin mining.
  5. Well, three gods in one place are just too many (three too many, in my opinion), and this started a war.  Hot words were spoken.  Nuclear weapons were detonated.  Devastation was wreaked.  Passive voice was used repeatedly for dramatic effect.
  6. After the dust settled, the Olmecs, who were somehow also apparently the Kassites and the Bantu, found themselves mysteriously deposited on Easter Island.  A couple more similarities between words in various languages and Pascuanese (the language of the natives of Easter Island) are given, the best one being Rapa Nui (the Pascuanese name for the island) meaning "black giant," because Rapa is a little like the Hebrew repha (giant) and Nui sounds like the French nuit (night).  This proves that the island was settled by dark-skinned giant people from Africa.  Or somewhere.
  7. The Olmecs decided to name it "Easter Island" because "Easter" sounds like "Ishtar."
  8. So they built a bunch of stone heads.  Q.E.D.
[Image licensed under the Creative Commons TravelingOtter, Moai at Rano Raraku - Easter Island (5956405378), CC BY 2.0]

Well. I think we can all agree that that's a pretty persuasive logical chain, can't we?

Okay, maybe not.

Let's start with the etymological funny business.  Unfortunately for L. M. Leteane, there is a critical rule in such discussions, which is, "Don't fuck with someone who is actually a linguist."  I have an M.A. in historical linguistics, so I have a fairly decent idea how language evolution works.  I also know you can't base language relationships on one or two words -- chance correspondences are all too common.  So just because roggo means "red" in Tswana (which I'm taking on faith because Leteane himself is from Botswana, and my expertise is not in African languages), and rouge is French for "red," doesn't mean a damn thing.  Rouge goes back to the Latin ruber, then to Ancient Greek ἐρυθρός, and finally to a reconstructed Proto-Indo-European root *reudh-.  Any resemblance to the Tswana word for "red" is coincidental.  And as for "Rapa Nui" meaning "black giant," that's ridiculous; Pascuanese is a Polynesian language, which is neither Indo-European nor Semitic, and has no underlying similarity to either French or Hebrew other than all of them being languages spoken by people somewhere.

And as far as "Easter Island" being named after Ishtar... well, let's just say it'll take me a while to recover from the headdesk I did when I read that.  Easter Island was so named by the Dutch explorer Jacob Roggeveen, because he first spotted it on Easter Sunday in 1722.  He called it Paasch-Eyland, Dutch for "Easter Island;" its official name is Isla de Pascua, which means the same thing in Spanish.  Neither one sounds anything like "Ishtar."  (In fact, "Easter" isn't a cognate to "Ishtar," despite the fact that some anti-Christian types circulate that claim every spring, apparently to prove that the Christians are a bunch of thinly-disguised pagans.)

And as for the rest of it... well, it sounds like the plot of a hyper-convoluted science fiction story to me.  Gods globe-trotting all over the world, bringing along slave labor, and having major wars, and conveniently leaving behind no hard evidence whatsoever.

The thing I find maddening about all of this is that Leteane mixes some facts (his information about Tswana) with speculation (he says that the name of the tin ore cassiterite comes from the Kassites, which my etymological dictionary says is "possible," but gives two other equally plausible hypotheses) with outright falsehood (that Polynesian, Bantu, and Indo-European languages share lots of common roots) with wild fantasy (all of the stuff about the gods).  And people believe it.  His story had, last I checked, been tweeted and Facebook-liked dozens of times, and amongst the comments I saw was, "Brilliant piece of research connecting all the history you don't learn about in school!  Thank you for drawing together the pieces of the puzzle!"

So, anyway.  I suppose I shouldn't get so annoyed by all of this.  Actually, on the spectrum of woo-woo beliefs, this one is pretty harmless.  No one ever blew himself up in a crowded market because he thought that the Olmecs came from Botswana.  My frustration is that there are seemingly so many people who lack the ability to think critically -- to look at the facts of an argument, and how the evidence is laid out, and to see if the conclusion is justified.  The problem, of course, is that learning the principles of scientific induction is hard work.  Much easier, apparently, to blather on about feathered serpents and goddesses who are seriously into tin.

*********************************************

I've been interested for a long while in creativity -- where it comes from, why different people choose different sorts of creative outlets, and where we find our inspiration.  Like a lot of people who are creative, I find my creative output -- and my confidence -- ebbs and flows.  I'll have periods where I'm writing every day and the ideas are coming hard and fast, and times when it seems like even opening up my work-in-progress is a depressing prospect.

Naturally, most of us would love to enhance the former and minimize the latter.  This is the topic of the wonderful book Think Like an Artist, by British author (and former director of the Tate Gallery) Will Gompertz.  He draws his examples mostly from the visual arts -- his main area of expertise -- but overtly states that the same principles of creativity apply equally well to musicians, writers, dancers, and all of the other kinds of creative humans out there. 

And he also makes a powerful point that all of us are creative humans, provided we can get out of our own way.  People who (for example) would love to be able to draw but say they can't do it, Gompertz claims, need not to change their goals but to change their approach.

It's an inspiring book, and one which I will certainly return to the next time I'm in one of those creative dry spells.  And I highly recommend it to all of you who aspire to express yourself creatively -- even if you feel like you don't know how.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]


Friday, August 27, 2021

Retrograde

One thing I haven't done yet on Fiction Friday is to post one of my pieces of fiction.  Seemed a bit ironic, that, so today I'm sharing "Retrograde," a strange short story about time, a chance meeting, and how we all watch the film unwind from our own perspectives.  It's not available anywhere else, so here you have it: an Exclusive Release.  Enjoy!

*************************************

Retrograde

I met Hannah about a month ago.  Of course, she wouldn’t phrase it that way, but maybe there’s no other way to say it and be understood, so let’s just leave it there; I met her about a month ago, the week before Christmas.

It had been an unusually cold December.  Even people who’d been born and raised in Ithaca were complaining.  There were about two feet of ice-crusted snow on the ground, and the sound of the plows growling by became so common that you stopped hearing them.  I was walking up Meadow Street, and as my boots pressed into the snow on the sidewalk, they made that squeaking noise that only happens when the temperature is getting close to zero.

I do this walk most nights, up from the bicycle shop where I work to the Ithaca Bakery to grab a bite to eat, then over two blocks on Cascadilla Street where I rent an upstairs room from an elderly couple.  It’s an okay life but you don’t need to tell me that I’m floating, that I’m slipping through life doing the bare minimum.  My mom tells me that most times I talk to her, but it’s not like I don’t see it myself.  I’ve got a decent brain.  I know I could do okay in college, but right now, I just don’t see a path.  I’d rather work at the bike shop, come home with my food and sit and read or watch TV or mess around online, than go to college and spend lots of money to spin my wheels, you know?

Anyway, I was doing my usual trek on that icy December night.  It was right around the solstice, so it’d been dark since around five o’clock, and by this time it was that kind of dark that seems to be an actual substance, not just an absence of light.  Even the streetlights didn’t help much, just illuminated the flakes of snow that were beginning to fall again.  I passed a guy I often see on that walk – tall middle-aged dude, wearing an old-fashioned felt hat with a feather, always going the other way, carrying a briefcase.  That night he had a thick scarf wrapped around his face, and I could barely hear his voice as he said, “What happened to the goddamn global warming?”
 
“No kidding.” 

“Winter storm warning tonight,” he said.  “Supposed to get another foot and a half by tomorrow noon.  Christ.”

I shook my head.  “Unbelievable,” and then we both went on our way.

[Image licensed under the Creative Commons Mehr News Agency, 23 January 2020, Arak (13), CC BY 4.0]

The Ithaca Bakery was empty except for me.  There never were many people in this late, but it wasn’t usually completely empty.  Probably the winter storm warning kept everyone with any common sense home.

I could hear a couple of folks in the kitchen, bumping around as they cleaned up.  There was only one person behind the counter.  I’d never seen her before, and I knew most of the staff by name.  She’d been looking down when I walked in, her hands holding onto the counter, but then she looked up at me.

She was one of those people who is hard to describe; pretty but not beautiful, medium-length blond hair held back by a clip, oval face, medium height.  Her only real standout feature was her eyes, which were a very pale blue.  An artist might describe them as a chilly blue, an icy blue, but that’s not right; there was no cold in them at all.  They had a fire in them.  I’ve read that the hottest fire, past red hot, and yellow, and white, is blue; and after seeing her eyes, I think I understand that.

And as soon as those eyes met mine, she started crying.

She looked down again, still clutching the counter, her whole body shaking.

“Jesus,” I said.  “What’s wrong?”

She shook her head, kept on crying, and I just stood there, feeling weird and uncomfortable, and glad there were no other people in the Bakery that night.

Finally she just looked up, those pale eyes still flooded with tears, and said, “Eli, I can’t believe it’s already that time.”

I stared at her for a moment, and then said, “Do I know you?”

“Not yet.  But you will.”  She drew a sleeve across her eyes, and attempted a smile.  She finally unclenched one of her hands from the counter edge, and reached it across for me to shake.  “I’m Hannah.” 

“Eli,” I said, even though she apparently already knew that somehow.

“What can I get you tonight?”  She was obviously trying for that cheerful and courteous sound restaurant staff always have, and mostly succeeded.

“Sun-dried tomato bagel, toasted, cream cheese and lox.”

She smiled a little bit, for real now, said, “The usual, then,” and turned away to get me my food.  I put a ten dollar bill on the counter, and pretty soon she came up, handed me my plate, gave me my change.

“Look,” I said, still feeling strange, “you want to talk for a while?”

She shrugged.  “No one’s here tonight, and the place closes soon anyway.  We won’t get many more people in this weather, and if we do, I can just get up and take care of them, right?”

“That’s fine.  We can talk for a little while.”

I went to a table, over in the corner by the window, and she followed me, sat down, and rested her chin in her hands, her elbows on the table.

I looked at her, trying to place where I knew her from, but still drew a blank.  I’ve got a good memory for faces, and I wouldn’t forget those eyes, I knew that.  I was certain I’d never seen this woman before.

“I know you don’t understand now, Eli,” she said.  “It’s so awful for you.  I’m sorry about the way I acted.  Inexcusable, really inexcusable.”

“Are you sure you know me?” I took a bite of my bagel.

She just smiled a little.  “Do you want me to explain?  It won’t make much sense now.  It will later.”  She paused.  “My name is Hannah, by the way.”

“Hannah,” I said.  “I know.  You already told me.  But explain?  Explain what?”

She looked out of the window, at the snow falling faster, hissing against the glass panes.

“I don’t see the world the way others do.”

That was kind of a vague start, I thought.  “None of us see the world the same way.  That doesn’t mean your point of view isn’t valid.”  I was trying to be helpful, but only ending up sounding like somebody who’s read too much pop psychology.

Her lips tightened, her face looking resolute.   “Okay. I guess I just need to say it straight out.”  She took a deep breath, exhaled slowly.  “What’s the past for you is the future for me,” she said, in a low, intense voice, and then just looked at me, her pale eyes searching mine.

My rational mind said, This chick is crazy, but something about her demeanor seemed so normal that I couldn’t just attribute her odd behavior to her being a nut.  “What’s that supposed to mean?” 

“When you say something is in the past,” she said, patiently, “it hasn’t happened for me yet.  What I remember is what you call the future.  What you call the past I don’t remember, because it hasn’t happened yet.  For me, at least.”

I stared at her, my mouth hanging open a little.  “That’s impossible.  The past is the past.  The future is the future.”

“Not for me.”

“Time passes the same way for everyone.”

She shook her head.  “It’s been this way all of my life.  All the few short weeks of my life.  Time runs backwards for me.”  She gestured at my plate, and smiled a little wryly.  “Can I have a bite of your bagel?  I’m starving.”

I picked up half of the bagel, handed it to her.  “Why did you ask, if for you it’d already happened?  For you, you’d already taken a bite, right?”

“Yes.  But I knew by what you said that it was going to happen, and if I hadn’t asked afterwards, you would have wondered why the hell this strange chick had taken a bite of your dinner without asking.  I learned this stuff the hard way.  I’m beginning to adapt.”

“So you asked to have some of my bagel because for you it had already happened?”

She shrugged.  “I guess from your perspective, that’s the only way you could make sense of it.”

“This doesn’t make any sense.  The clock only runs one way.  No one lives in a world where glasses unbreak, snow falls upward, balls roll uphill.  That’s scientifically impossible, right?”

“I can’t answer that.  All I can say is that we see the same things.  For me, the film runs backwards, that’s all.  Other than that, there’s no difference.  There’s nothing I can do to change the way things unfold, same as with you.”

“That’s why you were crying, when I came in.  Because of something that for you, had already happened?  What was it?”

She shook her head.  “I shouldn’t answer that.”

I thought for a moment.  “It’s me, isn’t it?  For me, I was just meeting you for the first time; for you, it was the last time you’d ever see me.”  I winced, and rubbed my eyes with the heel of my hand.  “Jesus, I’m starting to believe you.  But that’s it, right?”

She didn’t answer for a moment.  “The thing is, you know, you just start looking at things as inevitable.  Like you’re in some sort of film.  The actors seem to have freedom, they seem to have will, but in reality the whole thing is just scrolling by and what’s going to happen is only what’s already written in the script.  You could, if you wanted to, start at the end and run the film backwards.  Same stuff, different direction.  No real difference except for the arrow of time.”

“I guess I’d cry, too.”

The corners of her mouth turned up a little.  “It’s no problem, I can get you another bagel.”
 
Before I could ask her what she was talking about, there was a sudden crash as someone dropped something in the kitchen.  I jumped, and my hand jerked.  The plate with my dinner slid off the table and fell upside down on the floor.

I looked at it, mutely, then at her.  She shrugged and smiled.

“Yeah,” I finally said. “That’d be great.”

She stood up, one eyebrow raised quizzically, and went off to the kitchen.

My mind was spinning.  Was she crazy, or was what she was saying the literal, factual truth?  How could anyone perceive the world in reverse?  If what she was saying was true, someone should be told; it would blow away all of what was known about science.

But then, how could they test it?  As her life unrolled, she would forget more and more, because as our clocks moved forward, hers would be moving backward.  Only at the present moment did our lives touch – for an instant only – before spinning away along their inverted paths.

She returned with the bagel.

“Sun-dried tomato, cream cheese, and lox,” I said.  “You remembered that, at least.”

She just smiled at me, and sat down, then reached across the table, and took my hand.

Then I realized -- no, she didn’t remember.  I'd just told her.  All she'd done was get what I asked for.

Looking across at her, my heart gave a funny little gallop in my chest.  She knew it because it had already happened for her.  It was the past.  She was remembering, not predicting.  And I think that’s the moment when I was convinced that she was telling the truth.

“It’s been three weeks since it all started,” she said, still holding my hand.  “It’s nice to find someone to tell about all this.  You’re the first person I’ve told.”

“Three weeks?  Three weeks since what?”

“My life started three weeks ago.  I don’t really understand how, but there it is.”

“Started?  Started how?  What happened three weeks ago?”

She looked down, her eyes becoming unfocused for a moment, as she searched her… memory?  What else could you call it?  After a moment, she looked up.  “The first thing I remember is a shock.  Like an explosion.  Then I felt wind.  Before I knew what was happening, I was up on a bridge, near Cornell, over that really beautiful gorge, I forget its name.  It was snowing, just like today.  Cold.  I didn’t know where I was, all I knew was that my name was Hannah and I was cold.  And I began to walk, and finally came here, and talked to one of the managers, and he offered me a job.  They let me sleep on a cot in one of the offices in back.  Only till I can get a place, and it was really nice of them to let me.  I honestly don’t know why they agreed.  But three weeks – yes, that’s when it all started.”

“So that means you’ve only got three weeks to live.”

“I suppose that’s the way it would appear, from your perspective.”

“My perspective?” I shouted.   “My perspective is all I have!  You don’t mean to tell me that in three weeks you’re going to die, and there’s nothing you can do about it?”

Hannah shrugged.  “I don’t know any other way to explain it.  It really is all about perspective.”

I leaned back in my chair.  “So you’re telling me that from your point of view, you’re going to get younger and younger, and finally become a baby, and then you’ll disappear up into your mother’s uterus, and then you’ll just… cease to be?”

“It’s not so very much weirder than your life seems to me.  Where were you before you were born?  And what will happen to you when you die?”

Well, she got me there, and I didn’t respond for a moment.  “I don’t know,” I finally said.  “I’m not religious.  But even so, I don’t know how you can expect this to make sense to me.”

“Look, you don’t have to be upset on my behalf.  It is what it is.  Maybe we should just stop talking about all these matters of life and death, and the afterlife.  Or beforelife.  Or whatever.”

The snow was falling faster now, beginning to pile up on the older drifts, swirling in curtains against the streetlight.  “I’m not upset,” I said, and I was telling the truth.  I felt completely calm for some reason, despite having spent fifteen minutes in what was the most peculiar conversation I’d ever had.  I ate the last bit of my bagel, and looked into those eyes, those strange, luminous eyes.  “Look, I don’t know.  Do you want to come back to my place?  I know it’s weird to ask, but it might be better than your staying here, alone, and having to be left with… your memories.”
 
She smiled. “I’d like that.”

I held out my hand for her, and she stood.  “Let’s go,” I said.  “I just live a couple of blocks away.”


We didn’t talk any more about time and perspective – just talked about what we liked, talked about the weather.  We each had a beer and sat on the couch for a while, and then went to bed.  I offered her the couch, but she smiled and shook her head, saying that if the point was for her not to be lonely, the couch was no better than her cot back at the Bakery.  I didn’t argue.

We made love that night, and as I was drifting off to sleep, I wondered what that had been like for her – an explosion, merging into excitement, fading into anticipation, then subsiding into silence.  I hoped that it was good, however she had perceived it.


She stayed with me for three days.  On the morning of the fourth day, I awoke to find a note on the pillow next to me, and that she was gone.  It wasn’t really a surprise, but still, it made my stomach clench when I picked it up.  Time was spooling by, the clock was running; it never stopped, whatever direction it was going.  You couldn’t halt it either way.

The note read:
Eli… 
I know you won’t understand, but this can’t go on indefinitely.  It will make sense to you eventually, I hope.  I hardly know you, and as time passes for you, I will know you less and less, and finally forget you entirely.  It’s better this way. 
Hannah
I looked at the note for a while, then got up, showered, dressed, and headed up to the Bakery.

Hannah was behind the counter.  She looked up at me, and I was greeted by a smile.  I went up to her, stood silent for a moment.

“My name is Eli,” I said.  “I don’t want you to forget that.  Eli.  And for three days, you were important to me, Hannah.”

She smiled again, those odd eyes glittering.  “I won’t forget,” she said, and reached across and touched my hand.

“Don’t forget,” I said.  “Don’t ever forget me.”


And that was all.

I went in to the Bakery a couple of days after that, near closing time, taking my usual route after getting off from work at the bike shop.  Tom, the long-haired, multiply-pierced counterman, greeted me with a grin.

“Hey, Eli,” he said.  “The usual?”

“Yeah,” I said.  He started putting together my dinner.  “Hey, Tom.  What do you think about that girl who works here, Hannah?”

Tom half turned, my bagel in his hand.  He rolled his eyes.  “That chick is wack, and that’s my considered opinion.  Owner said she could live in the back room for a coupla weeks, till she finds a place.  But she’s a strange one.  Nice-looking, though.”

I nodded.  “Yeah.  Pretty strange.  You got that right.”


Then last week, in the Ithaca Journal, the following article appeared on the front page.
Local Woman Killed in Fall from Bridge 
Hannah van Meter, 24, was killed in what police are considering a probable suicide.  On the night of January 17, she fell from the bridge on Stewart Avenue into Fall Creek Gorge.  A witness, whose name has not been released by police, stated that she had been standing for some time, looking down into the gorge, and that he went up and attempted to speak to her.  She seemed disoriented, and would not leave the bridge even though the witness attempted to persuade her to do so.  She threatened to jump if he approached her more closely, he stated.  After five minutes, the witness went to a nearby house to get help, and was walking back up toward the bridge when van Meter jumped or fell over the bridge railing. 
She was the daughter of David and Helen van Meter of Chenango Forks.  She had lived in Ithaca for only a few weeks, and had been employed by the Ithaca Bakery since mid-December. 
Police are investigating.
I sat in my room, crying and reading the article over and over.  Sometimes you still cry even when you know how the story’s going to end.  But perhaps, if the story is read backwards, it will have a happier ending.
 
Or beginning.  Or whatever.

At least that’s what I am hoping for.

*********************************************

I've been interested for a long while in creativity -- where it comes from, why different people choose different sorts of creative outlets, and where we find our inspiration.  Like a lot of people who are creative, I find my creative output -- and my confidence -- ebbs and flows.  I'll have periods where I'm writing every day and the ideas are coming hard and fast, and times when it seems like even opening up my work-in-progress is a depressing prospect.

Naturally, most of us would love to enhance the former and minimize the latter.  This is the topic of the wonderful book Think Like an Artist, by British author (and former director of the Tate Gallery) Will Gompertz.  He draws his examples mostly from the visual arts -- his main area of expertise -- but overtly states that the same principles of creativity apply equally well to musicians, writers, dancers, and all of the other kinds of creative humans out there. 

And he also makes a powerful point that all of us are creative humans, provided we can get out of our own way.  People who (for example) would love to be able to draw but say they can't do it, Gompertz claims, need not to change their goals but to change their approach.

It's an inspiring book, and one which I will certainly return to the next time I'm in one of those creative dry spells.  And I highly recommend it to all of you who aspire to express yourself creatively -- even if you feel like you don't know how.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]


Thursday, August 26, 2021

The nasty bite of Poe's Law

I have a love-hate relationship with Poe's Law.

Poe's Law, you probably know, is a rule of thumb named after Nathan Poe, who observed in 2005 that without a blatant display of humor, a good parody of an extreme view is impossible to tell from the real thing.  Put more simply: the better a parody is, the harder it is to distinguish from the truth.

I love Poe's Law because the targets of parody and satire are often so richly deserving of it.  Consider one of the most fantastic parody sites out there -- The Onion -- which combines absolute hilarity with acid-tipped social and political commentary.  (One particularly trenchant example is that every time there is yet another mass shooting in the United States, The Onion has an article with the headline, "'No Way to Prevent This,' Says Only Nation Where This Regularly Happens.")

On the other hand, I hate Poe's Law because there is enough misinformation out there without waggish satirists adding to it.  The Law itself states that good satire will take people in; the point is to get people to say, "No, really?", at least for a moment.  But for some folks, that moment gets stretched out way too far, and you have people believing satire is the truth.

My favorite example of this -- once again from The Onion -- is the pearl-clutching woman who wrote an outraged letter to the editor of Reader's Digest after they did an interview with J. K. Rowling.  "How can you give this woman more publicity?" the letter-writer said.  "This is supposed to be a magazine that supports conservative morals and values.  J. K. Rowling is an avowed practitioner of black magic.  She has overseen the baptism of thousands of children into the Church of Satan.  There was a major exposé of Rowling's evil activities a couple of months ago in The Onion."

The editor of Reader's Digest, showing admirable restraint, printed the letter, responding only with, "The Onion is a satirical news source, not meant to be taken as fact."

The "hate" side of the ledger got another entry yesterday, when a frequent reader and contributor to Skeptophilia sent me a message about Tuesday's post, which was about a scientific study showing that people are more likely to follow absurd directives than reasonable ones.  The message said, "Um, Gord... I think that site is satire.  Check the 'About' section."

He then pointed out that the lead researcher, Fiona Hayes-Verhorsihs, has a ridiculous name.  Say it out loud.

Yup.  "Hay's for horses."  Funny thing, given my background in linguistics, that this bit of the joke went past me so fast it didn't even ruffle my hair.  I figured the last part of her name was some obscure surname, perhaps Dutch or Afrikaans by the look of it, and didn't give it any further thought.

Suffice it to say that the fellow who sent me the comment is right.  I got bitten in the ass by Poe's Law.  Not the first time this has happened, nor (I suspect) will it be the last.  I didn't really dig too hard into the antecedents of the story; if I had, I'd have realized my error pretty quickly.  The problem is, the conclusion of the faux study -- that people can be pretty irrational at times -- was something I've written about many times before, and I have no real doubt that the general point is true.  So when the study by Professor Hay's-For-Horses popped up, I didn't even question it.

Meaning that I not only fell for Poe's Law, I fell for confirmation bias.

Of course, I'm in good company.  Pravda and Xinhua have both been hoodwinked by hoax stories that sounded plausible.

But so has Fox News.  So maybe "good company" isn't the best way to phrase it.

Anyhow, once this post is up, I'll take the old one down.  I'd rather not add to the morass of wacky stuff online, and find out that someone else has mentioned the absurdity study -- and cited Skeptophilia as the source.  All of which has me rededicating myself to being careful about my own research, as should we all.  Check your sources, look for corroboration, see if you can find out the credentials of the people cited -- all before you post, like, or retweet a link.

And that goes double if you're the author of a blog devoted to rational thinking.

*********************************************


Wednesday, August 25, 2021

The honesty researcher

One of the things I pride myself on is honesty.

I'm not trying to say I'm some kind of paragon of virtue, but I do try to tell the truth in a direct fashion.  I hope it's counterbalanced by kindness -- that I don't broadcast a hurtful opinion and excuse it by saying "I'm just being honest" -- but if someone wants to know what I think, I'll tell 'em.

As the wonderful poet and teacher Taylor Mali put it, "I have a policy about honesty and ass-kicking.  Which is: if you ask for it, I have to let you have it."  (And if you haven't heard his wonderful piece "What Teachers Make," from which that quote was taken -- sit for three minutes right now and watch it.)


I think it's that commitment to the truth that first attracted me to science.  I was well aware from quite a young age that there was no reason to equate an idea making me happy and an idea being the truth.  It was as hard for me to give up magical thinking as the next guy -- I spent a good percentage of my teenage years noodling around with Tarot cards and Ouija boards and the like -- but eventually I had to admit to myself that it was all a bunch of nonsense.

In science, honesty is absolutely paramount.  It's about data and evidence, not about what you'd dearly love to be true.  As the eminent science fiction author Philip K. Dick put it, "Reality is that which, when you stop believing in it, doesn't go away."

Or perhaps I should put it, "it should be about data and evidence."  Scientists are human, and are subject to the same temptations the rest of us are -- but they damn well better be above-average at resisting them.  Because once you've let go of that touchstone, it not only calls into question your own veracity, it casts a harsh light on the scientific enterprise as a whole.

And to me, that's damn near unforgivable.  Especially given the anti-science attitude that is currently so prevalent in the United States.  We don't need anyone or anything giving more ammunition to the people who think the scientists are lying to us for their own malign purposes -- the people who, to quote the great Isaac Asimov, think "my ignorance is as good as your knowledge."

Which brings me to Dan Ariely.

Ariely is a psychological researcher at Duke University, and made a name for himself studying the issue of honesty.  I was really impressed with him and his research, which looked at how our awareness of the honor of truth-telling affects our behavior, and the role of group identification and tribalism in how much we're willing to bend our own personal morality.  I used to show his TED Talk, "Our Buggy Moral Code," to my Critical Thinking classes at the beginning of the unit on ethics; his conclusions seemed to be a fascinating lens on the whole issue of honesty and when we decide to abandon it.

Which is more than a little ironic, because the data Ariely used to support these conclusions appear to have been faked -- possibly by Ariely himself.

[Image licensed under the Creative Commons Yael Zur, for Tel Aviv University Alumni Organization, Dan Ariely January 2019, CC BY-SA 4.0]

Ariely has not admitted any wrongdoing, but has agreed to retract the seminal paper on the topic, which appeared in the prestigious journal Proceedings of the National Academy of Sciences back in 2012.  "I can see why it is tempting to think that I had something to do with creating the data in a fraudulent way," Ariely said, in a statement to BuzzFeed News.  "I can see why it would be tempting to jump to that conclusion, but I didn’t...  If I knew that the data was fraudulent, I would have never posted it."

His contention is that the insurance company that provided the data, The Hartford, might have given him fabricated (or at least error-filled) data, although what their motivation could be for doing so is uncertain at best.  There's also the problem that the discrepancies in the 2012 paper led analysts to sift through his other publications, where they found a troubling pattern of sloppy data-handling, failures in replicability of results, misleading claims about sources, and more possible outright falsification.  (Check out the link I posted above for a detailed overview of the issues with Ariely's work.)

Seems like the one common thread running through all of these allegations is Ariely.

It can be very difficult to prove scientific fraud.  If a researcher deliberately fabricated data to support his/her claims, how can you prove that it was deliberate, and not either (1) an honest mistake, or (2) simply bad experimental design (which isn't anything to brag about, but is still in a separate class of sins from outright lying)?  Every once in a while, an accused scientist will actually admit it -- one example that jumps to mind is Korean stem-cell researcher Hwang Woo-Suk, whose spectacular fall from grace reads like a Shakespearean tragedy -- but like many politicians who are accused of malfeasance, a lot of times the accused scientist just decides to double down, deny everything, and soldier on, figuring that the storm will eventually blow over.

And, sadly, it usually does.  Even in Hwang's case -- not only did he admit fraud, he was fired by Seoul National University and tried and found guilty of embezzlement -- he's back doing stem-cell research, and since his conviction has published a number of papers, including ones indexed in PubMed.

I don't know what's going to come of Ariely's case.  Much is being made about the fact that a researcher in honesty and morality has been accused of being dishonest and immoral.  Ironic as this is, the larger problem is that this sort of thing scuffs the reputation of the scientific endeavor as a whole.  The specific results of Ariely's research aren't that important; what is much more critical is that this sort of thing makes laypeople cast a wry eye on the entire enterprise.

And that, to me, is absolutely inexcusable.

*********************************************



Monday, August 23, 2021

Studies show the author of Skeptophilia is brilliant!

I would love it if some psychologist who studies the effect of media on people's beliefs would do a specific experiment, and then let me know the results.

The experiment I'd like done is to have a series of fake news articles that test subjects would read.  There would be two different kinds of articles -- ones in which the headline basically summarized what the text of the article said (as it should be), and ones in which the headline made a statement that was at odds with what the text of the article actually claimed.  Then, subjects would answer some questions, to see which had the greater impact on their memory -- the contents of the headline, or the contents of the article text.  

I strongly suspect that when the text of an article and the headline conflict, it's the headline that will have the biggest effect on what readers remember.  It's the first thing they see; it's in bold print; and it gives a catchy, terse summary of what the story supposedly is about.  All of the details in the text, I think, are much more likely to be lost, misremembered, or ignored outright.  

An interesting twist would be to ask the people who got the second set of articles -- the ones where the headline contradicted the content -- whether they even noticed.  My guess is a lot of people wouldn't.  I can't tell you how many times I've had someone post a comment or response to my blog that said, "Yes, but have you considered _____?", and it turns out what they wanted me to consider is something that I explicitly addressed in paragraph two.

This comes up because of an article sent to me by a friend, which was entitled "New Studies: ‘Conspiracy Theorists’ Sane, While Government Dupes Are Crazy and Hostile."  The story, which appeared in 21st Century Wire, is making a pretty bold claim -- that what the conspiracy theorists have been claiming all along is correct.  All of us skeptics, who have scoffed at Stop the Steal and Pizzagate and chemtrails and Illuminati and mind control and RFID chip implants in the COVID vaccine and evil Satanic Masonic rituals, are not only wrong, we are the crazy ones.


Naturally, I was pretty interested to read about this.

The first paragraph basically mirrored the headline, stating that "those labeled 'conspiracy theorists' appear to be saner than those who accept the official version of contested events."  Then, we hear about the first study:
The most recent study was published on July 8th by psychologists Michael J. Wood and Karen M. Douglas of the University of Kent (UK). Entitled “What about Building 7?  A social psychological study of online discussion of 9/11 conspiracy theories,” the study compared “conspiracist” (pro-conspiracy theory) and “conventionalist” (anti-conspiracy) comments at news websites.

The authors were surprised to discover that it is now more conventional to leave so-called conspiracist comments than conventionalist ones: “Of the 2174 comments collected, 1459 were coded as conspiracist and 715 as conventionalist.”  In other words, among people who comment on news articles, those who disbelieve government accounts of such events as 9/11 and the JFK assassination outnumber believers by more than two to one.  That means it is the pro-conspiracy commenters who are expressing what is now the conventional wisdom, while the anti-conspiracy commenters are becoming a small, beleaguered minority.
By this time, I was already bouncing up and down in my chair, yelling at my computer, "Just hang on a moment!  That doesn't support what the headline said at all!"  So we have double the number of conspiracist comments as conventional ones posted on news websites -- we're supposed to conclude from this that the conspiracists are more likely to be right?  Or even sane?  All it means is that conspiracist comments are common, which is hardly the same thing.

I don't think we can even conclude from this that the conspiracists themselves outnumber the "conventionalists."  For that, we'd need to make the further assumption that people of all beliefs are equally likely to post, which seems like a leap, considering what a rabid lot some of the conspiracy theorists seem to be.  Myself, I have a hard enough time bringing myself to read the comments section on controversial articles, much less post my own comments.

Then, we hear about the second "study":
(T)hese findings are amplified in the new book Conspiracy Theory in America by political scientist Lance deHaven-Smith, published earlier this year by the University of Texas Press.  Professor deHaven-Smith explains why people don’t like being called “conspiracy theorists”:  The term was invented and put into wide circulation by the CIA to smear and defame people questioning the JFK assassination!  “The CIA’s campaign to popularize the term ‘conspiracy theory’ and make conspiracy belief a target of ridicule and hostility must be credited, unfortunately, with being one of the most successful propaganda initiatives of all time.”

In other words, people who use the terms “conspiracy theory” and “conspiracy theorist” as an insult are doing so as the result of a well-documented, undisputed, historically-real conspiracy by the CIA to cover up the JFK assassination.  That campaign, by the way, was completely illegal, and the CIA officers involved were criminals; the CIA is barred from all domestic activities, yet routinely breaks the law to conduct domestic operations ranging from propaganda to assassinations.
Ah.  So because (1) conspiracy theorists don't like being called conspiracy theorists, and (2) the CIA engaged in some nasty business surrounding the JFK assassination, the conspiracy theorists are actually sane when they babble about chemtrails and the Reptilians.  Got it.

Then, we have an alleged conclusion from psychologist Laurie Manwell, of the University of Guelph, summarized as follows:
Psychologist Laurie Manwell of the University of Guelph agrees that the CIA-designed “conspiracy theory” label impedes cognitive function.  She points out, in an article published in American Behavioral Scientist (2010), that anti-conspiracy people are unable to think clearly about such apparent state crimes against democracy as 9/11 due to their inability to process information that conflicts with pre-existing belief.
So, I did a little digging on Manwell -- and as you might already be anticipating, the author of the article in 21st Century Wire is misrepresenting her, too.  Turns out Manwell thinks that laypeople of all stripes tend to ignore factual information, and pay more attention to claims that support what they already believed.  Take a look at what she wrote in a June 2007 paper, "Faulty Towers of Belief:"
Most laypersons would agree with research showing that attitudes influence a person's evaluation of a subject -- whether it be an idea or another person -- and that the stronger the attitude, the greater influence it will have in evoking a positive or a negative evaluation.  However, the types of reasoning processes that laypersons believe they use when evaluating information are not necessarily the processes that they actually use.  Research repeatedly shows that what people say they are doing, and what they are actually doing, are often two very different things...  Thus, in evaluating the events of 9/11, we need to keep in mind that there are many factors that influence our judgments, including previously formed attitudes and beliefs, many of which are resistant to change, and some of which we may not even be aware of at the time of evaluation.
So, the bottom line is that Manwell's contention is that we're all prone to confirmation bias, which is hardly the same thing as claiming that the conspiracy theorists are clear-eyed exponents of the truth, and the skeptics are dim-witted obstructionists.  And as far as who is entering the argument with more "previously formed attitudes and beliefs," might I just ask you to consider that question from the standpoint of contrasting David Icke and Alex Jones with, say, Sharon Hill, Rebecca Watson, and Simon Singh?

Oh, but don't let that stand in the way of your drawing the conclusion you'd already settled on.  Here's the last line of the article in 21st Century Wire:
No wonder the anti-conspiracy people are sounding more and more like a bunch of hostile, paranoid cranks.
Have you considered the possibility that we're cranky and hostile because we're getting really fucking tired of arguing with a bunch of people who appear to have spent way too much time playing on a pogo stick in a room with low ceilings?

Anyhow, there you have it.  Take some actual research, claim it supports the contentions you already had, then turn around and accuse your opponents of doing what you just did.  Craft a nice, inflammatory headline that basically says, "You Should Believe Me Because the People Who Disagree With Me Are Big Fat Liars," and call it good.

Chances are, the most your readers are going to remember of what you wrote is the headline anyway, which gives me an idea.  Maybe I should start giving my posts headlines like "New Studies Show That You'll Have Good Luck If You Send Gordon Money."  It's worth a try, because attempting to become independently wealthy as a writer seems to be a losing proposition any other way.

*********************************************

I've been interested for a long while in creativity -- where it comes from, why different people choose different sorts of creative outlets, and where we find our inspiration.  Like a lot of people who are creative, I find my creative output -- and my confidence -- ebbs and flows.  I'll have periods where I'm writing every day and the ideas are coming hard and fast, and times when it seems like even opening up my work-in-progress is a depressing prospect.

Naturally, most of us would love to enhance the former and minimize the latter.  This is the topic of the wonderful book Think Like an Artist, by British author (and former director of the Tate Gallery) Will Gompertz.  He draws his examples mostly from the visual arts -- his main area of expertise -- but explicitly states that the same principles of creativity apply equally well to musicians, writers, dancers, and all of the other kinds of creative humans out there.

And he also makes a powerful point that all of us are creative humans, provided we can get out of our own way.  People who (for example) would love to be able to draw but say they can't do it, Gompertz claims, need not to change their goals but to change their approach.

It's an inspiring book, and one which I will certainly return to the next time I'm in one of those creative dry spells.  And I highly recommend it to all of you who aspire to express yourself creatively -- even if you feel like you don't know how.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]


Saturday, August 21, 2021

The evolution of Little Red Riding Hood

Every once in a while, I'll run across a piece of scientific research that is so creative and clever that it just warms my heart, and I felt that way yesterday when I stumbled onto a link to an article in PLoS ONE called "The Phylogeny of Little Red Riding Hood," by Jamshid Tehrani of Durham University.

The reason I was delighted by Tehrani's paper is that it combines two subjects I love -- evolutionary biology on the one hand, and mythology and folklore on the other.  The gist of what Tehrani did is to use a technique most commonly employed to assemble species into "star diagrams" -- cladistic bootstrap analysis -- to analyze worldwide versions of the "Little Red Riding Hood" story, to see to what degree a version in (for example) Senegal is related to one in Germany.

Cladistic bootstrap analysis generates something called a "star diagram" -- not, generally, a pedigree or family tree, because we don't know the exact identity of the common ancestor of all the members of the tree; all we can tell is how closely related the current individuals are.  Think, for example, of what it would look like if you assembled the living members of your family group this way -- you'd see clusters of close relatives linked together (you, your siblings, and your first cousins, for example) -- and further away would be other clusters, made up of more distant relatives grouped with their near family members.

So Tehrani did this with the "Little Red Riding Hood" story, by looking at the similarities and differences, from subtle to major, between the way the tale is told in different locations.  Apparently there are versions of it all over the world -- not only the Grimm Brothers' Fairy Tales variety (the one I know best), but versions from Africa, the Middle East, India, China, Korea, and Japan.  Oral transmission of stories is much like biological evolution; there are mutations (people change the story by misremembering it, dropping some pieces, embellishing it, and so on) and there is selection (the best versions, told by the best storytellers, are more likely to be passed on).  And thus the whole thing unfolds like an evolutionary lineage.
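To make the idea concrete, here's a toy sketch in Python -- emphatically not Tehrani's actual method, which used formal cladistic software and bootstrap resampling over dozens of coded plot features.  The idea is the same in miniature: encode each tale variant as a vector of presence/absence plot features, measure how much any two variants disagree, and repeatedly merge the most similar groups.  Every feature value below is invented purely for illustration.

```python
# Toy illustration of clustering folk-tale variants by shared plot features.
# Hypothetical feature vectors (1 = feature present in that variant); the
# features and values here are made up, not taken from Tehrani's data.
variants = {
    "RH (Europe)":    [1, 0, 1, 0, 1],
    "GM (France)":    [1, 0, 1, 0, 1],
    "WK (Africa)":    [1, 0, 0, 1, 1],
    "TG (East Asia)": [0, 1, 1, 0, 1],
}

def distance(a, b):
    """Fraction of features on which two variants disagree (Hamming distance)."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

# Simple single-linkage agglomerative clustering: repeatedly merge the two
# closest clusters until only one remains, recording each merge as we go.
clusters = [[name] for name in variants]
merges = []
while len(clusters) > 1:
    best = None
    for i in range(len(clusters)):
        for j in range(i + 1, len(clusters)):
            # Single linkage: distance between clusters is the closest pair.
            d = min(distance(variants[a], variants[b])
                    for a in clusters[i] for b in clusters[j])
            if best is None or d < best[0]:
                best = (d, i, j)
    d, i, j = best
    merges.append((clusters[i], clusters[j], d))
    clusters = ([c for k, c in enumerate(clusters) if k not in (i, j)]
                + [clusters[i] + clusters[j]])

for left, right, d in merges:
    print(f"merge {left} + {right} at distance {d:.2f}")
```

Identical variants merge first (at distance zero), and the merge order traces out the branching structure -- the same logic, scaled up and statistically validated, that puts species (or tale variants) on a star diagram.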

In Tehrani's analysis, he found three big branches -- the African branch (where the story is usually called "The Wolf and the Kids"), the East Asian branch ("Tiger Grandmother"), and the European/Middle Eastern branch ("Little Red Riding Hood," "Catterinella," and "The Story of Grandmother").  (For the main differences between the branches, which are fascinating but too long to quote here in full, check out the link to Tehrani's paper.)

Put all together, Tehrani came up with the following cladogram:




WK = "The Wolf and the Kids," TG = "Tiger Grandmother," Catt = "Catterinella," GM = "The Story of Grandmother," and RH = "Little Red Riding Hood"; the others are less common variants that Tehrani was able to place on his star diagram.

The whole thing just makes me very, very happy, and leaves me smiling with my big, sharp, wolflike teeth.

Pure research has been criticized by some as being pointless, and this is a stance that I absolutely abhor.  There is a completely practical reason to support, fund, and otherwise encourage pure research -- and that is, we have no idea yet what application some technique or discovery might have in the future.  A great deal of highly useful, human-centered science has been uncovered by scientists playing around in their labs with no other immediate goal than to study some small bit of the universe.  Further, the mere application of raw creativity to a problem -- using the tools of cladistics, say, to analyze a folk tale -- can act as an impetus to other minds, elsewhere, encouraging them to approach the problems we face in novel ways.

But I think it's more than that.  The fundamental truth here is that the human mind needs to be exercised.  The "what good is it?" attitude is not only anti-science, it is anti-intellectual.  It devalues inquiry, curiosity, and creativity.  It asks the question "how does this benefit humanity?" in such a way as to imply that the sheer joy of comprehending deeply the world around us is not a benefit in and of itself.

It may be that Tehrani's jewel of a paper will have no lasting impact on humanity as a whole.  I'm perfectly okay with that, and I suspect Tehrani would be, as well.  We need to make our brains buckle down to the "important stuff," yes; but we also need to let them out to play sometimes, a lesson that the men and women currently overseeing our educational system need to learn.  In a quote that seems unusually apt, considering the subject of Tehrani's research, Albert Einstein said: "I am enough of an artist to draw freely upon my imagination.  Imagination is more important than knowledge.  Knowledge is limited.  Imagination encircles the world." 

************************************

I was an undergraduate when the original Cosmos, with Carl Sagan, was launched, and being a physics major and an astronomy buff, I was absolutely transfixed.  My co-nerd buddies and I looked forward to each new episode and eagerly discussed it the following day between classes.  And one of the most famous lines from the show -- ask any Sagan devotee -- is, "If you want to make an apple pie from scratch, first you must invent the universe."

Sagan used this quip as a launching point into discussing the makeup of the universe on the atomic level, and where those atoms had come from -- some primordial, dating all the way back to the Big Bang (hydrogen and helium), and the rest formed in the interiors of stars.  (This gave rise to two of his other famous quotes: "We are made of star-stuff," and "We are a way for the universe to know itself.")

Since Sagan's tragic death in 1996 at the age of 62 from a rare blood cancer, astrophysics has continued to extend what we know about where everything comes from.  And now, experimental physicist Harry Cliff has put together that knowledge in a package accessible to the non-scientist, and titled it How to Make an Apple Pie from Scratch: In Search of the Recipe for our Universe, From the Origin of Atoms to the Big Bang.  It's a brilliant exposition of our latest understanding of the stuff that makes up apple pies, you, me, the planet, and the stars.  If you want to know where the atoms that form the universe originated, or just want to have your mind blown, this is the book for you.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]



Friday, August 20, 2021

Content warnings

Last week's Fiction Friday post -- about how (or if) we should continue to read writers whose work gives tacit acceptance to such repugnant views as racism or homophobia -- resulted in a few interesting responses and questions. 

The first one had to do with how I as a writer approach other sorts of sensitive topics, especially sexuality and violence.  It immediately made me wonder why, here in the United States, those are so often lumped together; we talk about "sex and violence" in one breath, especially with regard to movie content.  I have no idea why something that can be an expression of pleasure and loving intimacy is put in the same category as harming someone, but that's just one of a gazillion things I don't understand about my own culture.

But accepting for now that they're frequently thrown into the same "this is taboo" category... in my own work, I've written both sex scenes and violent scenes.  To me, both of these in fiction ramp up the emotional intensity, and I have no hesitation including them if it seems appropriate for the plot and characters.  However, I've also seen way too many examples of gratuitous content, where such scenes are simply pasted in to titillate the reader/watcher, and that to me is no more excusable than any other action that leaves you wondering what the point was.

I'm reminded of how some of my students responded to seeing The Matrix Reloaded.  If you haven't watched this movie, there's a scene where Neo and Trinity are desperately horny and looking around for somewhere, anywhere, that they can get each other's clothes off.  They finally succeed, but other than giving us a chance to see Keanu Reeves and Carrie-Anne Moss buck naked, it really did zilch for the plot.  And you'd think a bunch of teenage guys would have thought that was awesome, but one and all they branded it as a "stupid scene."

As far as gratuitous violence goes, consider the amount of gore in Kill Bill as compared to The Usual Suspects.  I've never taken a body count of either movie; suffice it to say it's high in both.  But the amount of blood flying around doesn't even begin to compare.  The Usual Suspects, for all of its death and destruction, is a subtle movie, and leaves far more to the imagination than it shows you.  Kill Bill... isn't.

In my own work, I do sometimes include explicit sexuality or violence, but I hope none of it is unnecessary.  There can be many reasons for including such content.  The sex scene in Sephirot is between the main character and a woman he will soon desperately regret getting friendly with; in Kári the Lucky, it's sweet and sad, between lovers who are headed for inevitable tragedy; in Whistling in the Dark, it's between two characters who have found love and healing in each other after suffering terrible emotional damage.  The same goes for violent scenes; in Gears, which might be the most violent book I've written overall, one character gets her arm broken and is choked nearly to death, another is killed by being thrown against a wall, a third is shot in the middle of the chest, a fourth is crushed by a (psychically-generated) landslide, and yet another is murdered by a deliberately-loosed piece of falling masonry.  Even so, the violence isn't the point of the story.  If anything, the point of Gears is that goodness and courage and steadfastness will always win over greed and deception and ruthlessness.  The violence is there not only to advance the plot, but to set in stark relief the fact that choosing to be brave and moral isn't without risks -- but it's still what we should all aspire to.

Another question generated by last Friday's post had to do with "content warnings" or "trigger warnings."  Should they be present on a book's back cover?  A related question -- are there topics that are over the line for me, that are enough of an emotional trigger for me that I can't write them?

I've never included a content warning for my own work, although I did one time mention to someone I thought might be sensitive to it that Sephirot has a fairly explicit sex scene (as it turned out, the reader in question had no problem with it).  In my work, pretty much What You See Is What You Get; the back cover blurb will give you a pretty good idea of the content of the story, and readers can make the decision whether or not to read a particular book based on that, without needing a specific content warning.  I mostly write speculative/paranormal fiction, so you can expect lots of spooky atmosphere, but (I hope) nothing that really offends.  

However, since we're talking about the capacity for offending readers, it must be mentioned that some of my characters have the tendency to swear a lot.  This is partly because I swear a lot.  I try to make it appropriate for the scene and character, but Be Thou Forewarned.

Be Thou Even More Forewarned if we ever sit down and have a beer together.

As an amusing aside -- I recall being at a book signing event, and a rather prim-looking woman coming up to me and saying she'd really enjoyed Lock & Key, but "the character of the Librarian sure does use the f-bomb a lot."

I responded, completely deadpan, "I know!  I tried talking to him about it, but he told me to fuck off."

Well, at least I thought it was funny.

In all seriousness, the problem is that different people have different sensitive points.  I gave up on the book The Third Eye (by T. Lobsang Rampa), and walked out of the movie Brazil, because of torture scenes; despite my fascination with Scottish history, I refuse to watch Braveheart because I know damn good and well what happens to William Wallace at the end.  However, I know people who had no problem with any of those -- the scenes in question might have been unpleasant, but not enough to cause them serious upset.

In my own work, there are three kinds of scenes that I can't stomach writing: rape, pedophilia, and animal abuse.  I just can't do it.  As far as the last goes, I found out from another reader that I'm not the only one who can't deal with reading about harm to animals, even in a fictional setting.  In Kill Switch, the main character's dog, Baxter, is his constant companion.  I was stopped on the street by someone in my village who told me he was reading Kill Switch, and so far was enjoying it -- but then a frown crossed his face, and he said, "I know people are gonna die.  I'm okay with that.  It's a thriller, after all."  He brought his face near mine, and said in an intense voice, "But if you kill Baxter, I will never speak to you again."

The scene that for me danced the closest to the edge of that line is, once again, in my novel Sephirot (yeah, it's a pretty emotionally-fraught story, in case you hadn't already figured that out).  A character is the recipient of a brutal bare-back whipping -- it's absolutely necessary for the plot, but it was right at the boundary of "this is too awful for me to write about."

I guess everyone has their limits -- and we as writers need to be cognizant of that.

Anyhow, those are a few responses to the questions and comments generated by last Friday's post.  I love hearing what people think, and what thoughts my posts bring up for readers, so keep those cards and letters comin'.  As for me, I need to get to my work-in-progress, and see what diabolical plot twists I can think of for this novel.  As Stephen King put it, "In a good story, the author gets the readers to love the characters -- then releases the monsters."

So now I'm off to give the monsters some exercise.
