Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Saturday, August 23, 2025

Encounters with the imaginary

Yesterday I had an interesting conversation with a dear friend of mine, the wonderful author K. D. McCrite.  (Do yourself a favor and check out her books -- she's written in several different genres, and the one thing that unites them all is that they're fantastic.)  It had to do with how we authors come up with characters -- and how often it feels like we're not inventing them, but discovering them, gradually getting to know some actual person we only recently met.  The result is that they can sometimes seem more real than the real people we encounter every day.

"In my early days of writing, my lead male character was a handsome but rather reclusive country-boy detective," K. D. told me.  "The kind who doesn't realize how good he looks in his jeans.  Anyway, whilst in the middle of bringing this book to life, I saw him in the store looking at shirts.  I was startled, seeing him so unexpectedly that way.  So, like any good delusional person would do, I walked toward him and started to ask, 'Hey, Cody.  What are you doing here?'  Thank God, I came to myself, woke up, or whatever, before I reached him and embarrassed myself into the next realm."

I've never had the experience of meeting someone who was strikingly similar to one of my characters, but I've certainly had them take the keyboard right out of my hands and write themselves a completely different part.  The two strangest examples of this both occurred in my Arc of the Oracles trilogy.  In the first book, In the Midst of Lions, the character of Mary Hansard literally appeared out of thin air -- the main characters meet her while fleeing for their lives as law and order collapses around them, and she cheerfully tells them, "Well, hello!  I've been waiting for all of you!"

I had to go back and write an entire (chronologically earlier) section of the book to explain who the hell she was and how she'd known they were going to be there, because I honestly hadn't known she was even in the story.

In the third book, The Chains of Orion, the character of Marig Kastella was initially created to be the cautious, hesitant boyfriend of the cheerful, bold, and swashbuckling main character, the astronaut Kallman Dorn.  Then, halfway through, the story took a sharp left-hand turn when Marig decided to become the pivot point of the whole plot -- and ended up becoming one of my favorite characters I've ever... created?  Discovered?  Met?  I honestly don't know what word to use.

That feeling of being the recorder of real people and events, not the designer of fictional ones, can be awfully powerful.

"Another time," K. D. told me, "we had taken a road trip to North Carolina so I could do some research for a huge historical family saga I was writing.  (I was so immersed in the creation of that book that my then-husband was actually jealous of the main character -- I kid you not!)  As we went through Winston-Salem, we drove past a huge cemetery.  I said, 'Oh, let's stop there.  Maybe that's where the Raven boys are buried and I can find their graves.'  And then I remembered.. the Raven boys weren't buried there.  They weren't buried anywhere.  Good grief."

Turns out we're not alone in this.  A 2020 study by researchers at Durham University, published in the journal Consciousness and Cognition and covered in The Guardian, surveyed authors at the Edinburgh International Book Festival in 2014 and 2018.  The researchers asked a set of curious questions:
  1. How do you experience your characters?
  2. Do you ever hear your characters’ voices?
  3. Do you have visual or other sensory experiences of your characters, or sense their presence?
  4. Can you enter into a dialogue with your characters?
  5. Do you feel that your characters always do what you tell them to do, or do they act of their own accord?
  6. How does the way you experience your characters’ voices feed into your writing practice?  Please tell us about this process.
  7. Once a piece of writing or performance is finished, what happens to your characters’ voices?
  8. If there are any aspects of your experience of your characters’ voices or your characters more broadly that you would like to elaborate on, please do so here.
  9. In contexts other than writing, do you ever have the experience of hearing voices when there is no one around?  If so, please describe these experiences.  How do these experiences differ from the experience of hearing the voice of a character?
Question #9 was obviously thrown in there to identify test subjects who were prone to auditory hallucinations anyway.  But even after you account for those folks, a remarkable percentage of authors -- 63% -- said they hear their characters' voices, and 56% reported visual or other sensory experiences of their characters.  Fully 62% reported at least some sense that their characters had agency -- that they could act of their own accord, independent of what the author intended.

You might be expecting me, being the perennially dubious type, to scoff at this.  But all I can say is -- whatever is going on here -- this has happened to me.

[Image licensed under the Creative Commons Martin Hricko, Ghosts (16821435), CC BY 3.0]

Here are some examples that came out of the study, and that line up with exactly the sort of thing both K. D. and I have experienced:
  • I have a very vivid, visual picture of them in my head.  I see them in my imagination as if they were on film – I do not see through their eyes, but rather look at them and observe everything they do and say.
  • Sometimes, I just get the feeling that they are standing right behind me when I write.  Of course, I turn and no one is there.
  • They [the characters' voices] do not belong to me.  They belong to the characters.  They are totally different, in the same way that talking to someone is different from being on one’s own.
  • I tend to celebrate the conversations as and when they happen.  To my delight, my characters don’t agree with me, sometimes demand that I change things in the story arc of whatever I’m writing.
  • They do their own thing!  I am often astonished by what takes place and it can often be as if I am watching scenes take place and hear their speech despite the fact I am creating it.
"The writers we surveyed definitely weren’t all describing the same experience," said study lead author John Foxwell, "and one way we might make sense of that is to think about how writing relates to inner speech...  Whether or not we’re always aware of it, most of us are trying to anticipate what other people are going to say and do in everyday interactions.  For some of these writers, it might be the case that after a while their characters start to feel independent because the writers developed the same kinds of personality ‘models’ as they’d develop for real people, and these were generating the same kinds of predictions."

Which is kind of fascinating.  When I've done book signings, the single most common question revolves around where my characters and plots come from.  I try to give some kind of semi-cogent response, but the truth is, the most accurate answer is "beats the hell out of me."  They seem to pop into my head completely unannounced, sometimes with such vividness that I have to write the story to discover why they're important.  I often joke that I keep writing because I want to find out how the story ends, and there's a sense in which this is exactly how it seems.

I'm endlessly fascinated with the origins of creativity, and how creatives of all types are driven to their chosen medium to express ideas, images, and feelings they can't explain, and which often seem to come from outside.  Whatever my own experience, I'm still a skeptic, and I am about as certain as I can be that this is only a very convincing illusion, that the imagery and personalities and plots are bubbling up from some part of me that is beneath my conscious awareness.

But the sense that it isn't, that these characters have an independent existence, is really powerful.  So if (as I'm nearly certain) it is an illusion, it's a remarkably intense and persistent one, and seems to be close to ubiquitous in writers of fiction.

And I swear, I didn't have any idea beforehand about Mary Hansard's backstory and what Marig Kastella would ultimately become.  Wherever that information came from, I can assure you that I was as shocked as (I hope) my readers are to find it all out.

****************************************


Monday, May 19, 2025

The loss of memory

British science historian James Burke has a way of packing a lot of meaning into a small space.

I still recall the first time I watched his amazing series The Day the Universe Changed, in which he looked at moments in history that radically altered the direction of human progress.  The final installment, titled "Worlds Without End," had several jaw-hanging-open scenes, but one that stuck with me was near the beginning, where he's recapping some of the inventions that had led to our current scientific outlook and high-tech world.  "In the fifteenth century," Burke said, "the invention of the printing press by Johannes Gutenberg took our memories away."

As someone who has always loved the written word, I had honestly never considered that writing -- and, even more, mass printing -- had a downside: the fact that we no longer have to commit information to memory, but can rely on what amount to external memory storage devices.  Burke, of course, is hardly the first person to make this observation.  Back in around 370 B.C.E., Socrates (as recorded by his disciple Plato in the dialogue Phaedrus) comments that the invention of writing is as much a curse as a blessing, a viewpoint he frames as a discussion between the Egyptian gods Thamus and Thoth, the latter of whom is credited with the creation of Egyptian hieroglyphics:

"This invention, O king," said Thoth, "will make the Egyptians wiser and will improve their memories; for it is an elixir of memory and wisdom that I have discovered."  But Thamus replied, “Most ingenious Thoth, one man has the ability to beget arts, but the ability to judge of their usefulness or harmfulness to their users belongs to another; and now you, who are the father of letters, have been led by your affection to ascribe to them a power the opposite of that which they really possess.

"For this invention will produce forgetfulness in the minds of those who learn to use it, because they will not practice their memory.  Their trust in writing, produced by external characters which are no part of themselves, will discourage the use of their own memory within them.  You have invented an elixir not of memory, but of reminding; and you offer your pupils the appearance of wisdom, not true wisdom, for they will read many things without instruction and will therefore seem to know many things, when they are for the most part ignorant and hard to get along with, since they are not wise, but only appear wise."

Socrates also points out that once written, a text is open to anyone's interpretation; it can't say, "Hey, wait, that's not what I meant:"

I cannot help feeling, Phaedrus, that writing is unfortunately like painting; for the creations of the painter have the attitude of life, and yet if you ask them a question they preserve a solemn silence.  And the same may be said of speeches.  You would imagine that they had intelligence, but if you want to know anything and put a question to one of them, the speaker always gives one unvarying answer.  And when they have been once written down they are tumbled about anywhere among those who may or may not understand them, and know not to whom they should reply, to whom not: and, if they are maltreated or abused, they have no parent to protect them; and they cannot protect or defend themselves.

And certainly he has a point.  A writer can write down nonsense just as easily as universal truth, and (as I've found out with my own writing!) two people reading the same passage can come to completely different conclusions about what it means.  Even the most careful and skillful writing can't avoid all ambiguity.

I'm not clear that we're on any surer footing with the oral tradition, though.  Not only do we have the inevitable "mutations" in lineages passed down orally (a phenomenon that was used to brilliant effect by anthropologist Jamshid Tehrani in his delightful research into the phylogeny of "Little Red Riding Hood"), there's the problem that suppression of cultures through invasion, colonization, or conquest often wipes out (or at least drastically alters) the cultural memory.

How much of our history, mythology, and knowledge has been erased simply because the last person who had the information died without ever passing it on?

[Image licensed under the Creative Commons Planemad, Chart of world writing systems, CC BY-SA 3.0]

Genevan philosopher Jean-Jacques Rousseau seems to side with Socrates, though.  In his Essay on the Origin of Languages, he writes:

Writing, which would seem to crystallize language, is precisely what alters it.  It changes not the words but the spirit, substituting exactitude for expressiveness.  Feelings are expressed in speaking, ideas in writing.  In writing, one is forced to use all the words according to their conventional meaning.  But in speaking, one varies the meanings by varying one’s tone of voice, determining them as one pleases.  Being less constrained to clarity, one can be more forceful.  And it is not possible for a language that is written to retain its vitality as long as one that is only spoken.

I wonder about that last bit.  Chinese has been a written language for well over three millennia, and I think you'd be hard-pressed to defend the opinion that it has "lost its vitality."  Seems to me that like most arguments of this ilk, the situation is complex.  Writing down our ideas may mean losing nuance and increasing the dependence on interpretation, but the gain in (semi-) permanence is pretty damn important.

And of course, this has bearing on our own century's old-school pearl-clutching: people decrying the shift toward electronic (rather than print) media, and, in the English-speaking world, the fact that cursive isn't being taught in many elementary schools.  My guess is that like the loss of memory Socrates predicted, and Rousseau's concerns over the "crystallization" of language into something flat and dispassionate, the human mind -- and our ability to communicate meaningfully -- will survive this latest onslaught.

So I'm still in favor of the written word.  Obviously.  My own situation is a little like the exchange between the Chinese philosophers Lao Tsu and Zhuang Zhou.  Lao Tsu, in his book Tao Te Ching, famously commented, "Those who say don't know, and those who know don't say."  To which Zhuang Zhou wryly responded, "If 'those who say don't know and those who know don't say,' why is Lao Tsu's book so long?"

****************************************


Tuesday, January 14, 2025

Life out of round

All my life, I've been pulled by two opposing forces.

One of them is the chaos-brain I described in yesterday's post, which I seem to have been born with.  The other is a ferocious attempt to counteract that tendency by controlling the absolute hell out of my surroundings.  I know a lot of this came from the way I was raised; throughout my childhood, nothing I ever did was good enough, and any compliments came along with an appended list (notarized and in triplicate) of all the things I should have done differently and/or could have done better.  

The result is that I do a great deal of overcompensation.  I became fanatically neat, because organizing my physical space was a way of coping with the fact that my brain is like a car with bald tires and no brakes.  My classroom was so organized and clean you could just about eat off the floor (and keep in mind that it was a biology lab).  As a teacher, I strove to make use of every moment we had, and faulted myself whenever things didn't go well or there was an eventuality I hadn't planned for.

I didn't expect perfection from my students, but I did from myself.  And, in some parts of my life, it served me well enough.

The problem is, that approach doesn't work when you apply it to the arts.

I'm not even talking about the "learning curve" issue, here.  Even when I've attained some level of proficiency, I still expect nothing less than perfection, excoriating myself for every scene in a story that didn't come out the way I wanted, every slightly lopsided piece of pottery, every missed note when I play music.

In theory, I'm one hundred percent in agreement with the quote from Ludwig van Beethoven -- "To play a wrong note is insignificant; to play without passion is inexcusable."  Or, more accurately, I believe that for everyone else.  It's much harder to treat myself so forgivingly.

The result has been an overwhelming case of impostor syndrome, coupled with fear of criticism -- which will, in my warped way of looking at things, only confirm what I've thought about myself all along.  I'm at least working on getting my writing out there under the public eye, despite the inherent risks of poor sales and/or bad reviews, but it's been harder in other aspects of my creative life.  I'm still at the stage where I had to have my arm twisted (hard) to induce me to join as a flutist in a contradance band, and it's damn near impossible to get me to play the piano in front of anyone else (including my wife).  But I'm harshest about my own skill when it comes to my artistic work, which is pottery.  I keep very little of what I make, and most of what I do keep are the pieces that are simple and purely functional -- bowls and mugs and the like.  The vast majority of the sculptures and other, more unusual, pieces I make end up given that dreadful label of "not good enough" and are smashed against the concrete wall of the back of our house. 

All along, I had the attitude -- again, directly consonant with my upbringing -- that this is how you improve, that constant self-criticism should act as some kind of impetus to getting better, to ridding your work of those dreaded mistakes, to attaining that fabled ability to create something with which others could not find fault.

It's only been recently that I've realized that this approach is completely antithetical to creativity.

I got to thinking about this after watching an online pottery workshop with the wonderful New Hampshire potter Nick Sevigney, whose pieces are weird and whimsical and unexpected.  A lot of his pottery has a steampunk feeling, a sense of having been put together from a random assemblage of parts.  It was a revelation to watch him piece together cut slabs of clay, not caring if the result was a little uneven or had a rough edge.  In fact, he embraces those seeming imperfections, turning them "from a bug into a feature."

So I decided to see if I could do a few pieces that riffed off of his approach.

I'm most comfortable on the potter's wheel, so I started out throwing three medium-sized white stoneware bowls.  I've gotten pretty good at producing the smooth curve, rounded profile, and perfectly circular rim that most of us shoot for when creating a bowl.

Usually, that's where I'd stop.  If it passed my critical assessment -- not lopsided, decent weight, evenly thick walls, nice smooth surface -- I'd keep it.  Otherwise, into the scrap bucket it'd go.  But here... that was only the first step.

One of Nick's techniques is to take a piece, cut a chunk out of it, add texture to the chunk, then reattach it.  You'd think that because you're putting the chunk back where it had been, it'd fit perfectly; but adding texture (usually with stamps or rollers) stretches and flattens the clay, so the chunk inevitably ends up larger than the hole it came from.  Nick just forces it to fit, warping the piece's profile -- and instead of worrying about that, he often adds circular marks that make it look like the piece was inexpertly riveted or screwed back on.

He leans into the unevenness hard.  And the result is something magical, like a relic you might find in a demolished nineteenth century mad scientist's laboratory, something stitched together and broken and reassembled upside down and backwards.

So I took my three smooth, undamaged stoneware bowls and gave it a try.

One of the results

The hardest part -- unsurprising, perhaps, given my personality -- was making the first cut.  Even knowing that if I didn't like the result I had more clay and could always make another plain, boring, but "perfect" bowl, I sat there for some time, knife in hand, as if the Pottery Gods would smite me if I touched that sleek, classic profile.  Slicing and pressing and marring and deforming it felt like deliberately choosing to ruin something "nice."

But maybe "nice" isn't what we should be shooting for, as creatives.

Maybe the goal should be somewhere out there beyond "nice."  The point, I realized, is not to retread the safe, secure footsteps I've always taken, but to take a deep breath and launch off into the shadowlands.

So I cut a big chunk from the side of the bowl, got out my texturing stamps and rollers, and set to work.

I was half expecting to give up after a few attempts and throw the whole thing into the scrap bucket, but I didn't.  I found I actually kind of liked the result, as different as it is from what I usually make.  And what surprised me even more was that once I got into it, it was...

... fun.

I've never been much good at "having fun."  In general, I give new meaning to the phrase "tightly wound."  Letting loose and simply being silly is way outside my wheelhouse.  (I know I shortchanged my boys as a dad when they were little simply by my seeming inability to play.)  But I've come to realize that the spirit of playfulness is absolutely critical to creativity.  I don't mean that every creative endeavor should be funny or whimsical; but that sense of pushing the boundaries, of letting the horse have its head and seeing where you end up, is at the heart of what it means to be creative.

I was recently chatting with another author about times when inspiration in writing will surprise you, coming at you seemingly out of nowhere.  When it happens, the feeling is honestly like the ideas are originating outside of my own brain.  There are two examples of this that come to mind immediately, cases where characters to whom I'd never intended to give a big role basically said, "Nuh-uh, you're not sidelining me.  I'm important, and here's why."  (If you're curious, the two are Jennie Trahan in my novella "Convection," and most strikingly, Marig Kastella in The Chains of Orion, who kind of took over the last third of the book, and became one of my favorite characters I've ever written.)  When that happens, it means I've loosened my death-grip on the story, and given my creativity space to breathe.

And it always is a hallmark of things going really right with the writing process.

So I guess the point of all this is to encourage you to stretch your boundaries in your own creative work.  I won't say "lose your fears" -- that's hopeless advice -- but try something new despite them.  (Either something new within your chosen creative medium, or something entirely new.)  Be willing to throw your creative life out of round, to press it into new and unexpected configurations, to turn in a new direction and see where you end up.  There's good stuff to be found there outside of the narrow, constricted, breathless little boundaries of what we've always been told is "the right way to do things."  Take a risk.  Then take another one.  The goal of creativity is not to play it safe.

As French author and Nobel laureate André Gide put it, "One does not discover new lands without consenting to lose sight of the shore."

****************************************

NEW!  We've updated our website, and now -- in addition to checking out my books and the amazing art by my wife, Carol Bloomgarden -- you can also buy some really cool Skeptophilia-themed gear!  Just go to the website and click on the link at the bottom, where you can support your favorite blog by ordering t-shirts, hoodies, mugs, bumper stickers, and tote bags, all designed by Carol!

Take a look!  Plato would approve.


****************************************

Thursday, January 9, 2025

Guest post from Andrew Butters: Devil's in the details

Before we start, what are your thoughts on calling certain people Overzealous Grammar Reporting Enthusiasts instead of Grammar Na*is?  OGREs.  I think this works.  Hereinafter, that is how I will refer to them. With that out of the way, let’s get on with it.

***

I read just about everything Gordon Bonnet writes.  I read his blog, Skeptophilia, daily (well, six days a week.  He takes Sundays off.  He was also kind enough to crosspost this for me today).  Occasionally, I’ll find a typo.  When I do, I shoot him a message pointing it out, and he thanks me and then fixes it (though sometimes he fixes it and then thanks me.  Potato potato).  My response is the same when he does the same for my writing here or on Facebook.

Tyops happen.  It's not an automatic sign that the writer was negligent.  It's not irrefutable proof that self-published authors are "lesser" when compared with traditionally published ones.  I’ve seen typos in Stephen King's books and from highly respected AP journalists.  Here’s a great example of a traditional publisher thinking that global search and replace was a good idea:


Readers who come across them vary.  Some ignore them and move on.  I typically ignore them, but if I were to find a shit-tonne, I'd stop reading and send the author or publisher a private message.  No need to make a scene.  That's me, though.  Some people latch onto them as if the fate of the literary world hangs in the balance (OGREs).  Take this example:


Now, I’m told that their book was reinstated after an outpouring of support from readers, but the fact that it happened should serve as a cautionary tale.  I scooped this screenshot from someone on Facebook, and one of the comments read (in part):
“You do your job poorly, there are consequences.  That’s how it works.  And no, if there is a typo in my book I AM telling Amazon because I want my money back.”
—Some OGRE on Facebook
It took all my willpower not to point out that Grammarly suggested not one but two corrections to his comment.  At any rate, I don't blame others for piping up if the typos are rampant.  The thing is, in my experience, books like that are rare.  I've read many books from established big names to first-time self-published authors and have yet to encounter one with enough errors to raise an eyebrow.  No, the plural of anecdote isn't data, but you get my point.  Sometimes shit happens.  Welcome to being human.  Unfortunately, not everyone sees it that way.


What follows is a true story.

I wrote Near Death By A Thousand Cuts over about a month, sometime in November 2022.  After writing, I let it sit for about a week.  Then, I started editing.  These were all personal anecdotes, so I didn't approach it like I would fiction.  The language was informal, and there was more swearing.

I made three passes of editing before sending it to my actual editor, who, in this case, happened to be Gordon (a great writer in his own right and a former teacher with an MA in linguistics).  I made the changes he recommended, and added a few more of my own.

Then, I had seven beta readers go through it (reading critically, not just for fun), and THEY found errors.

Then, my mom (a former teacher) read it and found some stuff.

Then, I read the proof copy and found more things.

Then, upon receiving what was supposed to be the final version to upload to KDP, I got a message from my layout designer.  SHE found a typo.

Like, holy shit.  Even after all the people and all the times this book was read, there was still a missing letter ("a" should have been "an").

Then, I recorded the audiobook, and guess what? I found MORE mistakes.

All that to say, editing is hard.

I have a good mind to send a link for Near Death to the OGRE from the quote above, with their high standards, and ask them to have a go at it.  I’d even refund them their money, forgoing my royalty and Amazon’s cut.

If you find a typo in my book Known Order Girls, I’ll mail you a bookplate (normally $5).  I extended this offer on Facebook, and someone took me up on it!  They were very kind, and I appreciate their eagle eyes catching something that made it through the editing gauntlet.

There will always be some asshole typo, waiting, lurking, biding its time, and making itself known only to that one reader who will fixate on it and leave a bad review as a result.

As Vonnegut probably wrote, "So ti goes."

****************************************

Friday, December 13, 2024

The parasitic model

My post yesterday, about how the profit motive in (and corporate control of) media has annihilated any hope of getting accurate representation of the news, was almost immediately followed up by my running into a story about how the same forces in creative media are working to strangle creativity at its source.

The article was from Publishers Weekly, and was about an interview with HarperCollins CEO Brian Murray.  It centered largely on the company's wholehearted endorsement of AI as part of its business model.  Murray described using AI to take the place both of human narrators for audiobooks and of translators for increasing sales in non-English-speaking countries, which is troubling enough; but by far his most worrisome comment was about using AI, basically, as a stand-in for the authors themselves.  Lest you think I'm exaggerating, or making this up entirely, here's a direct quote from the article:

The fast-evolving AI sector could deliver new types of formats for books, Murray said, adding that HC is experimenting with a number of potential products.  One idea is a “talking book,” where a book sits atop a large language model, allowing readers to converse with an AI facsimile of its author.  Speculating on other possible offerings, Murray said that it is now possible for AI to help HC build an entire cooking-focused website using only content from its backlist, but the question of how to monetize such a site remains.

Later in the article, almost offhand, was a comment that while HarperCollins saw their sales go up last year by only six percent, their profits went up by sixty percent.  The reason was a "restructuring" of the company -- which, of course, included plenty of layoffs.

How much of that windfall went to the authors themselves is left as an exercise for the reader. 

I can vouch first-hand that in the current economic climate, it is damn near impossible to make a living as a writer, musician, or artist.  The people who are actually the wellspring of creativity powering the whole enterprise of creative media get next to nothing; the profits are funneled directly into the hands of a small number of people -- the CEOs of large publishing houses, distributors, marketing and publicity firms, and social media companies.

I can use myself as an example.  I have twenty-four books in print, through two small traditional publishers and some that are self-published.  I have never netted more than five hundred dollars in a calendar year; most years, it's more like a hundred.  I didn't go into this expecting to get rich, but I'd sure like to be able to take my wife out to a nice restaurant once a month from my royalties.

As it is, we might be able to split the lunch special at Denny's.

Okay, I can hear some of you say: maybe it's not the system, maybe it's you.  Maybe your books just aren't any good, and you're blaming it on corporate greed.  All right, fair enough, we can admit that as a possibility.  But I have dozens of extraordinarily talented and hard-working writer friends, and they all say pretty much the same thing.  Are you gonna stand there and tell us we're all so bad we don't deserve to make a living?

And now the CEO of HarperCollins is going to cut authors out of the loop even when it comes to speaking for ourselves, and just create an AI so readers can talk to a simulation of us without our getting any compensation for it?

Ooh, maybe he could ratchet those profits up into the eighty or ninety percent range if he eliminated the authors altogether, and had AI write the books themselves.

Besides the greed, it's the out-of-touchness that bothers me the most.  Lately I've been seeing the following screenshot going around -- a conversation between Long Island University Economics Department Chair Panos Mourdoukoutas and an ordinary reader named Gwen:


The cockiness is absolutely staggering: the notion that somehow it's better to put even more money in Jeff Bezos's pockets than it is to support public libraries.  They've already got the entire market locked up tight, so what more do the corporate CEOs want?  It's flat-out impossible as an author to avoid selling through Amazon; they've got an inescapable stranglehold on book sales.  And, as I found out the hard way, they also have no problem with reducing the prices set by me or my publisher without permission, further cutting into any profit I get -- but, like HarperCollins, you can bet they make sure it doesn't hurt their bottom line by a single cent.

And don't even get me started about the Mark Zuckerberg model of social media.  When Facebook first really got rolling, authors and other creators could post links to their work, and it was actually not a bad way to (at the very least) get some name recognition.  Now?  Anything with an external link gets deliberately drowned by the algorithm.  Oh, sure, you can post stuff, but no one sees it.  The idea is to force authors to purchase advertising from Facebook instead.

Basically, if it doesn't make Zuckerberg money, you can forget about it.

If I sound bitter about all this -- well, it's because I am.  I've thrown my heart into my writing, and gotten very little in return.  We've ceded control of the creative spirit of humanity to an inherently parasitic system, in which the people who are actually enriching the cultural milieu reap only a minuscule percentage of the rewards.

The worst part is that, like the situation I described yesterday regarding the news media, I see no way out of this, not for myself nor for any other creative person.  Oh, we'll continue doing what we do; writing is as much a part of my life as breathing.  But isn't it tragic that the writers, artists, and musicians whose creative spirits nurture all of us have to struggle against seemingly insurmountable odds even to be seen?

All because of the insatiable greed, arrogance, and short-sightedness of a handful of individuals who have somehow ended up in charge of damn near everything that makes life bearable.  People who want more and more and more, and after that, more again.  Millions don't satisfy; they need billions.

As psychologist Erich Fromm put it, "Greed is a bottomless pit which exhausts the person in an endless effort to satisfy the need without ever actually reaching satisfaction."

****************************************

Tuesday, December 3, 2024

Easy as A, B, C

There's an unfortunate but natural tendency for us to assume that because something is done a particular way in the culture we were raised in, everyone else must obviously do it the same way.

It's one of the (many) reasons I think travel is absolutely critical.  Not only do you find out that people elsewhere get along just fine doing things differently, it also makes you realize that in the most fundamental ways -- desire for peace, safety, food and shelter, love, and acceptance -- we all have much more in common than you'd think.  As Mark Twain put it, "Travel is fatal to prejudice, bigotry, and narrow-mindedness, and many of our people need it sorely on these accounts.  Broad, wholesome, charitable views of men and things cannot be acquired by vegetating in one little corner of the earth all one's lifetime."

One feature of culture so familiar that most of the time we don't even think about it is how we write.  The Latin alphabet, with its one-sound-one-character correspondence, is only one way of turning spoken language into writing.  Turns out, there are lots of options:
  • Pictographic scripts -- where one symbol represents an idea, not a sound.  One example is the Nsibidi script, used by the Igbo people of Nigeria.
  • Logographic scripts -- where one symbol represents a morpheme (a meaningful component of a word; the word unconventionally, for example, has four morphemes -- un-, convention, -al, and -ly).  Examples include early Egyptian hieroglyphics (later hieroglyphs included phonetic/alphabetic symbols as well), the Cuneiform script of Sumer, the characters used in Chinese languages, and the Japanese kanji.
  • Syllabaries -- where one symbol represents a single syllable (whether or not the syllable by itself has any independent meaning).  Examples include the Japanese hiragana script, Cherokee, and Linear B -- the Bronze Age script from Crete that remained a complete mystery until it was finally deciphered by Alice Kober and Michael Ventris in the mid-twentieth century.
  • Abjads -- where one symbol represents one sound, but vowels are left out unless they are the first sound in the word.  Examples include Arabic and Hebrew.
  • Abugidas -- where each symbol represents a consonant, and the vowels are indicated by diacritical marks (so, a bit like a syllabary melded with an abjad).   Examples include Thai, Tibetan, Bengali, Burmese, Malayalam, and lots of others.
  • Alphabets -- one symbol = one sound for both vowels and consonants, such as our own Latin alphabet, as well as Cyrillic, Greek, Mongolian, and many others.
To make things more complicated, scripts (like every other feature of language) evolve over time, and sometimes can shift from one category to another.  There's decent evidence that our own alphabet evolved from a pictographic script.  Here are three examples of pathways letters seem to have taken:

[Image licensed under the Creative Commons Rozemarijn van L, Proto-sinaitic-phoenician-latin-alphabet-2, CC BY-SA 4.0]

The reason the topic comes up is the discovery at Tell Umm el-Marra, Syria of incised clay cylinders that date to 2400 B.C.E. and may be the earliest known example of an alphabetic script -- meaning one of the last four in the list, which equate one symbol with one sound or sound cluster (rather than with an idea, morpheme, or entire word).  If the discovery and its interpretation bear up under scrutiny, it would precede the previous record holder, Proto-Sinaitic, by five hundred years.

"Alphabets revolutionized writing by making it accessible to people beyond royalty and the socially elite," said Glenn Schwartz, of Johns Hopkins University, who led the research.  "Alphabetic writing changed the way people lived, how they thought, how they communicated.  And this new discovery shows that people were experimenting with new communication technologies much earlier and in a different location than we had imagined before now...  Previously, scholars thought the alphabet was invented in or around Egypt sometime after 1900 B.C.E.  But our artifacts are older and from a different area on the map, suggesting the alphabet may have an entirely different origin story than we thought."

When you think about it, alphabetic scripts are a brilliant, but odd, innovation.  Drawing a picture, or even a symbol, of an entire concept as a way of keeping track of it -- the head of a cow on a vessel containing milk, for example -- isn't really that much of a stretch.  But who came up with letting symbols represent sounds?  It's a totally different way of representing language.  Not merely the symbols themselves altering, and perhaps becoming simpler or more stylized, but completely divorcing the symbol from the meaning.

No one, for example, links the letter "m" to water any more.  It's simply a symbol-sound correspondence, and nothing more; the symbol itself has become more or less arbitrary.  The level of meaning has been lifted to clusters of symbols.

It's so familiar that we take it for granted, but honestly, it's quite a breathtaking invention.

Scholars are uncertain what the writing on the clay cylinders says; they've yet to be translated, so it may be that this assessment will have to be revisited.  Also uncertain is how it's related to other scripts that developed later in the region, which were largely thought to be derived from Egyptian writing systems.

If this discovery survives peer review, it may be that the whole history of symbolic written language will have to be re-examined.

But that's all part of linguistics itself.  Languages evolve, as does our understanding of them.  Nothing in linguistics is static.  The argument over whether it should be -- the infamous descriptivism vs. prescriptivism fight -- is to me akin to denying the reality of biological evolution.  Our word usages, definitions, and spellings have changed, whether you like it or not; so have the scripts themselves.  Meaning, somehow, still survives, despite the dire consequences the prescriptivists warn about.

It's why the recent tendency for People Of A Certain Age to bemoan the loss of cursive writing instruction in American public schools is honestly (1) kind of funny, and (2) swimming upstream against a powerful current.  Writing systems have been evolving since the beginning, with complicated, difficult to learn, difficult to reproduce, ambiguous, or highly variable systems being altered or eliminated outright.  It's a tough sell, though, amongst people who have been trained all their lives to use that script; witness the fact that Japanese still uses three systems, more or less at the same time -- the logographic kanji and the syllabic hiragana and katakana.  It will be interesting to see how long that lasts, now that Japan has become a highly technological society.  My guess is at some point, they'll phase out the cumbersome (although admittedly beautiful) kanji, which require understanding over two thousand symbols to be considered literate.  The Japanese have figured out how to represent kanji on computers, but the syllabic scripts are so much simpler that I suspect they'll eventually win.

I doubt it'll be any time soon, though.  The Japanese are justly proud of their long written tradition, and making a major change in it will likely be met with as much resistance as English spelling reform has been.

In any case, it's fascinating to see how many different solutions humans have found for turning spoken language into written language, and how those scripts have changed over time (and continue to change).  All features of the amazing diversity of humanity, and a further reminder that "we do it this way" isn't the be-all-end-all of culture.

****************************************

Wednesday, October 23, 2024

The moral of the story

I was asked an interesting question yesterday: does a good fictional story always have a moral?

My contention is that even stories written purely for entertainment still often have morals.  Consider Dave Barry's novel Big Trouble, a lunatic romp in south Florida that for me would be in the running for the funniest book ever written.  Without stretching credulity too much, you could claim that Big Trouble has the theme "love, loyalty, and kindness are always worth it."  Certainly the humor is more the point, but the end of the story (no spoilers) is so damn sweet that the first time I read it, it made me choke up a little.

Another favorite genre, murder mysteries, could usually be summed up as "murdering people is bad."

But that's not what most people mean by "a moral to the story."  Generally, a story with a moral is one where the moral is the main point -- not something circumstantial to the setting or plot.

The moral is the reason the story was written.

I'm a little ambivalent about overt morals in stories.  I've seen it done exceptionally well; Thornton Wilder's amazing The Bridge of San Luis Rey is explicitly about a man trying to find out if things happen for a reason, or if the universe is simply chaotic.  His conclusion -- that either there is no reason, or else the mind of God is so subtle that we could never parse the reason -- is absolutely devastating in the context of the story.  The impact on me when I first read it, as an eleventh grader in a Modern American Literature class in high school, turned my whole worldview upside down.  In a lot of ways, that one novel was the first step in shaping the approach to life I now have, forty-seven-odd years later.

If I can be excused for detouring into my favorite television show, Doctor Who, you can find there a number of examples of episodes where the moral gave the story incredible impact.  A few that come to mind immediately are "Midnight," which looks at the ugly side of tribalism and the human need to team up against a perceived common enemy, "Demons of the Punjab," about the inevitability of death and grief, "Dot and Bubble," which deals with issues of institutionalized racism, and "Silence in the Library," with a subtext of the terrible necessity of self-sacrifice.

But if you want examples of bad moralistic stories, you don't have to look any further.  The episode "Orphan 55," from the Thirteenth Doctor's run, pissed off just about everyone -- not only because of the rather silly cast of characters, but because at the end the Doctor delivers a monologue that amounts to, "Now, children, let me explain to you how all this bad stuff happened because humans are idiots and didn't address climate change."


So what's the difference?

In my mind, it all has to do with subtlety -- and respect for the reader's (or watcher's) intelligence.  A well-done moral-based story has a deep complexity; it tells the story and then leaves us to figure out what the lesson was.  Haruki Murakami's brilliant and heartbreaking novel Colorless Tsukuru Tazaki and His Years of Pilgrimage is about what happens when people are in a lose-lose situation -- and how sometimes a terrible decision is still preferable when the other option is even worse.  But Murakami never comes out and says that explicitly.  He lets his characters tell their tales, and trusts that we readers will get to the punchline on our own.

Bad moral fiction -- often characterized as "preachy" -- doesn't give the reader credit for having the intelligence to get what's going on without being walloped over the head repeatedly by it.  One that immediately comes to mind is Ayn Rand's The Fountainhead, which is so explicitly about Big Government Is Bad and Individualism Is Good and Smart Creative People Need To Fight The Man that she might as well have written just that and saved herself a hundred thousand words.

I think what happens is that we authors have an idea of what our stories mean, and we want to make sure the readers "get it."  The problem is, every reader is going to bring something different to the reading of a story, so what they "get" will differ from person to person.  If that weren't the case, why would there be any difference in our individual preferences?  But authors need to trust that our message (whatever it is) is clear enough to shine through without our needing to preach a sermon in a fictional setting.  Stories like "Orphan 55" don't work because they insult the watcher's intelligence.  "You're probably too dumb to figure out what we're getting at, here," they seem to say.  "So let me hold up a great big sign in front of your face to make sure you see it."

A lot of my own work has an underlying theme that I'm exploring using the characters and the plot, but I hope I don't fall into the trap of preachiness.  Probably my most explicitly moral-centered tale, The Communion of Shadows, is about the fragility of life, the importance of taking emotional risks, and the absolute necessity of looking after the people we love, because we never know how long we have -- but I think the moral comes out of the characters' interactions organically, not because I jumped up and down and screamed it at you.

But it can be a fine line, sometimes.  Like I said, we all have different attitudes and backgrounds, so our relationship to the stories we read is bound to differ.  There are undoubtedly people who loved "Orphan 55" and The Fountainhead, so remember that all this is just my own opinion.

And maybe that's the overarching moral of this whole topic; that everyone is going to take away something different.  After all, if everyone hated explicitly moralistic stories, the Hallmark Channel would be out of business by next week.

****************************************


Friday, March 24, 2023

The writing's on the wall

When you think about it, writing is pretty weird.

Honestly, language in general is odd enough.  Unlike (as far as we know for sure) any other species, we engage in arbitrary symbolic communication -- using sounds to represent concepts.  The arbitrary part means that which sounds represent which concepts is not determined by any logical link; there's nothing any more doggy about the English word dog than there is about the French word chien or the German word Hund (or any of the other thousands of words for dog in various human languages).  With the exception of the few words that are onomatopoeic -- like bang, bonk, crash, and so on -- the word-to-concept link is random.

Written language adds a whole extra layer of randomness, because (again, with the exception of the handful of languages with truly pictographic scripts) the connections between the concept, the spoken word, and the written word are all arbitrary.  (I discussed the different kinds of scripts out there in more detail in a post a year ago, if you're curious.)

Which makes me wonder how such a complex and abstract notion ever caught on.  We have at least a fairly good model of how the alphabet used for the English language evolved, starting out as a pictographic script and becoming less concept-based and more sound-based as time went on:


The conventional wisdom about writing is that it began in Sumer something like six thousand years ago, with small clay tokens that let merchants keep track of transactions -- tokens that were sealed inside clay envelopes (bullae) and whose shapes were impressed into the soft clay surface.  Each token type had its own symbol; some were symbols for types of goods, others for numbers.  Once the Sumerians made the jump of letting marks stand for concepts, it wasn't such a huge further step to make marks for other concepts, and ultimately, for syllables or individual sounds.

The reason all this comes up is that a recent paper in the Cambridge Archaeological Journal claims that marks associated with cave paintings in France and Spain, long thought to be random, are actually meaningful -- an assertion that would push back the earliest known writing another fourteen thousand years.

The authors assessed 862 strings of symbols dating back to the Upper Paleolithic in Europe -- most commonly dots, slashes, and symbols like a letter Y -- and came to the conclusion that they were not random, but were true written language, for the purpose of keeping track of the mating and birthing cycles of the prey animals depicted in the paintings.

The authors write:

[Here we] suggest how three of the most frequently occurring signs—the line <|>, the dot <•>, and the <Y>—functioned as units of communication.  We demonstrate that when found in close association with images of animals the line <|> and dot <•> constitute numbers denoting months, and form constituent parts of a local phenological/meteorological calendar beginning in spring and recording time from this point in lunar months.  We also demonstrate that the <Y> sign, one of the most frequently occurring signs in Palaeolithic non-figurative art, has the meaning <To Give Birth>.  The position of the <Y> within a sequence of marks denotes month of parturition, an ordinal representation of number in contrast to the cardinal representation used in tallies.  Our data indicate that the purpose of this system of associating animals with calendar information was to record and convey seasonal behavioural information about specific prey taxa in the geographical regions of concern.  We suggest a specific way in which the pairing of numbers with animal subjects constituted a complete unit of meaning—a notational system combined with its subject—that provides us with a specific insight into what one set of notational marks means.  It gives us our first specific reading of European Upper Palaeolithic communication, the first known writing in the history of Homo sapiens.
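
To make the cardinal-versus-ordinal distinction concrete, here's a tiny sketch in Python -- my own toy illustration, with made-up mark sequences, not the authors' actual data or method:

```python
# Toy reading of a Paleolithic mark sequence, per the paper's proposal:
# each '|' or '.' is one lunar month counted from the start of spring
# (a cardinal tally), while the *position* of 'Y' in the sequence is an
# ordinal value -- *which* month the animal gives birth in.
# The sequences below are invented for illustration.

def read_marks(marks: str) -> dict:
    months_tallied = len(marks)  # every mark, Y included, fills one month slot
    birth_month = marks.index("Y") + 1 if "Y" in marks else None
    return {"months_tallied": months_tallied, "birth_month": birth_month}

print(read_marks("..Y."))   # {'months_tallied': 4, 'birth_month': 3}
print(read_marks("|||"))    # {'months_tallied': 3, 'birth_month': None}
```
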
The claim is controversial, of course, and is sure to be challenged; moving the date of the earliest writing from six thousand to twenty thousand years ago isn't a small shift in our model.  But if it bears up, it's pretty extraordinary.  It further gives the lie to our concept of Paleolithic humans as brutal, stupid "cave men," incapable of any kind of mental sophistication.  As I hope I made clear in my first paragraphs, any kind of written language requires subtlety and complexity of thought.  If the beauty of the cave paintings in places like Lascaux doesn't convince you of the intelligence and creativity of our distant forebears, surely this will.

So what I'm doing now -- speaking to my fellow humans via strings of visual symbols -- may have a much longer history than we ever thought.  It's awe-inspiring that we landed on this unique way to communicate; even more that we stumbled upon it so long ago.

****************************************



Thursday, December 1, 2022

The code breakers

I've always been in awe of cryptographers.

I've read a bit about the work British computer scientist and mathematician Alan Turing did during World War II regarding breaking the "unbreakable" Enigma code used by the Germans -- a code that relied on a machine whose settings were changed daily.  And while I can follow a description of how Turing and his colleagues did what they did, I can't in my wildest dreams imagine I could do anything like that myself.

I had the same sense of awe when I read Margalit Fox's fantastic book The Riddle of the Labyrinth, which was about the work of linguists Alice Kober and Michael Ventris in successfully translating the Linear B script of Crete -- a writing system for which not only did they not initially know what the symbol-to-sound correspondence was, they didn't know if the symbols represented single sounds, syllables, or entire words -- nor what language the script represented!  (Turned out it was Mycenaean Greek.)

I don't know about you, but I'm nowhere near smart enough to do something like that.

Despite my sense that such endeavors are way outside of my wheelhouse, I've always been fascinated by people who do undertake such tasks.  Which is why I was so interested in a link a friend of mine sent me about the breaking of a code that had stumped cryptographers for centuries -- the one used in the sixteenth century by Charles V, Holy Roman Emperor and King of Spain.


Charles was a bit paranoid, so his creation of a hitherto unbreakable code is definitely in character.  When the letter was written, in 1547, he was in a weak position -- he'd signed the Treaty of Crépy, tentatively ending hostilities with the French, but his ally King Henry VIII of England had just died and been succeeded by his son, the sickly King Edward VI.  Charles felt vulnerable...

... and in fact, when the letter was finally decrypted, it was found that it was about his fears of an assassination plot.

As it turned out, the fears were unfounded, and he went on to rule Spain and the Holy Roman Empire for another eleven years, finally dying of malaria at age 58.

His code remained unbroken until recently, however.  But a team led by the cryptographer Cécile Pierrot (of Inria) and the historian Camille Desenclos was finally able to decipher it, thanks to a lucky find -- another letter between Charles and his ambassador to France, Jean de St. Mauris, which had a partial key scribbled in the margin.  That hint included the vital information that nine of the symbols were meaningless, thrown in only to make the code more difficult to break.  (Which worked.)


Even with the partial solution in hand, it was still a massive task.  As you can see from their solution, most of the consonants can be represented by two different symbols, and double letters are represented by yet another different (single) symbol.  There are single symbols that stand for specific people. 
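
To get a feel for what that kind of scheme looks like, here's a minimal sketch in Python of decoding a homophonic substitution cipher with nulls -- using an invented toy key, emphatically not the actual key Pierrot and Desenclos recovered:

```python
# Toy homophonic substitution key (invented for illustration):
# several ciphertext symbols map to the same letter, one symbol stands
# for a doubled letter, and some symbols are nulls -- meaningless
# padding meant only to frustrate the codebreaker.
TOY_KEY = {
    "@": "s", "#": "s",   # two different symbols for the same consonant
    "%": "e", "&": "a",
    "$": "ll",            # one symbol standing for a double letter
    "!": "", "?": "",     # nulls: decode to nothing
}

def decrypt(ciphertext: str, key: dict) -> str:
    # Symbols missing from a partial key pass through unchanged, so the
    # gaps stay visible -- which is how a partial solution (like the key
    # scribbled in the letter's margin) gets extended.
    return "".join(key.get(symbol, symbol) for symbol in ciphertext)

print(decrypt("#%$&?", TOY_KEY))   # -> "sella"
```

Homophones flatten the letter-frequency statistics a codebreaker would normally exploit, and the nulls muddy them further -- which is why the marginal note identifying the nine meaningless symbols was such a crucial break.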

But even with those difficulties, Pierrot and Desenclos managed to break the code.

All of this gives hope to linguists and cryptographers working on the remaining (long) list of writing systems that haven't been deciphered yet.  (Wikipedia has a list of scripts that are still not translated -- take a look, you'll be amazed at how many there are.)  I'm glad there are people still working on these puzzles.  Even if I don't have the brainpower to contribute to the effort, I'm in awe that there are researchers who are allowing us to read writing systems that before were a closed book.

****************************************