Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Saturday, December 14, 2024

The cliff's edge

The universe is a dangerous place.

Much of what we've created -- the whole superstructure of civilized life, really -- is built to give us a sense of security.  And it works, or well enough.  During much of human history, we were one bad harvest, one natural disaster, one epidemic away from starvation, disease, and death.  Our ancestors were constantly aware that they had no real security -- probably one of the main drivers of the development of religion.

The world is a capricious, dangerous place, but maybe the gods will help me if only I pray hard enough.

When the Enlightenment rolled around in the eighteenth century, science seemed to step in to provide a similar function.  Maybe the world could be tamed if only we understood it better.  Once again, it succeeded -- at least partially.  Industrial agriculture and modern medicine certainly saved millions of lives, and have allowed us to live longer, healthier lives than ever before, further reassuring us that it was possible to make the universe a secure, harm-free place for creatures such as us.

And we still have that sense, don't we?  When there's a natural disaster, many people respond, "Why did this happen?"  There's an almost indignant reaction of "the world should be safe, dammit."

[Image licensed under the Creative Commons svantassel, Danger Keep Away Sign, CC BY-SA 3.0]

This is why in 2012 a judge in Italy sentenced six geologists to six years in prison and hefty fines for failing to predict the deadly 2009 L'Aquila earthquake.  There was the sense that if the best experts on the geology of Italy didn't see it coming... well, they should have, shouldn't they?

The fact that, in the present state of our scientific knowledge, it's not possible to predict earthquakes didn't seem to sway the judge's mind.  "The world is chaotic, dangerous, and incompletely understood" was simply too hard to swallow.  If something happened, and people died, there had to be someone to blame.  (Fortunately, wiser heads eventually prevailed; the charges were thrown out on appeal, and the geologists were released.)

In fact, I started thinking about this because of a study out of the University of California - Riverside that is investigating a technique for predicting earthquake severity based on the direction of propagation of the shock wave front.  This can make a huge difference -- for example, an earthquake on the San Andreas Fault that begins with failure near the Salton Sea and propagates northward will direct more energy toward Los Angeles than one that begins closer in but spreads in the opposite direction.

The scientists are using telltale scratch marks -- scoring left as the rocks slide across each other -- to determine the direction of motion of the quake's shock wave.  "The scratches indicate the direction and origin of a past earthquake, potentially giving us clues about where a future quake might start and where it will go," said Nic Barth, the paper's lead author.  "This is key for California, where anticipating the direction of a quake on faults like San Andreas or San Jacinto could mean a more accurate forecast of its impact...  We can now take the techniques and expertise we have developed on the Alpine Fault [in New Zealand] to examine faults in the rest of the world.  Because there is a high probability of a large earthquake occurring in Southern California in the near-term, looking for these curved marks on the San Andreas fault is an obvious goal."

The thing is, this is still short of the ultimate goal of predicting fault failure accurately, and with enough time to warn people to evacuate.  Knowing the timing of earthquakes is something that is still out of reach.

Then there's the study out of the Max Planck Institute for Solar System Research that found that the Sun and other stars like it are prone to violent flare-ups -- on the average, once every century.  These "superflares" can release an octillion joules of energy in only a few hours.

The once-every-hundred-years estimate was based on a survey of over fifty-six thousand Sun-like stars, and the upshot is that so far, we've lucked out.  The last serious solar storm was the Carrington Event of 1859, and even that was far weaker than the known Miyake Events, coronal mass ejections so big that they left traces in tree rings.  (One about fourteen thousand years ago was so powerful that if it occurred today, it would completely fry everything from communications satellites to electrical grids to home computers.)

The problem, once again, is that we still can't predict them; like earthquakes, we can know likelihood but not exactitude.  In the case of a coronal mass ejection, we'd probably have a few hours' notice -- enough time to unplug stuff in our houses, but not enough to protect the satellites and grids and networks.  (If that's even possible.  "An octillion joules" is what is known in scientific circles as "a metric shit tonne of energy.")

"The new data are a stark reminder that even the most extreme solar events are part of the Sun's natural repertoire," said study co-author Natalie Krivova.  "During the Carrington event of 1859, one of the most violent solar storms of the past two hundred years, the telegraph network collapsed in large parts of northern Europe and North America.  According to estimates, the associated flare released only a hundredth of the energy of a superflare.  Today, in addition to the infrastructure on the Earth's surface, especially satellites would be at risk."
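For a sense of scale, here's a rough back-of-the-envelope sketch of the energies involved (the reading of "an octillion joules" as 1e27 J and the annual world electricity figure are my own assumptions, not numbers from the study):

```python
# Rough scale check of the superflare and Carrington figures above.
# Assumptions (mine, not the study's): "an octillion joules" = 1e27 J,
# and annual worldwide electricity generation of roughly 30,000 TWh.

superflare_J = 1e27                    # energy released by a superflare
carrington_J = superflare_J / 100      # Carrington event: ~1/100 of a superflare
world_electricity_J = 30_000 * 3.6e15  # 30,000 TWh in joules (~1.1e20 J)

years_of_power = carrington_J / world_electricity_J
print(f"Carrington event: roughly {carrington_J:.0e} J")
print(f"...about {years_of_power:,.0f} years of world electricity output")
```

On these assumptions, even the "small" Carrington-scale event carries tens of thousands of years' worth of humanity's electricity output, which is why the concern is what it does to infrastructure, not the raw energy itself.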

All of this, by the way, is not meant to scare you.  In my opinion, the point is to emphasize the fragility of life and of our world, and to encourage you to work toward mitigating what we can.  No matter what we do, we'll still be subject to the vagaries of geology, meteorology, and astrophysics, but right now we are needlessly adding to our risk by ignoring climate change and pollution, and encouraging the ignorant and ill-founded claims of the anti-vaxxers.  (Just yesterday I saw that RFK Jr., who has been nominated as Secretary of the Department of Health and Human Services, is pursuing the de-authorization of the polio vaccine -- an extremely low-risk preventative that has saved millions of lives.)

Life's risky enough without adding to it by listening to reckless short-term profit hogs and dubiously sane conspiracy theorists.

My point here is that the chaotic nature of the universe shouldn't freeze us into despairing immobility; it should galvanize us to protect what we have.  The unpredictable dangers are a fact of life, and for most of our evolutionary history we were unable to do much about any of them.  Now, for the first time, we have figured out how to protect ourselves from many of the risks that our ancestors faced every day.  How foolish do we as a species have to be to add to those risks needlessly, heedlessly, rushing toward the edge of the cliff when we have the capacity simply to stop?

****************************************

Friday, December 13, 2024

The parasitic model

My post yesterday, about how the profit motive in (and corporate control of) media has annihilated any hope of getting an accurate representation of the news, was almost immediately followed by my running into a story about how the same forces in creative media are working to strangle creativity at its source.

The article was from Publishers Weekly, and was about an interview with HarperCollins CEO Brian Murray.  It centered largely on the company's wholehearted endorsement of AI as part of its business model.  Murray described using AI to take the place both of human narrators for audiobooks and of translators to increase sales in non-English-speaking countries, which is troubling enough; but by far his most worrisome comment described using AI, basically, as a stand-in for the authors themselves.  Lest you think I'm exaggerating, or making this up entirely, here's a direct quote from the article:

The fast-evolving AI sector could deliver new types of formats for books, Murray said, adding that HC is experimenting with a number of potential products.  One idea is a “talking book,” where a book sits atop a large language model, allowing readers to converse with an AI facsimile of its author.  Speculating on other possible offerings, Murray said that it is now possible for AI to help HC build an entire cooking-focused website using only content from its backlist, but the question of how to monetize such a site remains.

Later in the article, almost offhand, was a comment that while HarperCollins saw their sales go up last year by only six percent, their profits went up by sixty percent.  The reason was a "restructuring" of the company -- which, of course, included plenty of layoffs.

How much of that windfall went to the authors themselves is left as an exercise for the reader. 

I can vouch first-hand that in the current economic climate, it is damn near impossible to make a living as a writer, musician, or artist.  The people who are actually the wellspring of creativity powering the whole enterprise of creative media get next to nothing; the profits are funneled directly into the hands of a small number of people -- the CEOs of large publishing houses, distributors, marketing and publicity firms, and social media companies.

I can use myself as an example.  I have twenty-four books in print, through two small traditional publishers and some that are self-published.  I have never netted more than five hundred dollars in a calendar year; most years, it's more like a hundred.  I didn't go into this expecting to get rich, but I'd sure like to be able to take my wife out to a nice restaurant once a month from my royalties.

As it is, we might be able to split the lunch special at Denny's.

Okay, I can hear some of you say: maybe it's not the system, maybe it's you.  Maybe your books just aren't any good, and you're blaming it on corporate greed.  All right, fair enough, we can admit that as a possibility.  But I have dozens of extraordinarily talented and hard-working writer friends, and they all say pretty much the same thing.  Are you gonna stand there and tell us we're all so bad we don't deserve to make a living?

And now the CEO of HarperCollins wants to take us authors out of the loop even when it comes to speaking for ourselves, and just create an AI so readers can talk to a simulation of us without our getting any compensation for it?

Ooh, maybe he could ratchet those profits up into the eighty or ninety percent range if he eliminated the authors altogether, and had AI write the books themselves.

Besides the greed, it's the out-of-touchness that bothers me the most.  Lately I've been seeing the following screenshot going around -- a conversation between Long Island University Economics Department Chair Panos Mourdoukoutas and an ordinary reader named Gwen:


The cockiness is absolutely staggering: the idea that somehow it's better to put even more money in Jeff Bezos's pockets than it is to support public libraries.  They've already got the entire market locked up tight, so what more do the corporate CEOs want?  It's flat-out impossible as an author to avoid selling through Amazon; they've got an inescapable stranglehold on book sales.  And, as I found out the hard way, they also have no problem with reducing the prices set by me or my publisher without permission, further cutting into any profit I get -- but, like HarperCollins, you can bet they make sure it doesn't hurt their bottom line by a single cent.

And don't even get me started about the Mark Zuckerberg model of social media.  When Facebook first really got rolling, authors and other creators could post links to their work, and it was actually not a bad way to (at the very least) get some name recognition.  Now?  Anything with an external link gets deliberately drowned by the algorithm.  Oh, sure, you can post stuff, but no one sees it.  The idea is to force authors to purchase advertising from Facebook instead.

Basically, if it doesn't make Zuckerberg money, you can forget about it.

If I sound bitter about all this -- well, it's because I am.  I've thrown my heart into my writing, and gotten very little in return.  We've ceded the control of the creative spirit of humanity to an inherently parasitic system, where the ones who are actually enriching the cultural milieu are reaping only a minuscule percent of the rewards.

The worst part is that, like the situation I described yesterday regarding the news media, I see no way out of this, not for myself nor for any other creative person.  Oh, we'll continue doing what we do; writing is as much a part of my life as breathing.  But isn't it tragic that the writers, artists, and musicians whose creative spirits nurture all of us have to struggle against seemingly insurmountable odds even to be seen?

All because of the insatiable greed, arrogance, and short-sightedness of a handful of individuals who have somehow ended up in charge of damn near everything that makes life bearable.  People who want more and more and more, and after that, more again.  Millions don't satisfy; they need billions.

As psychologist Erich Fromm put it, "Greed is a bottomless pit which exhausts the person in an endless effort to satisfy the need without ever actually reaching satisfaction."

****************************************

Thursday, December 12, 2024

The crossroads

I haven't exactly kept it a secret how completely, utterly fed up I am with media lately.

This goes from the miasmic depths of YouTube, Facebook, and TikTok right on up the food chain to the supposedly responsible mainstream media.  I still place a lot of the blame for Donald Trump's victory at the feet of the New York Times and their ilk; for months they ignored every babbling, incoherent statement Trump uttered, as well as the fascistic pronouncements he made during his more lucid moments, while running front-page headlines like "Will Kamala's Choice In Shoes Alienate Her From Voters?"

The idea of responsible journalism has, largely, been lost.  Instead we're drowning in a sea of slant and misinformation, generated by a deadly mix of rightward-tilted corporate control and a clickbait mentality that doesn't give a flying rat's ass whether the content is true or accurate as long as you keep reading or watching it.

While the political stuff is far more damaging, being a science nerd, I find it's the misrepresentation of science that torques me the most.  And I saw a good example of this just yesterday, with a fascinating study out of the Max Planck Institute that appeared last week in the journal Astronomy and Astrophysics.

First, the actual research.

Using data from the x-ray telescope eROSITA, researchers found that the Solar System occupies a space in one of the arms of the Milky Way that is hotter than expected.  This "Local Hot Bubble" is an irregularly-shaped region that is a couple of degrees warmer than its surroundings, and is thought to have been caused by a series of supernovae that went off an estimated fourteen million years ago.  The bubble is expanding asymmetrically, with faster expansion perpendicular to the plane of the galaxy than parallel to it, for the simple reason that there is less matter in that direction, and therefore less resistance.

One curious observation is that there is a more-or-less cylindrical streamer of hotter gas heading off in one direction from the bubble, pointing in the general direction of the constellation Centaurus.  The nearest object in that direction is another hot region called the Gum Nebula, a supernova remnant, but it's unclear if that's a coincidence.

The Gum Nebula [Image licensed under the Creative Commons Meli Thev, Finkbeiner H-alpha Gum Nebula, CC BY-SA 4.0]

The researchers called this streamer an "interstellar tunnel" and speculated that there could be a network of these "tunnels" crisscrossing the galaxy, connecting warmer regions (such as the nebulae left from supernovae) and allowing for exchange of materials.  How physics allows the streamers to maintain their cohesion, and not simply disperse into the colder space surrounding them, is unknown.  This idea has been around since 1974, but has had little experimental support, so the new research is an intriguing vindication of a fifty-year-old idea.

Okay, ready to hear the headlines I've seen about this story?

  • "Scientists Find Network of Interstellar Highways in Our Own Galaxy"
  • "A Tunnel Links Us to Other Star Systems -- But Who's Maintaining It?"
  • "Mysterious Alien Tunnel Found In Our Region of Space"
  • "An Outer Space Superhighway"
  • "Scientists Baffled -- We're At The Galactic Crossroads and No One Knows Why"

*brief pause to punch a wall*

Okay, I can place maybe one percent of the blame on the scientists for calling it a "tunnel;" a tunnel, I guess, implies a tunneler.  But look, it's called quantum tunneling, and the aliens-and-spaceships crowd managed to avoid having multiple orgasms about that.  

On the other hand, given the mountains of bullshit out there about quantum resonant energy frequencies of healing, maybe I shouldn't celebrate too quickly.

But the main problem here is the media sensationalizing the fuck out of absolutely everything.  I have no doubt that in this specific case, the whole lot of 'em knew there was nothing in the research that implied a "who" that was "maintaining" these tunnels; the scientists explicitly said there was some unexplained physics here, which was interesting but hardly earthshattering.

But "streamers of gas from a local warm region in our galaxy" isn't going to get most people to click the link, so gotta make it sound wild and weird and woo-woo.

Look, I know this story by itself isn't really a major problem, but it's a symptom of something far worse, and far deeper.  There has got to be a way to impel media to do better.  Media trust is at an all-time low; a study last month estimated it at a little over thirty percent.  And what happens in that situation is that people (1) click on stuff that sounds strange, shocking, or exciting, and (2) for more serious news, gravitate toward sources that reinforce what they already believed.  The result is that the actual facts matter less than presenting people with attractive nonsense, and media consumers never find out if what they believe is simply wrong.

But saying "just don't read the news, because they're all lying" isn't the solution, either.  The likelihood of voting for Trump was strongly correlated with having low exposure to accurate information about current events, something that was exacerbated by his constant message of "everyone is lying to you except for me."

We are at a crossroads, just not the kind the headline-writer was talking about.

Honestly, I don't know that there is an answer, not in the current situation, where we no longer have a Fairness Doctrine to force journalists to be even-handed.  And the proliferation of wildly sensationalized online media sources has made the problem a million times worse.

At this point, I'm almost hoping the people who reported on the astronomy story are right, and we are in the middle of an alien superhighway.  And they'll slow down their spaceship long enough to pick me up and get me the hell off this planet.

****************************************

Wednesday, December 11, 2024

The power of ritual

I was raised in a devoutly Roman Catholic home, but after spending my teenage years with question after question bubbling up inside me, I left Catholicism, never to return.  In my twenties I tried more than once to find a faith community that seemed right -- that made sense of the universe for me -- attending first a Quaker meeting, then a Unitarian church, and finally a Methodist church, and each time I ended up faced with the same questions I'd had, questions that no one seemed to be able to answer.

The prime question was "How do you know all this is true?"  

In other realms, that one was usually easier to answer.  Science, of course, is cut-and-dried; factual truth in science is measurable, quantifiable, observable.  But even with situations that aren't exactly rational, there's usually a way to approach the question.  How do I know that my family and friends love me?  Because they demonstrate it in a tangible way, every day.

But the claims of religion seemed to me to be outside even that, and I never was able to get answers that satisfied.  Most of the responses I did get boiled down to "I've had a personal experience of God" or "the existence of God gives meaning to my day-to-day experience," neither of which was particularly convincing for me.  I have never had anything like a transcendent spiritual experience of an omnipresent deity.  And something imbuing meaning into your life doesn't make it true.  I'd read plenty of meaningful fiction, after all.  And as far as my wanting it to be true, if there was one thing I'd learned by that point, it was that the universe is under no compulsion to behave in a way that makes me comfortable.

So ultimately, I left religion behind entirely.  I have no quarrels with anyone who has found a spiritual home that works for them, as long as they're not forcing it on anyone else; in fact, I've sometimes envied people who can find reason to believe, wholeheartedly, in a greater power.  I just never seemed to be able to manage it myself.

That's not to say I'm unhappy as an atheist.  Perhaps I can't access the reassurance and comfort that someone who is deeply religious has, but I also left behind a lot of petty rules and pointless, often harmful, restrictions -- ones I wish I'd abandoned many years earlier.  (Chief among them was my years of shame over my bisexuality.  The damage done to the queer community by the largely religiously-motivated bigotry of our society is staggering and heartbreaking -- and given who just got elected to run the United States, it's far from over.)

But there's something about being part of a religion that I do miss, and it isn't only the sense of community.  You can find community in a book group or weekly sewing night or runners' club, after all.  What I find I miss most, strangely enough, is the ritual.

There's something compelling about the ritual of religion.  The Roman Catholicism of my youth is one of the most thoroughly ritualistic religions I know of; the idea is that any believer should be able to walk into any Catholic church in the world on Sunday morning and know what to do and what to say.  (Giving rise to the old joke, "How do you recognize a Catholic Star Wars fan?"  "If you say to them, 'May the Force be with you,' they respond, 'And also with you.'")  The vestments of the priests, the statuary and stained glass windows, the incense and candles and hymns and organ music -- it all comes together into something that, to the believer, is balm to the soul, leaving them connected to other believers around the world and back, literally millennia, in time.

Window in the Church of St. Oswald, Durham, England  [Image licensed under the Creative Commons Tom Parnell, Church of St Oswald - stained glass window, CC BY-SA 4.0]

The reason this comes up is twofold.  First, we're approaching the Christmas season, and I always associate this time of year with rituals that, for the most part, I no longer participate in -- Advent, Christmas music, decorating trees, Midnight Mass.  The result is that for me, the holiday season is largely a time of wistful sadness.  I look on all this as a very mixed bag, of course; it's hard to imagine my having a sufficient change of heart to stay up until the wee hours on Christmas Eve so I can get in my car and go take in a church service.

But seeing others participate in these things makes me realize what I've lost -- or, more accurately, what I've voluntarily given up.  And I can't help but feel some sense of grief about that.

The other reason is more upbeat -- a paper this week in Proceedings of the National Academy of Sciences about an archaeological site deep in a cave in Israel that shows signs of having been used for the purposes of rituals...

... thirty-five thousand years ago.

The cave was occupied before that; the upper levels have evidence of inhabitants fifty thousand years ago, including a partial skull that shows evidence of interbreeding between Homo sapiens and Neanderthals.  But there are deeper parts of the cave, places of perpetual darkness, where people nevertheless congregated.  There's art on the walls, and soot from torch fires.

The authors write:
Identifying communal rituals in the Paleolithic is of scientific importance, as it reflects the expression of collective identity and the maintenance of group cohesion.  This study provides evidence indicating the practice of deep cave collective rituals in the Levant during the Early Upper Paleolithic (EUP) period.  It is demonstrated that these gatherings occurred within a distinct ritual compound and were centered around an engraved object in the deepest part of Manot Cave, a pivotal EUP site in southwest Asia.  The ritual compound, segregated from the living areas, encompasses a large gallery partitioned by a cluster of remarkable speleothems [water-deposited minerals].  Within this gallery, an engraved boulder stands out, displaying geometric signs suggesting a unique representation of a tortoise.  Isotopic analysis of calcite crusts on the boulder’s grooves revealed alignment with values found in speleothems from the cave dated to ~37 to 35 ka BP.  Additionally, meticulous shape analysis of the grooves’ cross-section and the discernible presence of microlinear scratches on the grooves’ walls confirmed their anthropogenic origin.  Examination of stalagmite laminae (36 ka BP) near the engraved boulder revealed a significant presence of wood ash particles within.  This finding provides evidence for using fire to illuminate the dark, deep part of the cave during rituals.  Acoustic tests conducted in various cave areas indicate that the ritual compound was well suited for communal gatherings, facilitating conversations, speeches, and hearing.  Our results underscore the critical role of collective practices centered around a symbolic object in fostering a functional social network within the regional EUP communities.

I find this absolutely fascinating.  The drive to create and participate in rituals is deep-seated, powerful, and has a very long history.  Its role in cultural cohesion is obvious.  Of course, the same force generates negative consequences: the us-versus-them attitudes that have driven the lion's share of the world's conflicts, both on the small scale and the global.  Rituals bind communities together, but they also identify outsiders and keep them excluded.  (And the rituals often were guarded fiercely, down to the level of minute details.  Consider that people were burned at the stake in England for such transgressions as translating the Bible into English.)

So it's complex.  But so is everything.  My yearning for participation in rituals celebrating a belief system I no longer belong to is, honestly, self-contradictory.  But all I can say is that we've been creatures of ceremony for over thirty thousand years, so I shouldn't expect myself to be exempt, somehow.

As Walt Whitman put it, "Do I contradict myself?  Very well, then, I contradict myself.  (I am large, I contain multitudes.)"

****************************************

Tuesday, December 10, 2024

Dark shadows

After yesterday's rather depressing post about politics, today we're going to turn our eyes away from the troubled and turbulent Earth and out to the skies.

"The more we look, the more we see" sounds like a tautology, but in the realm of the sciences, it isn't.  Sometimes it takes training, and careful examination of what's in front of you, even to know exactly what it is you're looking at.

This is especially true in astronomy.  Consider that in only four hundred years, we've gone from:

  • stars being equidistant points of light on a sphere with the Earth at the center;
  • to recognizing that stars are, in fact, not all the same distance away from us, and their apparent motion comes from the combination of Earth's rotation and its circling the Sun;
  • to realizing that even the nearest stars are incredibly far away;
  • to discovering that the Sun is a star -- and the stars are suns -- and they're all made of more or less the same stuff;
  • to the shocked understanding that galaxies are millions of light years away, are composed of billions of stars -- and there are trillions of galaxies, almost all of which are rushing away from us at breakneck speeds.

Along the way, we've discovered hundreds of different celestial objects and phenomena, some of which are positively mind-boggling, and many of which we still have yet to explain completely.

The topic comes up because of an article I read yesterday by astronomer Phil Plait.  I discovered Plait a few years ago because of his excellent website Bad Astronomy (about myths and misconceptions concerning the skies).  I've also read several of his books, and he's an excellent example of a scientist who is also highly skilled at bringing cutting-edge science to us interested laypeople.  (I especially recommend Death from the Skies!, about which writer Daniel H. Wilson said, "Reading this book is like getting punched in the face by Carl Sagan.  Frightening, yet oddly exhilarating.")

In any case, Plait's article is entitled "What Is Inside Our Galaxy's Darkest Places?", and is about dust clouds.  I knew at least a little about celestial dust clouds, which are thought to be the raw materials that can eventually collapse to form stars and planets, something I touched on in a post last week.  But there was a lot in the article that was new to me -- and intriguingly weird.

The dark dust clouds Plait describes are called Bok globules, after astronomer Bart Bok, who studied them, and there are estimated to be millions of them in our galaxy alone.  And "dark" is something of an understatement; the dust and gas they contain reduce the intensity of any light coming through them by a factor of fifteen trillion.  The result is that they look like a black, starless blotch in the sky.  I was immediately reminded of the Black Thing from Madeleine L'Engle's A Wrinkle in Time:

"That shadow out there."  Calvin gestured.  "What is it?  I don't like it."

"Watch," Mrs. Whatsit commanded.

It was a shadow, nothing but a shadow.  It was not even as tangible as a cloud.  Was it cast by something?  Or was it a thing in itself?

The sky darkened.  The gold left the light and they were surrounded by blue, blue deepening until where there had been nothing but the evening sky there was now a faint pulse of a star, and then another and another and another.  There were more stars than Meg had ever seen before.

"The atmosphere is so thin here," Mrs. Whatsit said, as though in answer to her unasked question, "that it does not obscure your vision as it would at home.  Now look.  Look straight ahead."

Meg looked.  The dark shadow was still there.  It had not lessened or dispersed with the coming of night.  And where the shadow was, the stars were not visible. 

Of course, Bok globules are just dust clouds, not the distilled essence of evil.

I hope.

Barnard 68, a Bok globule about five hundred light years from Earth [Image credit: European Southern Observatory]

But the thing that amazed me the most about these dust clouds was how little matter they actually contain, and yet how good they are at blocking light.  Plait tells us that they average about a million molecules per cubic centimeter -- which seems like a lot until you find out that air at sea level contains nearly thirty million trillion molecules per cubic centimeter.  But despite their thinness, if you put the Sun half a light year from the Earth -- only a little more than ten percent of the distance to the nearest star -- and put a typical Bok globule in between, the Sun's light would be so attenuated it wouldn't be visible to the naked eye.
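That claim is easy to sanity-check with the standard astronomical magnitude scale.  Here's a minimal back-of-the-envelope sketch (the Sun's absolute magnitude, the naked-eye limit, and the light-year-to-parsec conversion are standard reference values I've supplied; only the fifteen-trillion attenuation factor comes from the article):

```python
import math

# Assumed standard values (mine, not from Plait's article):
SUN_ABS_MAG = 4.83        # Sun's absolute visual magnitude
NAKED_EYE_LIMIT = 6.0     # faintest magnitude visible to the naked eye
LY_PER_PC = 3.2616        # light years per parsec

ATTENUATION = 15e12       # from the article: light dimmed ~15 trillion times

# Apparent magnitude of the Sun seen from half a light year away, unobscured
d_pc = 0.5 / LY_PER_PC
m_clear = SUN_ABS_MAG + 5 * math.log10(d_pc / 10)

# Dimming light by a factor F adds 2.5 * log10(F) magnitudes
m_dimmed = m_clear + 2.5 * math.log10(ATTENUATION)

print(f"Unobscured: m = {m_clear:.1f}")   # a brilliant point, roughly Venus-bright
print(f"Behind the globule: m = {m_dimmed:.1f}")
print("Naked-eye visible?", m_dimmed <= NAKED_EYE_LIMIT)
```

On these assumptions the dimmed Sun comes out around magnitude +29, vastly fainter than the naked-eye limit of about +6, which is consistent with Plait's point.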

Which is why I started with "the more you look, the more you see."  Or -- more accurately, in this case -- the more you look, the more you realize how much we might not be seeing.

In any case, I don't want to steal any more of Plait's thunder, because you should all read his article, which is wonderful fun (and is linked above).  And if you're on Bluesky, subscribe to him, because his posts are awesome.

So that's today's cool new thing I learned about the universe.  Which is also valuable because it takes my mind off what's happening down here.  All in all, things seem to look up when I do.

****************************************

Monday, December 9, 2024

Reaping the whirlwind

Back in 1980, I came up with an idea for a novel.

Ronald Reagan had just been elected president, and many of us were alarmed at what seemed like a lurch toward far-right populism -- anti-regulation, pro-corporate policy that was marketed as somehow being beneficial to the working class.  The buzzwords were "trickle-down economics," the idea that if you gave big tax breaks to the rich, the benefits would "trickle down" to you and me and the rest of the working stiffs.  It was bolstered by a belief that the rich were actually concerned about lifting the working class out of poverty; that it was possible, in the society as it was, for a poor person to become wealthy.

That, in a line usually attributed to Steinbeck, the poor were just "temporarily embarrassed millionaires."

It didn't work.  The rich got richer (as intended) and the working class reaped exactly zero benefits from it.  And it generated deep resentment, as corporate profits soared, CEOs raked in unimaginable amounts of cash -- and workers' salaries stagnated.

And I thought: this can't go on forever.  At some point, people are going to get fed up, decide they have nothing to lose, and pull the whole superstructure down.

This was the genesis of my novel In the Midst of Lions.

The title comes from a line from Psalm 56: "Have pity on me, O God, have pity on me... for I lie prostrate in the midst of lions that devour men."  The story is set in Seattle, and centers on five completely ordinary people who are caught up in the collapse.  The attacks are precipitated by a shadowy worldwide organization of violent anarchists called the Lackland Liberation Authority; the "Lacklanders" are people who lost their property to corporate buy-ups of land for industrial agriculture and mining, or were priced out of home ownership entirely.  Threatening LLA graffiti, in their trademark red spray paint, begins showing up on walls.

Then the attacks start.  At first, they're scattered and sporadic, targeting a few of the most egregious offenders; but when that doesn't work, they strike hard, and simultaneously, at governmental and business leaders across the world.

The result is spiraling chaos.

Back in 1980 I wrote a few chapters of it, but somehow sensed that I didn't have the background, knowledge, or writing skill to pull off something this big, so I tabled the project.  It was in the back of my mind -- for forty years.  In 2020 I finally decided to tackle it, and wrote it and two sequels (The Scattering Winds and The Chains of Orion), which I published in 2023.  


The reason this comes up is that there's a passage from In the Midst of Lions that's been on my mind for the last couple of days:
“But there’s one thing I don’t understand,” Soren said.  “If they had this coordinated, worldwide plot, planned well in advance, there has to have been communication between different places.  By destroying the telecommunication hubs, they’ve cut themselves off along with the rest of us.  It’s sawing off the tree branch you’re sitting on.”

“I doubt they care.” Cassandra’s lips tightened, the only display of emotion she revealed.  “I’ve read some of the Lacklanders’ manifestos.  They’re no different than the suicide bombers in the Middle East back during the Gulf Wars.  The point is to destroy the power structure they despise.  If they can take down the corporate-capitalist overlords, they still count it as a success even if they go down along with them.”

“That makes no sense at all,” Mary said.

“I didn’t say it was rational.”

The deadly attack on UnitedHealthcare CEO Brian Thompson comes from this same desperation.  UnitedHealthcare leads American health insurance companies in the percentage of claims it turns down -- estimates put its denial rate at around 32%.  "Deny, Defend, Depose" was written on the bullet casings -- and the disingenuous media is still asking, "Gee, I wonder what the murderer's motive was?"  Instead of outrage at the violent act, the result has been an outpouring of anger against health insurance companies -- and by extension, ultra-wealthy corporate CEOs everywhere -- coupled with a complete lack of sympathy for Thompson and a celebration of his killer (who, at the time of this writing, remains unidentified and at large).  A Facebook post by UnitedHealth asking for "understanding in this difficult time" drew almost a hundred thousand responses -- 77,000 of which were laugh emojis.

But what gave me the biggest shiver up the spine was the following image of graffiti I saw on Facebook.

In red spray paint.  In Seattle.


I said to a friend -- and I was only half joking -- "I didn't think I'd have to move In the Midst of Lions to the non-fiction shelf quite this soon."

Thompson's murder, and the glee that followed, isn't laudable, but it is understandable.  And it definitely isn't, as one pundit characterized it, "a sign of the deep moral and ethical corrosion of America."  It's the result of something we've seen over and over in history, from the American and French Revolutions to what's happening right now in Syria: push people long enough and hard enough, profit off their struggles, empower corrupt oligarchs and expect the working class simply to play along, and eventually the whole house of cards collapses.  People will then take action by whatever means they can to put an end to it -- legally or illegally, ethically or unethically, peacefully or violently.

Something Donald Trump and his ultra-wealthy corporate capitalist cronies might want to keep in mind.

Stephen King wrote, in his book The Stand, "The effective half-life of evil is always relatively short."  It's a line that's stuck with me since I first read it, perhaps thirty years ago.  The power-hungry and super-wealthy -- who are, of course, usually one and the same -- think their riches will protect them.  That's what King Louis XVI thought; so did Napoleon, Jean-Claude Duvalier, Jean-Bédel Bokassa, Idi Amin, Benito Mussolini, Pol Pot, Ferdinand Marcos, and Muammar Gaddafi.  

All of them were wrong, and several of them paid for that error with their lives.

The problem with all this is that the downfall of dictators often results in chaos, destroying economies, infrastructure... and the ordinary people who only wanted to be able to feed their families and keep a roof over their heads.  In In the Midst of Lions, it's not just the corporate oligarchs who end up suffering; it's everyone.

I'd like to hope that the people in charge will recognize where we're headed before it's too late, but unfortunately, we have a very poor track record of learning from history.  (Or from cautionary fiction, for that matter.)  The overweening arrogance that comes with wealth and power tends to make its possessors say, "Oh, sure, it may have happened to all those people in history... but it won't happen to me."

It all reminds me of another biblical quote, this one from the Book of Hosea: "Who sows the wind, reaps the whirlwind."

****************************************

Saturday, December 7, 2024

Talking in your sleep

A little over a year ago, I decided to do something I've always wanted to do -- learn Japanese.

I've had a fascination with Japan since I was a kid.  My dad lived there for a while during the 1950s, and while he was there collected Japanese art and old vinyl records of Japanese folk and pop music, so I grew up surrounded by reminders of the culture.  As a result, I've always wanted to learn more about the country and its people and history, and -- one day, perhaps -- visit.

So in September of 2023 I signed up for Duolingo, and began to inch my way through learning the language.

[Image is in the Public Domain]

It's a challenge, to say the least.  Japanese usually shows up on lists of "the five most difficult languages to learn."  Not only are there three different scripts to master in order to be literate, but the grammatical structure is also very different from English's.  The trickiest part, at least thus far, is managing particles -- little words that follow nouns and indicate how they're being used in the sentence.  They're a bit like English prepositions, but there's a subtlety to them that is hard to grok.  Here's a simple example:

Watashi wa gozen juuji ni toshokan de ane ni aimasu.

(I) (particle marking the topic of the sentence) (A.M.) (ten o'clock) (particle indicating movement or a point in time) (library) (particle indicating where something is happening) (my older sister) (particle marking the person being met) (am meeting) = "I am meeting my older sister at ten A.M. at the library."

Get the particles wrong, and the sentence ends up somewhere between grammatically incorrect and completely incomprehensible.

So I'm coming along.  Slowly.  I have a reasonably good affinity for languages -- I grew up bilingual (English/French) and have a master's degree in linguistics -- but the hardest part for me is simply remembering the vocabulary.  The grammar patterns take some getting used to, but once I see how they work, they tend to stick.  The vocabulary, though?  Over and over again I'll run into a word, and I'm certain I've seen it before and at one point knew what it meant, and it will not come back to mind.  So I look it up...

... and then go, "Oh, of course.  Duh.  I knew that."

But according to a study this week out of the University of South Australia, apparently what I'm doing wrong is simple: I need more sleep.

Researchers in the Department of Neuroscience took 35 native English speakers and taught them "Mini-Pinyin" -- an invented pseudolanguage that has Mandarin Chinese vocabulary but English sentence structure.  (None of them had prior experience with Mandarin.)  They were sorted into two groups; the first learned the language in the morning and returned twelve hours later to be tested, and the second learned it in the evening, slept overnight in the lab, and were tested the following morning.

The second group did dramatically better than the first.  Significantly, during sleep their brains showed higher-than-average levels of brain wave patterns called slow oscillations and sleep spindles, which are thought to be connected with memory consolidation -- the transfer of short-term memories from the hippocampus into long-term storage in the cerebral cortex.  Your brain, in effect, talks in its sleep, routing information from one location to another.

"This coupling likely reflects the transfer of learned information from the hippocampus to the cortex, enhancing long-term memory storage," said Zachariah Cross, who co-authored the study.  "Post-sleep neural activity showed unique patterns of theta oscillations associated with cognitive control and memory consolidation, suggesting a strong link between sleep-induced brainwave co-ordination and learning outcomes."

So if you're taking a language class, or if -- like me -- you're just learning another language for your own entertainment, you're likely to have more success in retention if you study in the evening, and get a good night's sleep before you're called upon to use what you've learned.

Of course, many of us could use more sleep for a variety of other reasons.  Insomnia is a bear, and poor sleep is linked with a whole host of health-related woes.  But a nice benefit of dedicating yourself to getting better sleep duration and quality is an improvement in memory.

And hopefully for me, better scores on my Duolingo lessons.

****************************************