Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Tuesday, May 31, 2022

Stepping into Pride

A dear friend of mine sent me a message a couple of weeks ago.  It was a recommendation to watch a recent Netflix series, and read the graphic novel that inspired it.  "Trust me on this," she said.  "This is the story you and I both needed when we were teenagers.  You'll love it... but you might want to have kleenex handy."

The show (and book) are called Heartstopper, by Alice Oseman.  And my friend was right on all counts.

It's the story of two boys in an all-male school in England -- one of them gay (and out), the other bisexual (and, at least at the beginning, closeted).  The story of their deep friendship, mutual attraction, and eventual falling in love is sweet, beautiful, and charming.  I'm not usually someone who picks up young adult fiction, and even less romance fiction; but Heartstopper had me in the palm of its hand right from the beginning.  The "kleenex" part of my friend's comment wasn't because it's in any sense a tragedy; there are (of course) some bumps in the road, and a few of the couple's classmates are bigoted, homophobic assholes, but by and large, it's a heartwarming and upbeat story about overcoming inhibitions, finding happiness, and being open to the world about who you are.

The tears that well up when I even think about the story of Nick Nelson and Charlie Spring are, for me and my friend both, largely because of how long she and I lived in fear and shame.  We were denied the opportunity to explore that part of ourselves; not only to relax and have fun dating, but even to figure out what it meant and get comfortable with who we are.  It was longer for me.  At least she came out publicly as a lesbian fairly young.  It took me until I was fifty-two even to come out to friends.  That's thirty-seven years of being terrified that anyone, even the people who loved me, would find out that I'm attracted equally to men and women.

The first few years, it was not only fear of ridicule or ostracism, it was fear for my safety.  Southern Louisiana in the 1970s was not a safe place for LGBTQ kids.  I know four people in my graduating class (not counting myself) who came out as queer later in life, and none of them even gave a hint of it until after graduation.  If you think it's a significant likelihood that you'll get the shit beaten out of you in the locker room if people find out, why in the hell would you not keep it a secret?

Things are better now.  Thank heaven.  My last year of teaching, three years ago, there were several kids I knew who were out as queer or trans.  But we still have a very long way to go.  A teacher friend of mine in Texas has had to create an Amazon wish list of books whose characters are queer, non-Christian, or people of color, because in her state, school district after school district is taking those books off library shelves, denying kids access even to finding out that there are people who aren't straight, white, and Christian.  Apparently, now it's considered "woke" (how I have come to fucking hate that word) to provide a way to say to non-majority kids, "Hey, it's okay.  You are okay.  Be who you are."

Ugly bigotry, while less than what I experienced when I was a teenager, still is all too common.  Just in the last week I saw two posts on social media that made that nauseatingly clear.  One said, "If I ever see a 'trans woman' in the girls' bathroom, I'm going to punch him in the face and tell the judge I identify as the tooth fairy."  The other said, "Men are from Mars, women are from Venus, and any other genders you pulled out of Uranus."

Hurr-hurr-hurr.  It sure is funny to threaten one of the most marginalized groups of people in the United States with violence, and to deny that anyone other than cis/heterosexual people even exist.

Still and all, we're making progress.  Slow and incremental steps, but progress.  My teacher friend's extensive Amazon wish list was cleared out and is on the way to her as we speak -- it took less than twelve hours for her friends to purchase every damn book she asked for.  I may have been late to the game, but I now can say to anyone, "I'm queer/bisexual" and not give a flying rat's ass what they think about it.  Florida governor Ron DeSantis pushed for sanctions on Disney, the state's premier attraction and biggest money-maker, because they balked at his pet project, the "Don't Say Gay" bill -- and Disney responded by opening a new line of queer-themed merchandise called the "Pride Collection," which is about as close as a corporation can come to a collective raised middle finger.

Tomorrow is the first day of Pride Month, and there's a lot to feel good about.  Even so, in a lot of places, it seems like we're regressing, not progressing.  Irrespective of my own sexual orientation, I don't understand why, exactly, people are so determined to control what consenting adults do in the privacy of their own homes.  Why it's just fine to have young adult fiction with heterosexual romances and marriages, but even depicting a queer couple is "ramming wokeness down everyone's throats" and "turning kids gay."  Why the GOP, who pride themselves on their "get the government out of the private sector" stance, are A-okay with the government trying to stop businesses from establishing policies ensuring acceptance and equal rights for LGBTQ employees and customers.

Pride lasts for one month, but pride lasts forever.

So, yeah.  I cried hard during the scene when Nick and Charlie kiss for the first time.  I'm not ashamed of that.  It's okay to get all emotional when a scene is sweet and touching, which this surely was; it is not okay that some of my tears were because of the fact that at that age, I would never have had the courage, nor even the opportunity, to experience such a thing.  Hell, there was no queer fiction accessible back then, neither books, nor television, nor movies.  I didn't even know such relationships existed.  Note, by the way, that this lack of positive role modeling didn't make me any less queer; all it did was make me ashamed and terrified of being queer.  (Due to my completely dysfunctional upbringing, I was also terrified of having a relationship with a girl, but that's another story entirely.  Suffice it to say that during much of my life, I have been very, very lonely -- and am fantastically fortunate to be in the warm, nurturing, loving marriage I now have.)

It's kind of summed up in the poignant line from Nick, when he realizes he needs to claim his identity, and his chance for love: "I wish I'd known you when I was younger, and that I'd known then what I know now."

In conclusion, to the increasing number of straight people in the world who are 100% accepting of us non-straight types, thank you.  To my queer friends, keep being strong, keep being defiant, keep being who you are, and happy Pride Month.

And to the homophobes, you can take your ugly, antiquated bigotry and shove it up your ass.



Monday, May 30, 2022

An encounter with Charybdis

At the center of our seemingly tranquil galaxy, there's a black hole massive enough that it significantly warps spacetime, swallows any matter that gets close enough, and in the process emits truly colossal amounts of radiation.  Named Sagittarius A*, it gave itself away by its enormous output in the radio region of the spectrum; the radio source at the galactic center was detected in the 1950s, and the compact source itself was pinpointed in 1974.  [N. B.  Throughout this post, when I refer to the black hole's radiation output, I am not of course talking about anything coming from inside its event horizon; that's physically impossible.  But the infalling matter that gets eaten by it does emit electromagnetic radiation before it takes its final plunge and disappears forever.  Lots of it.]

This thing is a real behemoth, at an estimated four million times the mass of the Sun.  There is a lot of interstellar dust between it and us -- after all, when you're looking at the constellation of Sagittarius, you're looking down a line going directly along the plane of the galaxy toward its center -- but even without the dust, it wouldn't be all that bright.  Most of its output isn't in the visible-light region of the spectrum.  This doesn't mean it's dim in the larger sense; not only are there the radio waves that were the first part of its signal detected, but it has enormous peaks in the gamma-ray and x-ray parts of the spectrum as well.

Earlier this month, the Event Horizon Telescope collaboration released the first actual image of Sagittarius A*:

[Image licensed under the Creative Commons EHT Collaboration, EHT Sagittarius A black hole, CC BY 4.0]

How could something that enormous form?  We have a pretty good idea of how massive stars (over ten times the mass of the Sun) become black holes: when such a star's core runs out of fuel, the gravitational pull of its own mass collapses it to the point that the escape velocity at its surface exceeds the speed of light.  At that point, everything that falls within its event horizon is there to stay.

But we're not talking about ten times more massive than the Sun; this thing is four million times more massive.  Where did all that matter come from -- and how did it end up at the center of not only our galaxy, but every spiral galaxy studied?
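The escape-velocity condition also tells you how big these things are.  The event horizon's radius -- the Schwarzschild radius -- is r = 2GM/c², which scales linearly with mass.  A quick back-of-envelope calculation (the constants are standard; the arithmetic is mine, not from any paper):

```python
# Schwarzschild radius: the distance from a black hole's center at which
# the escape velocity equals the speed of light.  r = 2GM/c^2
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # mass of the Sun, kg
AU = 1.496e11        # astronomical unit, m

def schwarzschild_radius(mass_kg):
    return 2 * G * mass_kg / C**2

# A ten-solar-mass stellar black hole: horizon radius of roughly 30 km
print(f"10 M_sun: {schwarzschild_radius(10 * M_SUN) / 1000:.0f} km")

# Sagittarius A*, at four million solar masses: about 0.08 AU, or
# roughly seventeen times the radius of the Sun
print(f"Sgr A*:   {schwarzschild_radius(4e6 * M_SUN) / AU:.3f} AU")
```

To put that in solar system terms: Mercury orbits at about 0.39 AU, so even a four-million-solar-mass event horizon fits comfortably inside Mercury's orbit.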

A team of astronomers at the University of North Carolina - Chapel Hill took a step forward in our understanding of galactic black hole formation, in a paper that appeared this week in The Astrophysical Journal.  It's long been known that most large galaxies are attended by an array of dwarf galaxies, such as the Milky Way's Small and Large Magellanic Clouds.  (Which, unfortunately, are only visible in the Southern Hemisphere.  This is why they're named after Magellan.  Typical of the Eurocentric approach to naming stuff; clearly indigenous people knew about the Magellanic Clouds long before Magellan ever saw them.)  It's also known that because of the gravitational pull of the larger galaxies, the smaller ones eventually collide with them and merge into a single galaxy.  In fact, that even happens to big galaxies; gravity has a way of winning, given enough time.  The Milky Way and the Andromeda Galaxy, which are about the same size, will eventually come together into a single blob of stars, but what its final shape will be is impossible to predict.

As an aside, there's no need to worry about this.  First, it's not going to happen for another four and a half billion years.  Second, when galaxies (of any size) collide, there are relatively few actual stellar collisions.  Galaxies are mostly empty space, and when they merge the stars that comprise them mostly just pass each other without incident.

But not the black holes at their centers.  Those, being at the center of mass of the entire aggregation, eventually slam together in a collision with a magnitude that's impossible to imagine.  And the team at UNC found that this is one of the ways galactic black holes become so large; they discovered that even dwarf galaxies have central black holes, and when a dwarf galaxy gets swallowed up, its black hole's mass gets added to the central black hole of the larger galaxy.

Sagittarius A* sits in the middle of the whirling vortex of stars, like the sea monster Charybdis in Greek mythology, sucking down anything that comes close enough -- including, apparently, other black holes.  The celestial fireworks from a collision between two large black holes, such as the ones in the Milky Way and Andromeda, must release a fantastic amount of energy.

Wouldn't that be something to see?

From a safe distance, of course.


Saturday, May 28, 2022

Social media dissociation

I suspect that many of my readers will resonate with my desire to fritter away less time on social media.

I don't mean the actual "social" part of social media.  I have friends whom I seldom if ever get to see, and especially since the pandemic started, visiting online is about my only opportunity.  I greatly value those conversations.  What I'm referring to is the aimless scrolling, looking for new content, any new content.  Trying to find a distraction even though I know that a dozen other things, from listening to some music, to playing with my dogs, to going for a run -- even weeding the garden -- will leave me feeling better.

But -- once again, as I'm sure many of you can attest -- it can be exceedingly hard to say "enough" and close the app.  It was one thing when your connectivity had to be via a desktop or laptop computer; but now that just about all of us (even me, Luddite though I am) are carrying around our social media addiction in our pockets, it's way too easy to say "just a few more minutes" and drop back into the world of scrolling.

One effect I've noticed it's had on me is a shortening of my attention span.  Something has to be absolutely immersive to keep my attention for over five minutes.  Two of my favorite YouTube science channels, the wonderful Veritasium and physicist Sabine Hossenfelder's awesome Science Without the Gobbledygook, have videos that average about ten to twelve minutes long, and man... sometimes that is a struggle, however fascinating the topic.

I don't like this trend.  I won't say I've ever had the best of focus -- distractions and my wandering mind have been issues since I was in grade school -- but social media have made it considerably worse.  Frequently I think about how addicted I am to scrolling, and it's a real cause of worry.

But then I start scrolling again and forget all about it.

That last bit was the subject of a study from the University of Washington that was presented last month at the CHI Conference on Human Factors in Computing Systems.  In "'I Don’t Even Remember What I Read': How Design Influences Dissociation on Social Media," a team led by Amanda Baughan looked at how social media apps are actually designed to have this exact effect -- and found that although we frequently call it an addiction, it is more accurately described as dissociation.

"Dissociation is defined by being completely absorbed in whatever it is you're doing," Baughan said, in an interview with Science Daily.  "But people only realize that they've dissociated in hindsight.  So once you exit dissociation there's sometimes this feeling of: 'How did I get here?'  It's like when people on social media realize: 'Oh my gosh, how did thirty minutes go by?  I just meant to check one notification.'"

Which is spot-on.  Even the title is a bullseye; after a half-hour on Twitter, I'd virtually always be hard-pressed to tell you the content of more than one or two of the tweets I looked at.  The time slips by, and it feels very much like I glance up at the clock, and three hours are gone without my having anything at all to show for it.

It always reminds me of a quote from C. S. Lewis's The Screwtape Letters.  While I (obviously) don't buy into the theology, his analysis of time-wasting by the arch-demon Screwtape is scarily accurate:
As this condition becomes more fully established, you will be gradually freed from the tiresome business of providing Pleasures as temptations.  As the uneasiness and his reluctance to face it cut him off more and more from all real happiness, and as habit renders the pleasures of vanity and excitement and flippancy at once less pleasant and harder to forgo (for that is what habit fortunately does to a pleasure) you will find that anything or nothing is sufficient to attract his wandering attention.  You no longer need a good book, which he really likes, to keep him from his prayers or his work or his sleep; a column of advertisements in yesterday’s paper will do.  You can make him waste his time not only in conversation he enjoys with people whom he likes, but in conversations with those he cares nothing about on subjects that bore him.  You can make him do nothing at all for long periods.  You can keep him up late at night, not roistering, but staring at a dead fire in a cold room.  All the healthy and outgoing activities which we want him to avoid can be inhibited and nothing given in return, so that at last he may say, as one of my own patients said on his arrival down here [in hell], "I now see that I spent most of my life in doing neither what I ought nor what I liked."

That last line, especially, is a fair knockout, and it kind of makes me suspicious that social media may have been developed down in hell after all.

Baughan, however, says maybe we shouldn't be so hard on ourselves.  "I think people experience a lot of shame around social media use," she said.  "One of the things I like about this framing of 'dissociation' rather than 'addiction' is that it changes the narrative.  Instead of: 'I should be able to have more self-control,' it's more like: 'We all naturally dissociate in many ways throughout our day -- whether it's daydreaming or scrolling through Instagram, we stop paying attention to what's happening around us.'"

Even so, for a lot of us, it gets kind of obsessive at times.  It's worse when I'm anxious or depressed, when I crave a distraction not only from unpleasant external circumstances but from the workings of my own brain.  And it's problematic that when that occurs, the combination of depression and social media creates a feedback loop that keeps me from seeking out activities -- which sometimes just means turning off the computer and doing something, anything, different -- that will actually shake me out of my low mood.

But she's right that shaming ourselves isn't productive, either.  Maybe a lot of us could benefit by some moderation in our screen time, but self-flagellation doesn't accomplish anything.  I'm not going to give up on social media entirely -- like I said, without it I would lose touch with too many contacts I value -- but setting myself some stricter time limits is probably a good idea.

And now that you've read this, maybe it's time for you to shut off the device, too.  What are you going to do instead?  I think I'll go for a run.


Friday, May 27, 2022

Unexplainable malarkey

A regular reader and frequent contributor to Skeptophilia sent me a link yesterday, with the message, "Oooh, look!  Another company has discovered that it can sell bogus woo-woo stuff using your favorite words -- frequency, field, energy, and vibration!"

Regular readers undoubtedly know how pissed off I get when people use scientific words and can't even be bothered to look up the actual definitions.  It's even worse when they use said misused scientific words to rip people off, although clearly some of the responsibility lies with the consumers, because after all, they could also bother to look up the actual definitions if they wanted to -- caveat emptor, and all of that sort of thing.

So, anyway, I clicked the link, and it brought me to a site called "Unexplainable Frequencies."  My first reaction was that I don't see how a frequency can be unexplainable.  I mean, it's either 638.7 Hertz or it isn't.  In any case, even from the title I knew this site was gonna be good for a few faceplants.  Here's the banner headline on the homepage:

Everything In Existence Has It's Own Frequency Signature. Every Person, Every Animal, And Every Planet Vibrate At it's Own Rhythm. Pure Direct Frequencies Can Help You Heal, Grow, And Change.

Evidently, one of the things that "Pure Direct Frequencies" doesn't do is to help you to learn the difference between "it's" and "its," and that you Don't Need To Capitalize Every Word To Make Your Point.  But maybe I'm just being picky, here.
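For what it's worth, there really is nothing unexplainable about a frequency; it's just cycles per second, and you can measure it.  Here's a toy illustration (entirely my own, and nothing to do with the site's products -- the 638.7 Hz is just my arbitrary example from above): synthesize one second of a sine tone, then recover its frequency by counting zero crossings.

```python
import math

# A frequency is just cycles per second -- a measurable number, nothing
# mystical.  Synthesize one second of a 638.7 Hz sine tone, then estimate
# the frequency by counting zero crossings (a sine crosses zero twice
# per cycle).
SAMPLE_RATE = 44_100   # samples per second (CD quality)
FREQ = 638.7           # Hz

samples = [math.sin(2 * math.pi * FREQ * n / SAMPLE_RATE)
           for n in range(SAMPLE_RATE)]          # one second of audio

crossings = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)
estimated = crossings / 2                        # two crossings per cycle

print(f"estimated frequency: {estimated:.1f} Hz")  # within ~1 Hz of 638.7
```

No energy fields, chakras, or vibrational signatures required: a pure tone is completely specified by one ordinary number.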

Further down the page, we find out that we can purchase mp3s ("hundreds of thousands sold," they tell us, which makes me despair for the human race).  These mp3s contain sound recordings with "frequencies" that supposedly  help us to accomplish things in a variety of areas, including:
  • Manifestation
  • Wealth
  • Visualization
  • Astral Projection
  • Lucid Dreams
  • Spirit Guide
  • Chakra Work
  • Remote Viewing
  • Psychic/ESP
  • Christ Consciousness
  • IQ Increaser
So, I decided to listen to some sound samples.  I picked "IQ Increaser," because heaven knows some days I could use some help in that department.  The description said:
Our custom IQ/ Memory Booster recording is in a category of it’s [sic] own, and is one of our top rated products for good reason.  We begin the session by penetrating your body’s own unique energy field with a low vibrational frequency designed to create feelings of “total knowingness.”  You will begin feeling connected and well rounded within the first few minutes.  You may confuse your new disposition with overconfidence but as you will soon see it’s intended.  Change requires confidence you can’t achieve your desired result unless you believe it’s inevitably going to happen.

We’ll then begin blasting your brain with a frequency directly related to Intelligence.  In fact those with brain functions operating in this range are considered geniuses.  This will help your brains [sic] capacity for learning and understanding complex concepts.  In addition to boosting your intelligence this portion of your session can aid arthritis pain, stop involuntary eye movements, and regulate the pulses in women.

Midway through the recording you will begin reflecting on your session and without realizing it you will be recollecting fine details about the past ten minutes.  We manipulated your brain into a higher memory state through frequency and tone.  You will remember things more easily and think deeper than you ever knew you could.  You’ve only unlocked the ability you’ve always had.

You’ll then begin feeling more in tune to what’s really happening around you and enjoy feelings of enlightenment.  You wont realize its happening but we’ve been channeling vibrations towards your cerebral cortex.  You’ll begin to feel your forehead getting warmer and tingling in your spleen.

Your session concludes with another fortifying frequency associated with the functioning of the cerebral cortex.  We want to encourage your brain to store information more efficiently.  When your session concludes we encourage you to try memory games to test your new found ability.  You will notice a considerable difference between your memory skills before and after use.
All of this sounded pretty hopeful, although I wasn't sure how I felt about having my spleen tingle. Nor is it clear why it only helps women with their pulse.  Maybe guys' hearts are tuned to a different unexplainable frequency, I dunno.  But either way, I figured it was worth the risk.  So I started the clip, and closed my eyes.

The Astral Sleep by Jeroen van Valkenburg (1998) [Image licensed under the Creative Commons, Jeroen van Valkenburg, The Astral Sleep, CC BY 2.0]

After about 45 seconds, I had an amazing experience!  I said, "Huh."  And I stopped the clip.  Listening to "IQ Increaser" is about as interesting as reading a telephone book.  It turned out to be a bunch of slowly shifting electronic keyboard noises that just kind of go on and on.  I experienced no spleen tingles, my forehead is still the same temperature, my cerebral cortex is still un-vibrated and lacking in total knowingness, and my thinking processes seem as fuzzy as ever, although that last one may be because I haven't had my second cup of coffee yet.  I can't imagine listening to this stuff for an hour -- it gives new meaning to the word "monotonous." It sounds like music that was rejected by Music From The Hearts Of Space on the basis of being too ethereal.

The best part of the whole site, however, is the "Testimonials" page.  To listen to these people talk, you'd swear that listening to the keyboard noises caused major life changes, or at least multiple orgasms.  Here are a couple:
"I bought this mp3 to help me visualize and calm my mind's chatter.  I was surprised how quickly my brain winded down and melted away, leaving me in a perfect visualization state.  This recording did what it claimed."

"I been playing this frequency for a few days now in the background when I relax and it certainly does do something weird to my mind.  I will continue to play it regularly."
Myself, I don't see how your brain melting is a good thing.  But I suppose it had to already be partly melted in order to purchase this malarkey.

But here's my favorite:
"I been listening to the astral projection custom session and I can sometimes feel my body tingling and starting to shift around.  I think I will be traveling the astral plane before I know it.  Thank You Unexplainable Frequencies!"
So, evidently, there are at least a few people who have achieved positive results, although my own personal opinion is that anything they accomplished by listening to "Unexplainable Frequencies" could have been accomplished without them.  Sorry if you're one of the Satisfied Customers, but "Unexplainable Frequencies" is a lot of pseudoscientific horseshit.

Anyhow, that's today's heaping helping of woo-woo.  More people using words about which they obviously don't have the first glimmer of understanding.  I suppose we should look on the bright side, however; I never saw that they used the word "quantum."


Thursday, May 26, 2022

Monkeying around with the truth

I don't think I'll ever understand the conspiracy theorist mindset.

It's not, mind you, that I think conspiracies never happen.  It's just that the vast majority of them get found out or otherwise fall apart through gossip and sheer ineptitude.  Humans are lousy at keeping secrets -- and the more people are in the know about the secrets, the faster they get found out.  If you don't believe me (hell, maybe I'm one of the conspirators and am trying to fool you -- mwa ha ha etc.), check out this study I wrote about last year that actually showed there's an inverse relationship between the number of people in a conspiracy and how fast it collapses.
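The intuition behind that inverse relationship is simple probability.  Here's a toy version (the model and the leak rate are purely illustrative assumptions of mine, not the study's actual parameters): if each of N conspirators independently has a small chance p of spilling the beans in any given year, the odds the secret survives t years are (1 - p)^(N·t), which collapses fast as N grows.

```python
# Toy model: each of n conspirators independently leaks with probability
# p in any given year; the secret survives t years only if nobody leaks.
def survival_probability(n, p, years):
    return (1 - p) ** (n * years)

P_LEAK = 0.001  # assumed per-person, per-year leak probability

for n in (10, 100, 1_000, 10_000):
    prob = survival_probability(n, P_LEAK, 10)
    print(f"{n:>6} conspirators: P(secret lasts 10 years) = {prob:.3f}")
```

With these made-up numbers, ten conspirators keep the secret for a decade about ninety percent of the time; ten thousand conspirators essentially never do.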

Also, if there were a successful conspiracy, the likelihood of its being figured out by stupid people would be fairly low.  Which was my reaction when I read that the recent outbreak of monkeypox is already being branded a left-wing fabrication by people like Arizona State Senator Wendy Rogers, who amongst (many) other things buys into the idiotic claim that Donald Trump actually won the 2020 presidential election, and that the voter fraud that put Joe Biden into office was the work of "seditious Jews."

So it's pretty clear that Wendy Rogers has spent too much time doing sit-ups underneath parked cars.  But being crazy and stupid doesn't, unfortunately, make you quiet, so it came as no surprise to me that she is now saying the following about the monkeypox outbreak:

  • Monkeypox is an invention of the Democrats to compensate for falling approval ratings and to "reestablish tyrannical control" over rights and freedoms.  (Unfortunately for Rogers, monkeypox was discovered in 1958.)
  • The fact that the virus is spreading much faster than monkeypox usually does should make us suspicious about "what Gates, Fauci, and the rest of the so-called 'public health experts' have been up to for the last few years."  (Which ignores the fact that viruses are excellent at evolving to become more transmissible.  Oh, but wait, she doesn't believe in evolution, either.)

Then her followers started yapping along with her, and adding to the foolishness:

  • Monkeypox is a side-effect of the COVID-19 vaccine.  (It's not.)
  • It's a complete fake; the entire outbreak is a hoax.  (It's not.)
  • Okay, maybe it's not a hoax, but it's only spreading in Blacks and gay people.  (It's not.)
  • Just like COVID-19 is the same thing as the flu, monkeypox is the same thing as shingles.  (It's not, and it's fucking not.)

Unfortunately, the last bit was made considerably worse when someone found a photograph on a Mumbai-based website that was labeled as monkeypox, but was actually a photo of a shingles rash that had been taken from the website of the Queensland Health Department.  The Mumbai health website apologized for, and fixed, the error as soon as they found out about it, but by then it was too late.  Honestly, the confusion was understandable; the rashes do look superficially similar.  But the causative agent of shingles is the varicella-zoster (chickenpox) virus, which is a herpesvirus -- it isn't even in the same family as monkeypox, an orthopoxvirus.

The resemblance is only skin-deep.

But did I mention that they are not the same thing?  

Monkeypox virus [Image is in the Public Domain]

I know whereof I speak; last year, because 2021 wasn't already enough of a shitshow, I got shingles.  It was pretty mild as such things go, but still hurt like hell, giving me the characteristic "electric zaps" of pain.  But -- unlike monkeypox -- I had no fever, no swollen lymph nodes, none of the other warning signs that it was anything but ordinary shingles.

And, most significantly, when I took a week's worth of aciclovir, it went away.  As shingles does.  As monkeypox does not.

But I'm not expecting any of this to convince anyone who isn't already convinced, especially not Wendy Rogers, who appears to have a half-pound of LaffyTaffy where most of us have a brain.  As I've mentioned before, once you've decided everyone's lying to you, you're unreachable.  Anyone who tries is either a dupe or a shill, so What I Already Believed q.e.d.

Or, put another way, you can't logic your way out of a position you didn't logic your way into.

What's most upsetting, though, is how many people immediately jump on the bandwagon with horseshit like this.  Epidemics and outbreaks are scary, I get that.  We live in a big, chaotic, unpredictable world.  But sometimes stuff just happens.  Everything isn't a plot, a conspiracy, wheels within wheels.

But with people like Wendy Rogers, that's not good enough.  Not only does attributing everything bad to some grand conspiracy appeal to her mindset, it also allows her to scapegoat the people she already hated.

For me, I'd rather side with Carl Sagan, as he expressed the philosophy in his wonderful book The Demon-Haunted World: Science as a Candle in the Darkness: "For me, it is far better to grasp the universe as it really is than to persist in delusion, however satisfying and reassuring."


Wednesday, May 25, 2022

Golden years gold medal

You've probably heard the old joke about a man going in for major surgery.  "Doc," he says, right before the anesthetic is administered, "I gotta ask... after this surgery, will I be able to play the piano?"

The surgeon smiles reassuringly and says, "Of course you will."

"Awesome!" the man says.  "I've always wanted to play the piano!"

That's what came to mind when I read an article in Science called, "Will You Keep Winning Races Into Old Age?  Your Cells Hold Clues," by Tess Joosse.  I'm hoping that like the aspiring pianist, old age will put me into the winner's bracket, because since I started running semi-competitively forty years ago, I've yet to win a race.  I train, I run regularly, but I'm still (and probably always will be) a solid middle-of-the-packer.  The closest I've ever come was about three years ago, when I came in third in my age group.

To be scrupulously honest, there were only six people in my age group.  But I'll take my little victories wherever I can get them.

Me last year, about to not cross the finish line first

Be that as it may, I'm still in there trying.  I'm 61, and I know that regular exercise is essential not only for continuing physical health but mental wellbeing.  In fact, on June 8 I'm running in the Ithaca Twilight 5K, a wonderful race down the footpaths along Cayuga Lake, and because I'm recovering from a series of health setbacks I've lowered my sights to simply getting across the finish line without having to be carted over it in a wheelbarrow.

Even though the "will you keep winning?" part of the article's headline struck me as funny, the research itself is pretty cool.  Russell Hepple, a biologist at the University of Florida, wondered what was going on with people who are still competitive racers even into old age -- such as his father-in-law, who holds the record time for an eighty-year-old in the Boston Marathon.  Hepple and his colleagues did an assay on the muscle tissue of world-class senior athletes and a group of non-athletes, and found no fewer than eight hundred proteins that were produced in significantly different amounts between the two groups.  Some were higher in the athletes; others were lower.  But one obvious pattern was that over half of the proteins the study found were ones that are expressed by, or otherwise affect, the mitochondria.

For some reason, the factoid "the mitochondria are the powerhouses of the cell" is one that sticks in the minds of just about everyone who has taken high school biology, but the way they work is actually pretty amazing.  Your mitochondria are symbiotic single-celled life-forms living inside your cells -- they even have their own DNA -- and they have evolved a complex series of chemical reactions (collectively known as aerobic cellular respiration) to break down glucose and store its energy in a molecule called ATP, which is the direct driver of damn near every process living things do.  The amount of ATP created and the rate at which it's used are in an incredibly tight balance; it's estimated that you produce (and consume/recycle) your own body weight in ATP every day, which amounts to on the order of ten million ATP molecules per second, per cell.
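That "body weight in ATP per day" claim is easy to sanity-check with a back-of-envelope calculation.  The specific numbers below (a 70 kg body, ATP's molar mass of about 507 g/mol, and a rough count of 3.7 × 10¹³ cells in a human body) are my own assumptions for illustration, not figures from the study:

```python
import math

# Back-of-envelope check: does "your body weight in ATP per day" really
# work out to roughly ten million molecules per second, per cell?
# All inputs below are assumed round numbers, not measured values.
AVOGADRO = 6.022e23          # molecules per mole
body_mass_g = 70_000         # 70 kg body, in grams
atp_molar_mass_g = 507       # grams per mole of ATP
cells = 3.7e13               # rough estimate of cells in a human body
seconds_per_day = 86_400

moles_per_day = body_mass_g / atp_molar_mass_g       # ~138 mol of ATP per day
molecules_per_day = moles_per_day * AVOGADRO         # ~8e25 molecules per day
per_cell_per_second = molecules_per_day / seconds_per_day / cells

print(f"{per_cell_per_second:.1e} ATP molecules per second, per cell")
```

The result comes out in the tens of millions per second per cell -- the same order of magnitude as the figure quoted above.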

So it's no surprise that octogenarian racers have better mitochondrial function than the rest of us slobs.  In fact, the study found that 176 of the proteins studied were unique to elite senior athletes; how much of that is because of a lucky combination of genes, and how much is because their continuous training has triggered protein production that in non-athletes tapers off or stops entirely, isn't known.

Also an open question is whether administering one or more of these proteins would boost aerobic exercise capacity in older people who aren't athletes (but would like to be).  Luigi Ferrucci of the National Institute on Aging, who co-authored the study, has proposed trying this in mice and seeing if it does increase endurance and stamina, without any untoward side effects.

In any case, I suspect that no matter what I do, I'll never be a gold medalist.  That's okay with me.  I love running for running's sake, and the race community (at least around here) is super supportive of everyone regardless of their level.  (At a race I was in a while back, a twelve-year-old boy had posted himself just past the finish line, and was high-fiving each runner as they crossed.  When I stumbled my way across, he grinned at me and said, "Well done, Shirtless Tattoo Guy!"  That, to me, encapsulates the spirit of racing in my area.)

But I'll be interested to see where this research leads.  Anything I can do to stave off decline (physical or mental) as I get older is a good thing.  Until then, though, I'll keep running, and keep being okay with finishing in the middle of the pack.


Tuesday, May 24, 2022

Forensic geology

I've been interested in rocks since I was a kid.  My dad was a rockhound -- more specifically, a lapidary, who made jewelry from such semiprecious stones as turquoise, agate, and jasper.  The high point of my year was our annual trip to Arizona and New Mexico, when we split our time between searching for cool rocks in the canyons and hills of the southwestern desert and pawing through the offerings of the hundreds of rock shops found throughout the region.

Besides the simple beauty of the rocks themselves, it fascinated me to find out that with many rocks, you could figure out how and when they formed.  A lot of the gem-quality rocks and minerals my dad was looking for -- malachite, azurite, and opal amongst them -- are created by slow precipitation of layers of minerals from supersaturated water; others, such as lapis lazuli, rhodonite, and garnets form when metal-bearing rocks are metamorphosed by contact with magma far underground.

[Image licensed under the Creative Commons Olga Semiletova, Минералы горных пород, Creative Commons Attribution-Share Alike 4.0 International license]

Once I found out that the "when" part was also often knowable, through such techniques as radioisotope dating and stratigraphy, it was always with a sense of awe that I held pieces of rock in my hand.  Even around where I live now, where there are few if any of the lovely gem-quality stones you find in the southwest, there's still something kind of mind-boggling about knowing that the layers of limestone and shale forming the bedrock here in upstate New York were laid down in the shallows of a warm ocean during the Devonian Period, on the order of four hundred million years ago.

But if you think that's impressive, wait till you hear about the research out of the University of Johannesburg that was published in the journal Icarus last week.

The research centered on a stone found in the desert of western Egypt called Hypatia, given its name by Egyptian geologist Aly Barakat in honor of the brilliant, tragic polymath whose career was cut short when she was brutally murdered by a mob on the orders of Cyril, bishop of Alexandria.  (The aftermath, although infuriating, is typical of the time; Hypatia was largely forgotten, while Cyril went on to be canonized as a saint by the Roman Catholic Church.)  The stone, fittingly considering Hypatia's contributions to astronomy, turns out to be extraterrestrial in origin, falling to the surface of the Earth as a meteorite.

But "extraterrestrial" is a big place, as it were.  Where exactly did it form?  Chemical tests on the rock found that it didn't match the composition of any known asteroid or comet; then, the mystery deepened when it was found to contain nickel phosphide, which has never been found on any solid material tested in the entire Solar System.

Further tests only made the rock seem more anomalous.  Silicon, second only to oxygen as the most common element in the Earth's crust (a little over 28%, to be exact), was almost absent, as were calcium, chromium, and manganese; on the other hand, there was far more iron, sulfur, phosphorus, copper, and vanadium than you'd expect.  Not only were the ratios far off from those of rocks in our Solar System, they didn't match the composition of interstellar dust, either.

The researchers decided to go at it from the other direction.  Instead of trying to find another sample that matched, they looked at what process would create the element ratios that Hypatia has.  And they found only one candidate that matched.

A Type Ia supernova.

Type Ia supernovas occur in binary star systems, when one of the stars is relatively low mass (on the order of the Sun) and ends its life as a super-compact white dwarf star.  White dwarf stars have an upper limit on their mass (specifically, about 1.4 times the mass of the Sun) called the Chandrasekhar limit, after Nobel Prize-winning astronomer Subrahmanyan Chandrasekhar.  The reason is that at the end of a star's life, when the outward pressure caused by the fusion in the core drops to the point that it can't overcome the inward pull of gravity from the star's mass, the star begins to collapse until some other force kicks in to oppose it.  In white dwarfs, that force is electron degeneracy pressure -- a quantum mechanical effect, a consequence of the Pauli exclusion principle, that prevents the electrons from being squeezed into the same state.  In stellar remnants more than 1.4 times the mass of the Sun, electron degeneracy pressure isn't powerful enough to halt the collapse.  (The other two possibilities, for progressively higher masses, are neutron stars and black holes.)
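The 1.4-solar-mass figure isn't arbitrary; it falls out of the physics.  As a rough sketch (using the standard textbook formula, with a mean molecular weight per electron of 2, appropriate for a carbon-oxygen white dwarf), you can evaluate the Chandrasekhar mass from fundamental constants:

```python
import math

# Rough evaluation of the Chandrasekhar mass from the standard formula
#   M_Ch ≈ (√(3π)/2) · ω · (ħc/G)^(3/2) / (μ_e · m_u)²
# where ω ≈ 2.018 comes from the Lane-Emden solution and μ_e = 2 is the
# mean molecular weight per electron for a carbon-oxygen white dwarf.
hbar = 1.0546e-34   # J·s, reduced Planck constant
c = 2.998e8         # m/s, speed of light
G = 6.674e-11       # m³/(kg·s²), gravitational constant
m_u = 1.6605e-27    # kg, atomic mass unit
M_sun = 1.989e30    # kg, solar mass
omega = 2.018
mu_e = 2.0

m_ch = (math.sqrt(3 * math.pi) / 2) * omega \
       * (hbar * c / G) ** 1.5 / (mu_e * m_u) ** 2

print(f"Chandrasekhar mass ≈ {m_ch / M_sun:.2f} solar masses")
```

The calculation lands right around 1.4 solar masses, exactly the limit described above.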

In binary stars, when one of the members becomes a white dwarf, the gravitational pull of its extremely compact mass begins to siphon material from its companion.  This (obviously) increases the white dwarf's mass.  Once it passes the Chandrasekhar limit, the white dwarf resumes its collapse.  The temperature of the white dwarf skyrockets, and...

... BOOM.

The whole thing blows itself to smithereens.  Fortunately for us, really; a lot of the elements that make up the Solar System were forged in violent events such as the various kinds of supernovas.  But the models of the relatively rare Type Ia (thought to happen only once or twice a century in a typical galaxy of a hundred billion stars) generate a distinct set of elements -- and the percent composition of Hypatia matches the prediction perfectly.

So this chunk of rock in the Egyptian desert was created in the cataclysmic self-destruction of a white dwarf star, probably long before the Solar System even formed.  Since then it's been coursing through interstellar space, eventually colliding with our obscure little planet in the outskirts of the Milky Way.

When I was twelve, holding a piece of billion-year-old limestone from the Grand Canyon, little did I realize how much more amazing such origin stories could get.

I think the real Hypatia would have been fascinated.


Monday, May 23, 2022

Behind the mirror

I know I've snarked before about how unbearably goofy the old 1960s television show Lost in Space was, but I have to admit that every once in a (long) while, they nailed it.  And one of the best examples is the first-season episode "The Magic Mirror."

Well, mostly nailed it.  The subplot about how real girls care about makeup and hair and being pretty is more than a little cringe-inducing.  But the overarching story -- about mirrors being portals to a parallel world, and a boy who is trapped behind them because he has no reflection -- is brilliant.  And the other-side-of-the-mirror world he lives in is hauntingly surreal.

I was thinking about this episode because of a paper that appeared in Physical Review Letters last week entitled "Symmetry of Cosmological Observables, a Mirror World Dark Sector, and the Hubble Constant," by Francis-Yan Cyr-Racine of the University of New Mexico, and Fei Ge and Lloyd Knox of the University of California, Davis.  What this paper does is offer a possible solution to the Hubble constant problem -- that the rate of expansion of the universe as predicted by current mathematical models is significantly smaller than the actual measured expansion rate.

What Cyr-Racine, Ge, and Knox propose is that there is an unseen "mirror world" of particles that coexists alongside our own, interacting only through gravity but otherwise invisible to detection.  At first, I thought they might be talking about something like dark matter -- a form of matter that only (very) weakly interacts with ordinary matter -- but it turns out that what they're saying is even weirder.

"This discrepancy is one that many cosmologists have been trying to solve by changing our current cosmological model," Cyr-Racine told Science Daily "The challenge is to do so without ruining the agreement between standard model predictions and many other cosmological phenomena, such as the cosmic microwave background...  Basically, we point out that a lot of the observations we do in cosmology have an inherent symmetry under rescaling the universe as a whole.  This might provide a way to understand why there appears to be a discrepancy between different measurements of the universe's expansion rate.  In practice, this scaling symmetry could only be realized by including a mirror world in the model -- a parallel universe with new particles that are all copies of known particles.  The mirror world idea first arose in the 1990s but has not previously been recognized as a potential solution to the Hubble constant problem.  This might seem crazy at face value, but such mirror worlds have a large physics literature in a completely different context since they can help solve important problem in particle physics.  Our work allows us to link, for the first time, this large literature to an important problem in cosmology."

The word "important" is a bit of an understatement.  The Hubble constant problem is one of the biggest puzzles in physics; why theory and observation are so different on this one critical point, and how to fix the theory without blowing to smithereens everything that the theory does predict correctly.  "It went from two and a half Sigma, to three, and three and a half to four Sigma. By now, we are pretty much at the five-Sigma level," said Cyr-Racine.  "That's the key number which makes this a real problem because you have two measurements of the same thing, which if you have a consistent picture of the universe should just be completely consistent with each other, but they differ by a very statistically significant amount.  That's the premise here, and we've been thinking about what could be causing that and why are these measurements discrepant?  So that's a big problem for cosmology.  We just don't seem to understand what the universe is doing today."

I know that despite my background in science, I can have a pretty wild imagination.  It's an occupational hazard of being a speculative fiction writer.  I hear some new scientific finding, and immediately start putting some weird spin on it that, though it might be interesting, is completely unwarranted by the actual research.  But look at Cyr-Racine's own words: a parallel universe with new particles that are all copies of known particles.  I think I'm to be excused for thinking of "The Magic Mirror" and other science fiction stories about ghostly worlds coexisting, unseen, with our own.

I'm not going to pretend to understand the math behind the Cyr-Racine et al. paper; despite having a B.S. in physics, academic papers in the discipline usually lose me in the first paragraph (if not the abstract).  But it's a fascinating and spooky idea.  I doubt if what's going on has anything to do with surreal worlds behind mirrors and boys who are trapped because they have no reflection, but the reality -- if it bears up under analysis -- isn't a whole lot less weird.


Saturday, May 21, 2022

A tincture of madness

There's long been a supposed connection between being highly creative and being mentally ill.  The list of individuals who were both is a long one.  Ernest Hemingway, Georgia O'Keeffe, Hermann Hesse, Maurice Ravel, F. Scott Fitzgerald, Pyotr Ilyich Tchaikovsky, Edgar Allan Poe, Jackson Pollock, Cole Porter, Vincent van Gogh, and Robert Schumann all suffered from varying degrees of mental illness, most of them from severe depression, schizophrenia, or bipolar disorder.  More than one of these spent the last years of life in a mental institution, and more than one committed suicide.

People who have expressed their creativity in technical fields show the same tendencies.  Physicist Ludwig Boltzmann killed himself; Charles Darwin seems to have had severe agoraphobia, and spent most of the later years of his life in virtual isolation; and the wildly brilliant mathematician Kurt Gödel was a paranoid schizophrenic who became so convinced people were poisoning his food that he finally stopped eating completely and starved to death.

"The only difference between myself and a madman," Salvador Dalí famously quipped, "is that I am not mad."  Two thousand years earlier, the Roman writer Seneca said, "There is no genius without a tincture of madness."

[Image licensed under the Creative Commons Mental illness silhouette, Paget Michael Creelman, Creative Commons Attribution-Share Alike 4.0 International license]

Research supports the idea that there is a fundamental connection between creativity and mental illness.  One of the chief investigators into this link is Fredrik Ullén, of the Karolinska Institutet in Stockholm, who not only showed that creativity correlates with genetic markers for both schizophrenia and bipolar disorder, but demonstrated a connection between creativity and the dopamine (pleasure/reward) system in the brain -- the same system that is implicated in several forms of mental illness, including schizophrenia, obsessive-compulsive disorder, and a tendency toward addiction.

Ullén administered a test designed to measure a subject's capacity for creative thinking, specifically for developing more than one solution to the same problem or using non-linear solution methods to arrive at an answer.  He then analyzed the brain activity of the individuals who scored the highest, and found that across the board, they had a lower density of dopamine receptors in a part of the brain called the thalamus, one of the main "switchboards" of the higher brain, responsible for sorting and processing sensory stimuli.

The implication is that creative people don't have as rigid a sorting mechanism as other, less creative people -- that having a built-in deficiency in your relay system may help you to arrive at solutions to problems that others might not have seen.

The connection between the thalamus, creativity, and sorting issues is supported by a different bit of brain research, which found that a miswiring of the thalamus is implicated in another strange neurological condition: synesthesia.  In synesthesia, signals from the sensory organs are misrouted to the incorrect interpretive centers in the cerebrum, so an auditory signal (for example) might be received in the visual cortex.  As a result, you would "see sounds."  Other senses can be crosswired as well -- the seminal study of the condition is described in Richard Cytowic's book The Man Who Tasted Shapes.

Synesthesia is apparently also much more common among the creative.  Alexander Scriabin, the early twentieth-century Russian composer, wrote his music as much from how it looked to him as how it sounded.  He described a sensation of color being overlaid on what he was actually seeing when he heard specific combinations of notes.  The colors were consistent; C# minor, for example, was always green, Eb major always magenta.  And although Scriabin's synesthesia was perhaps the most intense, he is not the only composer who was synesthetic; the evidence is strong that Liszt, Rimsky-Korsakov, and Olivier Messiaen had this same miswiring.

The studies by Ullén and others have now taken the first steps toward connecting these physiological manifestations with the phenomenon of creativity itself.  "Thinking outside the box," Ullén said, "may be facilitated by having a somewhat less intact box."


Friday, May 20, 2022

Pitch perfect

It's funny, sometimes, what we don't know.  I've played the flute for almost forty years -- started out self-taught (bad experience with elementary school band), then was lucky enough to study with a brilliant classical flutist named Margaret Vitus for five years when I lived in Seattle.  I've since played in three different bands and a community orchestra, and besides the classical repertoire, I've become fairly proficient in Celtic, English country dance, and Balkan music.

But it wasn't until the last band I was in, the trio Crooked Sixpence, that I actually figured out some peculiarities of my own instrument.  I was fortunate enough to play with Kathy Selby, who is not only a brilliant Celtic fiddler but a physicist (then teaching at Cornell University).  Kathy taught a class called "The Physics of Music," which combined her two areas of expertise -- and the class looked at, amongst other things, how specific instruments work.

So it seemed natural for me to ask her something that had always puzzled me: why flutes go sharp once they warm up.  The difference is greater (obviously) when it's cold out -- so the temperature increase the instrument experiences once I start playing it is bigger -- but it's noticeable even on a warm day.  At first glance, it seemed to make no sense.  Objects expand when they warm up, so (I thought) thermal expansion would make the tube longer, and the pitch should drop, making the instrument go flat.  That flutes actually go sharp seemed completely contrary to my intuition.

And of course, she immediately knew the answer: it's because sound travels faster in warm air.  The frequency of a wave equals its speed divided by its wavelength, and the wavelength is fixed by the length of the tube; so if the sound wave is moving faster, its frequency goes up -- and so does its pitch.  The thermal expansion of the tube is minuscule, so any drop in pitch from the tube becoming longer is negligible.

I also found out from Kathy -- when I attended a free lecture she gave on musical acoustics -- why a bunch of different instruments playing the same note all sound different.  I knew that the fundamental note (let's say it's A above middle C) has to have a wavelength that "fits" the tube (or string, or whatever) of the instrument playing it; for a tube open at both ends, like a flute, half the fundamental's wavelength equals the length of the tube.  A sound of that wavelength will set up a standing wave that then sets the air moving and projects outward toward the listener.

But a flute playing an A above middle C and a fiddle playing an A above middle C sound completely different.  The reason, I learned, is that there is more than one wavelength that fits a particular length:

[Image is licensed under the Creative Commons Allowed and forbidden standing waves, File:High School Chemistry.pdf, CK-12 Foundation]

The ones on the left "fit"; the ones on the right don't.  The top one on the left is the fundamental pitch.  The ones further down are called overtones, and they're the key to why instruments sound different.  The greater the number and amplitude of the overtones, the more the sound wave the instrument produces deviates from a simple sine curve.
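The waves that "fit" follow a simple rule.  As a sketch, assuming a tube open at both ends (so the n-th harmonic has wavelength 2L/n and a frequency of n times the fundamental), here are the first few allowed standing waves for a tube tuned to A440:

```python
# Allowed standing waves in a tube open at both ends: the n-th harmonic
# has wavelength 2L/n and frequency n times the fundamental.
v = 343.0              # m/s, speed of sound in air at ~20 °C (assumed)
L = v / (2 * 440.0)    # tube length whose fundamental is A440: ~0.39 m

harmonics = [(n, 2 * L / n, n * 440.0) for n in range(1, 5)]
for n, wavelength, freq in harmonics:
    print(f"harmonic {n}: wavelength {wavelength:.3f} m, frequency {freq:.0f} Hz")
```

Every wavelength in between those values is "forbidden" -- it can't form a standing wave in the tube, which is why the overtone series is a discrete ladder of pitches rather than a continuum.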

Sound waveforms, top to bottom -- flute, piano, trumpet.  [Image from Doug Davis, 2002]

As you can see, flute tones are pretty simple, very close to a sine curve.  But look at the trumpet waveform.  Same fundamental pitch -- the peaks and troughs of the waveform line up with the flute's and the piano's -- but the shape is entirely different.  That's because of the number, and intensity, of the overtones.  (Instruments that have forced vibrations from a bow being dragged against the string, like violins and cellos, have a lot more overtones -- and thus more complex waveforms -- than instruments where a string is plucked or struck, like guitars and pianos.  The same comparison holds for double-reed wind instruments like oboes and bassoons, which produce way more complex sound waveforms than flutes do.)

The whole topic comes up because of a paper presented recently at the annual meeting of the American Physical Society, which contained the solution to a long-standing question in the physics of music: why do the pipes of an organ play a tone considerably lower in pitch than the sound wave that should fit the length of the pipe?

Organ builders have known about this for over a hundred years; to get an organ pipe to sound the note you intend, you have to build it a little shorter than you'd expect.  (The "end correction" you have to use to make the pipe's pitch match what physics would predict from its length is equal to 0.6 times the radius of the pipe.)  But why?  Shouldn't a wave of that length be a little too long for the pipe, and be one of the "forbidden standing waves" shown on the right side of the first figure?

The key to the answer was discovered, quite by accident, by a Swiss organ builder named Bernhardt Edskes.  He was working on repairing an organ, and noticed that a tiny piece of gold plating had flaked off one of the pipes.  He only saw it because when he played that pipe, the flake floated above the top of the pipe.  But since there was air blowing up the pipe, why wasn't the flake completely blown away?

Leo van Hemmen, a physicist at the Technical University of Munich, realized that both the "end correction" question and Edskes's mysterious floating piece of gold were the result of the same phenomenon.  When an organ pipe is played, the rising column of air forms a stable vortex above the top of the pipe.  When van Hemmen used smoke to make the vortex visible and found that its height was exactly 0.6 times the radius of the pipe, he knew he'd solved the puzzle.  The spinning cylinder of air creates a longer tube for the sound to resonate in -- so the wavelength of the lower-pitched note fits perfectly.
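To see how much the vortex matters, here's a minimal sketch using the simple open-pipe model (fundamental frequency f = v/2L) and the 0.6-times-the-radius correction described above.  The pipe dimensions are made up for illustration:

```python
# Effect of the end correction on an open organ pipe's pitch.
# Naive model: fundamental f = v / (2L).  With the vortex acting as an
# extension of the tube, the effective length becomes L + 0.6·r.
# Pipe dimensions below are illustrative, not from the paper.
v = 343.0      # m/s, speed of sound in air at ~20 °C
L = 0.5        # m, nominal pipe length
r = 0.02       # m, pipe radius

f_naive = v / (2 * L)                 # what the bare length predicts
f_corrected = v / (2 * (L + 0.6 * r)) # with the vortex "extension"

print(f"naive: {f_naive:.1f} Hz, with end correction: {f_corrected:.1f} Hz")
```

For a pipe of these proportions the correction lowers the pitch by several hertz -- far more than a listener's tolerance for being out of tune, which is why organ builders learned to cut their pipes short long before anyone knew the reason.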

Humans have been making music for tens of thousands of years, and I find it fascinating that we are only now understanding the intricacies of what's going on inside the instruments we play.  It may be that we don't need to know the physics of music to enjoy it, but for me, it's fun to find out how complex these things are -- and that all As above middle C are not created equal.


Thursday, May 19, 2022

Words, words, words

In Dorothy Sayers' novel Gaudy Night, set (and written) in 1930s England, a group of Oxford University dons are the targets of threats and violence by a deranged individual.  The motive of the perpetrator (spoiler alert!) turns out to be that one of the dons had, years earlier, caught the perpetrator's spouse in academic dishonesty, and the spouse had been dismissed from his position, and ultimately committed suicide.

Near the end of the novel, the main character, Harriet Vane, experiences a great deal of conflict over the resolution of the mystery.  Which individual was really at fault?  Was it the woman who made the threats, a widow whose grief drove her to threaten those she felt were smug, ivory-tower intellectuals who cared nothing for the love and devotion of a wife for her husband?  Was it her husband, who knowingly committed academic fraud?  Or was it the don who had exposed the husband's "crime" -- which was withholding evidence contrary to his thesis in a paper?  Is that a sin that's worth a life?

The perpetrator, when found out, snarls at the dons, "... (C)ouldn't you leave my man alone?  He told a lie about somebody who was dead and dust hundreds of years ago.  Nobody was the worse for that.  Was a dirty bit of paper more important than all our lives and happiness?  You broke him and killed him -- all for nothing."  The don whose words led to the man's dismissal, and ultimately his suicide, says, "I knew nothing of (his suicide) until now...  I had no choice in the matter.  I could not foresee the consequences... but even if I had..."  She trails off, making it clear that in her view, her words had to be spoken, that academic integrity was a mandate -- even if that stance left a human being in ruins.

It's not, really, a very happy story.  One is left feeling, at the end of the book, that the incident left only losers, no winners.

The same is true of the tragedy that happened in Buffalo, New York last Saturday.

The accused shooter, eighteen-year-old Payton Gendron, drove for two and a half hours from his home in Conklin, New York, allegedly choosing his destination by searching for the neighborhood with the highest proportion of Black residents.  He is clearly a seriously disturbed individual.  While in high school, he was investigated by Broome County police for threatening his classmates; ultimately the investigation was closed, with Gendron saying he had been "joking."  One of his former teachers reported that she had asked him about his plans after graduation, and he told her, "I want to murder and commit suicide."  It's a little appalling that someone like him was able to procure body armor and three guns -- including an XM-15 Bushmaster semi-automatic rifle, which is banned in New York state -- without setting off enough red flags to stop a freight train.  I'm not intending to discuss the issue of gun laws, however.  What I want to look at is what created Payton Gendron.  Because at the center of his rage were nothing more than words.  Words, words, words.

He wrote a 180-page manifesto that mirrors the "Great Replacement" theory of Renaud Camus, which holds that leftists are deliberately crafting policy to replace people of White European descent with immigrants and People of Color.  Gendron made no secret of his views and his intentions.  He had accounts on the social media outlets Discord and Twitch; on the former he kept a to-do list of preparations for the attack, and he used the latter to livestream the attack itself.  He identified all people of color as the danger, not just immigrants -- after all, the Black people he deliberately chose as targets were just as much American citizens as he is, and almost certainly their ancestors had been here for hundreds of years.

Gendron himself has no problem explaining why he did what he did.  He told investigators, "I simply became racist after I learned the truth."

But he didn't come up with that "truth" himself; others put it there.  Others fed him those lies and distortions, and in his twisted, faulty logic he bought them wholesale.  Gendron himself is, of course, ultimately the one responsible for the shootings; but what blame lies with the people who, whatever their motives, broadcast the ideologies he espoused?

Tucker Carlson, for example, makes his opinion crystal-clear.  Last year he was interviewed by Megyn Kelly for a radio broadcast, and he said, "'The Great Replacement' theory is, in fact, not a theory.  It’s something that the Democrats brag about constantly, up to and including the president, and in one sentence, it’s this: Rather than convince the current population that our policies are working and they should vote for us as a result, we can’t be bothered to do that.  We’re instead going to change the composition of the population and bring in people who will vote for us."

He's not the only one.  Representative Steve King of Iowa said, "The idea of multiculturalism, that every culture is equal -- that’s not objectively true…  We’ve been fed that information for the past twenty-five years, and we’re not going to become a greater nation if we continue to do that."  Texas Agriculture Commissioner Sid Miller posted a photograph of George Soros on Facebook with the caption, "Start the race war."  Fox News host Laura Ingraham isn't exactly subtle, either.  "Massive demographic changes have been foisted upon the American people and they're changes that none of us ever voted for and most of us don't like," she said on her show in 2019.  "From Virginia to California, we see stark examples of how radically in some ways the country has changed.  Now, much of this is related to both illegal and in some cases, legal immigration that, of course, progressives love."

After the shooting, people like Carlson were blasted for using their positions as pundits to stoke fear, rage, and violence -- and very quickly, they responded in kind, absolving themselves of any responsibility.  "The truth about Payton Gendron does tell you a lot about the ruthlessness and dishonesty of our political leadership," Carlson said, the day after the shooting.  "Within minutes of Saturday’s shooting, before all of the bodies of those ten murdered Americans had even been identified by their loved ones, professional Democrats had begun a coordinated campaign to blame those murders on their political opponents.  'They did it!' they said, immediately...  So, what is hate speech?  Well, it’s speech that our leaders hate.  So because a mentally ill teenager murdered strangers, you cannot be allowed to express your political views out loud.  That’s what they’re telling you.  That’s what they’ve wanted to tell you for a long time."

Which packs a lot of terrifying rhetoric into one paragraph.  First, no sensible person, left, right, or center, defines hate speech as "speech our leaders hate."  The term has a clear standard definition: "abusive or threatening speech or writing that expresses prejudice against a particular group, especially on the basis of race, religion, or sexual orientation."  Second -- sure, Gendron is mentally ill, but that's not why he targeted Black people for murder.  Lots of people have mental illness (I've blogged here more than once about my own struggles with it), and very few of them murder people.  Blaming mental illness for Gendron's actions is just a way for Carlson to deflect any criticism leveled at him for the results of what he has said vehemently and repeatedly.

Third, virtually no one -- once again, regardless of political stripe -- is trying to stop people from expressing their political views.  The vast majority of us agree with British writer Evelyn Beatrice Hall: "I disapprove of what you say, but I will defend to the death your right to say it."  Conservative commentator and former GOP Representative Joe Walsh, who -- though we'd probably disagree on a lot of things -- is one of the most honest, honorable voices we have today, said, "Try being nonpartisan for a day.  Call out stuff that’s wrong, stupid, or dishonest no matter where it comes from.  Even if it comes from your side. Just try it."  And he summarized Tucker Carlson's self-defense as follows: "[Carlson basically told] his audience that THEY are the victims.  Not the ten innocent souls killed in Buffalo.  Nope, Tucker’s audience are the real victims here...  [His attitude is] 'I don’t even know what white replacement theory is.  All I know is America is becoming less and less white.  And that’s a really bad thing.  But that makes me a racist?  For just stating facts?'"

[Image licensed under the Creative Commons Ivan Radic, A colorful Stop Racism sign (50115127871), CC BY 2.0]

Of course, what Carlson, Ingraham, et al. are trying to accomplish comes down to two things: to use emotionally charged language to make their own opinions sound unassailable, and to put such a negative spin on their opponents' thinking that listeners are left believing that only morons could possibly agree with them.

I'm appalled not just because these political hacks are using this tragedy to hammer their own views into an increasingly polarized citizenry, but because they are doing it willfully blind to the end results of their words, just like the Oxford don in Gaudy Night whose dedication to the nth degree of academic integrity made her blind to the human cost of her actions.  Words are tools, and they are using them with as much thought and responsibility as a five-year-old with a chainsaw.

I will end with a devout hope for healing for the Buffalo community that has lost ten of its people, and that the families of those who died will be able to find consolation in the outpouring of sympathy from the vast majority of Americans who still value compassion over political rhetoric.  And to the ideologues who are using this tragedy as a platform to defend their own repugnant views, I can only say: shut the hell up.


Wednesday, May 18, 2022

Planet cupcake

I just found out that Neal Adams, most famous as a comic book artist and creator of characters for DC Comics, died a couple of weeks ago at the age of eighty.

I'm not an aficionado of superhero stories, either in comic/graphic novel or movie form, so I didn't know much about Adams's contributions to that genre other than that he was involved somehow.  I knew Adams from a contribution to a much less publicized field: loopy pseudoscience.

As you all know, the Earth is a geologically active place.  Most scientists attribute this to plate tectonics, the shifting of Earth's geological plates relative to one another.  Their explanation is that these processes have been going on throughout Earth's history, driven by magmatic convection currents in the Earth's mantle, and that while active plate margins are expected to be -- well, more active -- any apparent clustering of geographically-separated tectonic events is simple coincidence, insignificant in the bigger picture.

Neal Adams disagreed.

In a video that you really should watch in its entirety, Adams called our attention to phenomena such as the following:
  • The formation of a three-kilometer-long crack in the ground in Huacullani, Chucuito Province, Peru, following an earthquake
  • The opening of a wedge-shaped, 500-meter-long, 60-meter-deep rift in Ethiopia, along the Great Rift Valley
  • The sudden creation of a crack in the ground in Iceland, and the subsequent draining of Lake Kleifarvatn into the fissure
  • The presence of a deep hydrothermal vent in the Mid-Cayman Rise, a spreading center in the middle of the Caribbean Sea
  • Increasing tension along the San Andreas Fault, causing cracks and fissures to form
Adams took these stories, and many others like them, and decided that the conventional explanation -- that all of these places are on plate margins, so cracks in the ground are to be expected -- is wrong.  And in a classic case of adding two plus two and getting 113, he deduced the following:

The Earth is expanding.

Yes, just like a cupcake in the oven, the Earth is getting bigger, and as it does, its surface cracks and splits.  The tectonic plates are a mere side-effect of this phenomenon, and are basically the broken up surface of the cupcake, pulled apart as the inside swelled.  Now, a cupcake, of course, is only increasing in volume, as the air bubbles in the batter expand; its mass remains the same.  Is that what's happening here?  Some kind of planetary dough rising?

Don't be silly.  We haven't trashed nearly enough physics yet.  It's not just volume; the Earth is actually gaining mass.

Wait, you might be saying: what about the Law of Conservation of Mass, which is strictly enforced in most jurisdictions?  Simple, Adams said.  No problemo.  Physicists have demonstrated that empty space can give rise to electron/positron pairs without any violation of physical law, because of the presence of "vacuum energy."  "Empty space" is actually, they say, a roiling foam of particles and antiparticles, most of which annihilate each other immediately.

So, Adams said, this sort of pair-production is happening inside the Earth.  Therefore it's gaining mass.  And expanding.

Of course, Adams conveniently ignored the fact that if this was happening, half of the mass thus produced would be antimatter.  If the Earth's middle was producing matter and antimatter fast enough to pop open cracks on the surface, the antimatter would follow the E = mc^2 rule (also strictly enforced) and blow us to smithereens.  After all, you may recall from scientific documentaries such as the original Star Trek what happens when antimatter containment is lost -- Captain Kirk strikes a dramatic pose, usually with his shirt ripped open to expose one or more nipples, and the show breaks to a commercial.  

And heaven knows we don't want any of that to happen.
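Just for fun, it's easy to put a number on how catastrophic that would be.  Here's a quick back-of-the-envelope sketch (my own illustrative numbers, not anything from Adams's video), using the fact that matter/antimatter annihilation converts the entire combined mass to energy:

```python
# A rough sense of scale (my own numbers, not from Adams's video):
# if pair production inside the Earth created equal amounts of matter
# and antimatter, each kilogram of antimatter annihilating with a
# kilogram of ordinary matter would release E = m*c^2 worth of energy
# for the combined two kilograms.

C = 2.998e8             # speed of light, m/s
MEGATON_TNT = 4.184e15  # joules per megaton of TNT

def annihilation_yield_megatons(antimatter_kg):
    """Energy (in megatons of TNT) released by annihilating the given
    mass of antimatter with an equal mass of ordinary matter."""
    total_mass = 2 * antimatter_kg  # matter + antimatter both convert
    return total_mass * C**2 / MEGATON_TNT

print(f"{annihilation_yield_megatons(1.0):.0f} megatons")
# prints: 43 megatons
```

Forty-odd megatons per kilogram is in the neighborhood of the largest nuclear device ever detonated, so a planet quietly minting antimatter in geologically significant quantities would not stay quiet for long.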

So there are some problems with Adams's theory.  But this hasn't stopped websites from popping up supporting the Cupcake Earth Hypothesis and touting how amazing Adams's video is.  Apparently the argument is that the claims in the video must be true because (1) it has cool animation of the Earth shrinking and the continents fitting together as you go back in time, and (2) it uses dramatic music from 2001: A Space Odyssey.

Notwithstanding those points in its favor, it raises a few key questions in my mind:
  • What happened to all of the oceans?
  • If the Earth really was (let's say) a quarter as massive 100 million years ago, its surface gravity would have been substantially weaker -- at constant density, a quarter of the mass works out to only about 63 percent of today's pull, with the escape velocity reduced by the same factor.  Which would have resulted in a good bit of our atmosphere leaking out into space, not to mention herds of enormous dinosaurs bouncing about the landscape in the fashion of Neil Armstrong on the surface of the moon.  So why do we still have an atmosphere?
  • Why am I spending so much time and effort addressing this goofy theory?
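The second question lends itself to a quick sanity check.  Here's a minimal sketch (my own, under the assumption that the Earth's average density stays constant as it gains mass, so the radius scales as the cube root of the mass):

```python
# A quick check of the quarter-mass Earth (my own sketch, not from
# Adams's video).  Assumption: average density stays constant as the
# planet gains mass, so radius scales as mass**(1/3).

def scaled_earth(mass_fraction):
    """Return (radius, surface_gravity, escape_velocity) as fractions
    of today's values for an Earth with the given fraction of its
    present mass and an unchanged average density."""
    radius = mass_fraction ** (1 / 3)         # M = (4/3) * pi * rho * R^3
    gravity = mass_fraction / radius ** 2     # g = G * M / R^2
    escape = (mass_fraction / radius) ** 0.5  # v_esc = sqrt(2 * G * M / R)
    return radius, gravity, escape

r, g, v = scaled_earth(0.25)
print(f"radius: {r:.2f}, gravity: {g:.2f}, escape velocity: {v:.2f}")
# prints: radius: 0.63, gravity: 0.63, escape velocity: 0.63
```

All three ratios collapse to mass_fraction to the one-third power, which is why everything comes out to the same 0.63 -- a noticeably lighter Earth holds its air noticeably less well, just not four times less well.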
As for the last question: I recognize that I can't debunk every silly idea in the world, and in fact I had originally intended to write about marginally more reasonable claims, such as sightings of sea serpents off the coast of England.  But then I found out that Adams had gone to that big E-Z Bake Oven in the sky, and I felt I at least owed him that much.  So I'll end by passing along my condolences to his family, friends, and fans, and maybe today I'll eat a cupcake or two in his memory.


Tuesday, May 17, 2022


For many years we owned two cats, Puck and Geronimo.

Imagine two soft, gentle, affectionate, fluffy kitties.  Puck and Geronimo were the exact opposite of what you just pictured.

What neither of our cats looked even remotely like.  [Image courtesy of the Creative Commons Nicolas Suzor from Brisbane, Australia, Cute grey kitten, CC BY-SA 2.0]

Puck and Geronimo were siblings, both long-bodied, tough, lean, and solid black.  Puck had some odd features, though.  She had one single white whisker accentuating a face that was already kinda... off.  Her eyes didn't quite line up, so you never could be 100% sure of where she was looking.  She had one broken fang, so her tongue frequently protruded from the side of her mouth.  Plus, her voice sounded like a creaky wheel.  She was actually quite a sweet, affectionate cat, but even dedicated cat lovers had to admit she looked like she had a screw loose.

Geronimo, on the other hand, hated everyone, with two exceptions: (1) my wife; and (2) our dog, Grendel.  When we adopted Grendel, we were assured by the shelter that he was great with cats.  But shelter staff -- no insult intended, they do amazing work -- can sometimes overplay animals' good qualities in the interest of getting them adopted, so when we brought him home, we introduced him to the cats on leash, with me hanging on to my end of it like grim death.  Puck, he ignored completely.  Then he came up and sniffed Geronimo, who sniffed him back (without hissing, which was Geronimo's primary way of communicating with the entire world).  So I tentatively relaxed my end of the lead...

... and Grendel lifted his big front paw and body-slammed Geronimo to the floor.

I leaped forward, yelling, "Noooooooo....!!!!"  But then Grendel started to lick Geronimo's face.  Geronimo, although still pinned to the ground, started purring.  And thus was born the only interspecies gay romance I've ever witnessed.  They were boyfriends for as long as we had them.

But other than those exceptions, Geronimo viewed the entire world with something between haughty disdain and utter loathing.  Sometimes I'd look up from what I was doing to find Geronimo staring at me, his yellow eyes narrowed to slits, and he was clearly thinking, "I am going to disembowel you in your sleep."

What brings all this up is a paper that appeared last week in Scientific Reports about some research done at Kyoto University.  A team led by animal behavioral psychologist Saho Takagi ran a clever set of experiments to see if cats could learn not only their own names but also the names of other cats, and their results suggest that the answer is yes.

They worked with two sets of cats -- household pets, and "café cats."  Apparently in Japan, it's common to have cats living in cafés, for the benefit of patrons who would like to pet cats while they have their coffee and pastries, or at least have cats glaring at them and making harsh judgments about their general appearance.  They had their test subjects "softly restrained" by volunteers, who I hope were wearing body armor at the time, and the cats were given vocal stimuli (the cats' own names, the names of other cats living in the same place, and neutral words falling into neither category), along with photographs of different cats, sometimes the photograph of the cat being named, sometimes not.

They found that the cats tended to look more quickly and for a longer duration at photographs when the photograph was of the cat being named.  It was evident that the cats tested did indeed know the names of the cats that cohabited with them.  (Except for one test subject who "completed only the first trial before escaping from the room and climbing out of reach.")

I found these data mildly surprising, considering that our own cats gave no evidence of knowing either their own names or each other's.  Geronimo usually responded to being called as follows:

Us:  Geronimo!!!

Geronimo:  Fuck you.

Us:  Geronimo, come get your dinner!

Geronimo:  Fuck you.

Us:  C'mon, kitty kitty kitty!

Geronimo:  Fuck you.

Us: We have a plate of fresh salmon for you!

Geronimo:  Fuck you...  Salmon?  Well, okay, maybe this time.

So I don't know how we'd have been able to tell if he did know his name.

But all of this does point out something I've always thought, which is that a lot of animals are way smarter than we give them credit for.  I know one of our current dogs, Guinness, always gives us this incredibly intent look when we talk to him, as if he's trying his hardest to understand every word we're saying.  Our other dog, Cleo, spends a lot of time ignoring us, but she's a Shiba Inu, which in my opinion is a cat wearing a dog suit.

So okay, maybe that doesn't exactly support the contention that our pets are really smart.  But my point stands.

In any case, that's our cool piece of animal behavior research for today.  If you are the owner of two or more cats, see if you can figure out if they know each other's names.

If any of your cats have a temperament like Geronimo's, you might want to have fresh salmon handy.