Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Friday, October 14, 2022

Nonlocal and unreal

This year, the Nobel Prize in Physics went to three scientists who have proven beyond a shadow of a doubt that our common-sense perception of how the universe works is very, very far off from the reality.

What that reality actually is remains to be seen.

John Clauser, Alain Aspect, and Anton Zeilinger were the recipients of the award this year "for experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science."  Their experiments established a mind-boggling fact: the universe is not locally real.

What that means, in non-technical language, is harder to pin down.  In physics, the concept of locality has to do with the fact that information transfer has a speed limit -- the speed of light.  An event at one point in space can only affect another point if there has been enough time for light to travel from the one to the other.  Reality means that an object's properties are independent of observation; it's a hard-science version of the time-honored question, "if a tree falls in the forest, and no one is there, does it make a sound?"

While the "locality" piece isn't perhaps something that impacts us on a daily basis -- light travels so fast that on the scales we usually deal with, it may as well be instantaneous -- "reality" certainly does.  Even the physicists balked for decades against the hints they were getting that locality and reality were on shaky ground.  No less a luminary than Albert Einstein said, "Do you really believe that the Moon is not there when you are not looking at it?"  But ever since Northern Irish physicist John Stewart Bell first proposed that there was something at the heart of quantum mechanics that called local reality into question, way back in 1962, the loopholes for avoiding that bizarre conclusion have been closing one by one.

The heart of the problem lies with entanglement.  The idea here is that you can create a pair of particles such that you know that if one has a particular property (such as a spin axis pointing up), the other will have the opposite property (spin axis pointing down).  So far, nothing too weird about that.  It's no odder than putting each glove of a pair into its own sealed box, and handing one box to your friend; if, when your friend opens his box, he finds a left-handed glove, you automatically know your box must contain the right-handed one.  The system was set up that way.

But what Bell implied was that this wasn't the case.  The gloves were neither right nor left until you opened one of the boxes; if your friend did that, and observed a left-handed glove, the glove in your box "sensed that" (whatever the hell that means!) and instantaneously became right-handed, regardless of how far apart they were at the time.  The measurement process somehow created the state of the system, even if the parts of it were separated by a distance too great for light to cross.

For a long time, the prevailing approach amongst physicists was just to pretend it wasn't happening, an attitude David Mermin summed up as "shut up and calculate."  Perhaps there were "hidden variables" that would let some sort of locally real explanation account for the strange phenomenon of entanglement; in our glove analogy, the gloves were what they were all along, even before anyone opened a box, no superluminal communication necessary.  And for a while, they kind of got away with it.  But with a series of ingenious experiments, Clauser, Aspect, and Zeilinger conclusively showed that there are no local hidden variables; the universe, it seems, is not locally real.
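If you'd like to see the logic of Bell's argument in action, here's a minimal sketch in Python -- a toy model, emphatically not the laureates' actual experiments -- of the CHSH form of Bell's inequality.  Any "glove-style" local hidden variable model, where each particle carries a predetermined answer fixed at the source, can never push the CHSH statistic S past 2; quantum mechanics predicts values up to 2√2.

import numpy as np

rng = np.random.default_rng(42)

def E_local(a, b, n=200_000):
    # Toy local-hidden-variable model: each pair carries a direction lam
    # fixed at the source (the "gloves were already left- or right-handed"
    # picture); each detector just reports the sign of the spin projection
    # along its own measurement axis.
    lam = rng.uniform(0.0, 2.0 * np.pi, n)
    A = np.sign(np.cos(a - lam))       # Alice's outcome, +1 or -1
    B = -np.sign(np.cos(b - lam))      # Bob's outcome, anti-correlated
    return float(np.mean(A * B))

def E_quantum(a, b):
    # Correlation for a spin-singlet pair, straight from quantum mechanics.
    return -np.cos(a - b)

# CHSH statistic S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), evaluated at
# the measurement angles that maximize the quantum violation.
a1, a2, b1, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4

def S(E):
    return E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)

print(f"local hidden variables: |S| = {abs(S(E_local)):.3f}  (never exceeds 2)")
print(f"quantum entanglement:   |S| = {abs(S(E_quantum)):.3f}  (2*sqrt(2) ~ 2.83)")

However you rig the hidden instructions, the local model tops out at 2.  Entanglement sails right past it -- and that violation is exactly what the laureates measured in the lab.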

What exactly is happening is another matter.  The three recipients of this year's Nobel Prize in Physics have shown that what John Stewart Bell proposed nearly sixty years ago is spot-on correct, as crazy as it sounds.  There is something about the process of observation that does lock the observed object into a particular state faster than should be possible; Schrödinger's long-suffering cat seems to be not a wild metaphor but how the universe actually works.


I find this whole thing fascinating but a little overwhelming.  It's hard to imagine how our physical surroundings can behave in a manner so completely opposite to our common-sense notions.  But Clauser, Aspect, and Zeilinger have demonstrated conclusively that they do -- and it falls to the rest of the physics community to tell us laypeople exactly what that means.

****************************************


Thursday, October 13, 2022

An ancient invasion

Just about anywhere you are in the world, you are confronted constantly with invasive species.

Some are so ubiquitous we've stopped even noticing them.  Here in the United States, for example, most lawn grasses are non-natives (including, amusingly, Kentucky bluegrass), as are dandelions, daisies, burdock, garlic mustard, multiflora rose, bush honeysuckle, and thistle.  None of our domesticated animals are native to North America, but neither are such ridiculously common creatures as house mice, the various species of rats, Japanese beetles, pigeons, starlings, house sparrows, and goldfish.  

It's tempting to lump all these species together and say "exotic = bad," but that's a vast, and inaccurate, oversimplification.  Some have clearly had devastating effects on native species; feral and owned-but-outdoor cats, for example, kill an estimated two billion birds a year in the United States alone.  (Yes, that's billion, not million.  Cats are responsible for more bird deaths than any other single cause.)  Other exotics have had far less impact; dandelions may be in every lawn in North America, for example, but they don't seem to do much in the way of outcompeting other species.  (And, as I said earlier, lawn grasses are exotics themselves anyhow.)

A lot of effort by environmental agencies has been put into eradication of exotics, to varying levels of success.  Rats and mice, for example, are generally a lost cause, given their fast reproductive rate and ability to survive on damn near any kind of food; but some isolated islands have done pretty well, most notably South Georgia, which wiped out its rat and mouse infestation in 2018 in order to save endangered birds that nest there.

The southeastern United States, however, has had almost zero success controlling kudzu, also called "mile-a-minute vine" because of its stupendous growth rate.  Introduced in 1876, and hailed as a source of browse for cattle and starch-rich roots that could be used in place of potatoes, the vine went on to cover trees, barns, and slow-moving individuals, and to this day blankets acres during its growing season.

Kudzu in Atlanta, Georgia [Image is in the Public Domain]

Where it gets interesting is the observation by one of my AP Environmental Science students a while back, who said, "But if you go back far enough, isn't everything exotic?"  It's a point well taken.  Species move around, and introductions happen by accident pretty much continuously.  (In fact, there's a whole mathematical model called island biogeography that has to do with the effects of such factors as island size and distance from the mainland on immigration rate and stable biodiversity.)  Our own deliberate and accidental introductions are only continuing a process that has been going on for a long time.
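For the curious, the core of the MacArthur-Wilson island biogeography model fits in a few lines.  Here's a minimal sketch in Python -- the parameter values are entirely made up for illustration -- in which immigration falls as an island fills up, extinction rises with species richness, and the stable diversity sits where the two rates balance:

def equilibrium_richness(pool, i0, e_max):
    # Immigration I(S) = i0 * (1 - S/pool) falls as the island fills up;
    # extinction E(S) = e_max * S/pool rises with richness.  Setting
    # I(S) = E(S) and solving gives S* = i0 * pool / (i0 + e_max).
    return i0 * pool / (i0 + e_max)

# Hypothetical numbers, for a mainland pool of 1,000 species:
print(equilibrium_richness(1000, i0=10, e_max=5))    # near, large island: ~667 species
print(equilibrium_richness(1000, i0=2, e_max=20))    # far, small island:  ~91 species

Distance from the mainland knocks down the immigration rate, small island size pushes up the extinction rate, and the equilibrium richness shifts accordingly -- which is precisely the size-and-distance effect the model is known for.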

And it has been going on for a very long time, to judge by the research of Ian Forsythe (of the University of Cincinnati) and Alycia Stigall (of the University of Tennessee - Knoxville).  They've been studying the "Richmondian Invasion" -- a sudden influx of new species into the shallow sea that covered what is now northern Kentucky, southwestern Ohio, and southeastern Indiana, which occurred during the Late Ordovician, 450 million years ago.

The invasion was surprisingly rapid.  Thanks to exceptionally well-preserved strata, they were able to show that the new species came in from the north, as rising seas allowed them to cross what had been a low ridge of dry land, over only a few thousand years.  And what Forsythe and Stigall found was that despite the magnitude of the invasion, and the speed with which it occurred, it didn't have much effect on the recipient ecosystem's pre-existing species.

The reason, Forsythe and Stigall say, is that most of the invaders were low on the trophic ladder -- they were filter-feeders and grazers on phytoplankton.  It'd have been a different story if the invaders had been high-trophic-level predators.

All of this should inform our decisions on where to put our limited resources for environmental management.  High-impact, high-trophic-level invaders -- feral cats, rats, and the like -- are more critical to control than low-level herbivores like pigeons and house sparrows.  (It bears mention, though, that just being a herbivore doesn't mean "harmless"; here in the northeastern United States, whole forests of ash trees are being killed by the emerald ash borer, and farmers and viticulturists are rightly flipping out about the wildfire-like spread of the spotted lanternfly.)

So it's a complex subject.  But it's fascinating that an analysis of an exotic invasion 450 million years ago might inform our decisions about how to manage exotics today.  Yet another indication of the value of pure research -- it can give us an angle on real-world problems that we wouldn't have arrived at otherwise.

****************************************


Wednesday, October 12, 2022

The Grand Duke of Microscopica

I have recently become aware of the phenomenon, apparently of long standing, of various cranks, misfits, wags, and malcontents seceding from their home country and founding their own sovereign nations.

These so-called "micronations" are universally ignored by the parent country, but this hasn't stopped the aforementioned cranks et al. from founding a good many of them.  (See the Wikipedia list, with descriptions, here.)  The commonality across the lot is that the leaders seem to trumpet fairly loudly but then make sure to fly under the radar when it comes to potential unpleasantness.  For example, the Principality of Hutt River (formerly a part of Australia) regularly has its taxes paid to Australia by its founder, Crown Prince Leonard I (formerly Leonard Casley), with the proviso that the tax check is to be considered "a gift from one world leader to another."

I find this whole phenomenon simultaneously charming and perplexing.  Perplexing because (with the exception of the handful who have clearly set the whole thing up as a joke), these people seem to take themselves awfully seriously.  Consider the Principality of Sealand, which consists solely of one abandoned military staging platform in the North Sea.  Take a look at Sealand's webpage (of course it has a webpage).  Reading through that, and the other assorted websites for micronations, leaves me thinking, "Are you people loonies?  Or what?"

The Principality of Sealand [Image licensed under the Creative Commons Ryan Lackey, Sealand aerial view, CC BY 2.0]

On the other hand, it is somewhat charming, in a twisted, Duchy of Grand Fenwick sort of way.  The majority of the self-proclaimed nobility from micronations seem to be doing no real harm.  Let them issue their own currency, stamps, and legal documents.  Hey, if it gives them a hobby, then why not?  I don't think it's really any crazier than many other hobbies, such as collecting beer bottle caps or doing Civil War battle re-enactments.

And then, the depressive existentialist side of my personality has to pipe up and ask, "Why is this so different from what all countries are doing?"  Countries only exist because a group of people with adequate weaponry have decided to band together, declare that they have the right to draw a line on the ground across which None Shall Pass, and tell everyone what they can and can't do.  The lines are mostly arbitrary, and a good many of the laws seem to be as well.  (Imagine trying to explain to an alien why on the north side of an invisible line on the ground, LGBTQ people can marry, and on the south side, they can't.  I think all you'd get from the alien would be mild puzzlement, up until the point where he decides that there really isn't any intelligent life on Earth, and vaporizes you with his laser pistol.)

So then, what's the difference between micronations and regular nations?  There's this thing called "recognition" -- that other countries recognize the existence of a legitimate nation.  So, because the United States is pretending not to notice the Republic of Molossia (a totalitarian dictatorship, formerly part of Nevada), it doesn't exist?  It's a little like a four-year-old covering his eyes and concluding that everyone he can't see is gone.

Of course, recognition isn't everything.  There are also diplomatic ties -- who are you willing to negotiate with?  But that gets a little dicey, too, because there are countries that clearly exist by most people's definition (e.g. Cuba, with whom the United States went more than fifty years without diplomatic relations).  So, you only exist if (1) we are willing to admit you exist, and (2) we both agree to send people to meet at a five-star hotel to drink hundred-dollar-a-glass wine and discuss how much our people want to cooperate, despite our differences and our occasional desire to annihilate each other?

Sorry for appearing cynical.  But so much of politics seems to me to be high-stakes game playing, not so very far advanced from the Inner Circles and Exclusive Clubs that middle schoolers dream up, with the only difference being that middle schoolers aren't capable of blowing each other up with tactical nuclear weapons.  Yet.

Anyway, my point is that other than scale, there seems to be little to separate the micronations from the ordinary type.  And given the current economic and ecological mess that the United States is sitting in, I'm thinking that maybe I should secede, too.  I will only continue to pay taxes as a Generous Donation Of Aid To My American Friends, and Guinness will be appointed Chief Pooch In Charge Of Tennis Ball Chasing And Strategic Naps.  Cleo will clearly be Court Jester.  I, of course, will now go by the moniker King Gordon I, "the Magnificent," of the Sovereign Kingdom of Perry City.  Carol already thinks she's the queen, so her status won't change much.  It does, of course, open up a serious possibility of a war of succession when I die, because I don't think that Duke Nathan of Houston will easily give up the throne to the heir apparent, Crown Prince Lucas of Fall Creek, given that they didn't get along as toddlers and things have only gone downhill since then.

Whatever happens, it should be worth a page in the history books.  Or at least a website.

****************************************


Tuesday, October 11, 2022

Wrack and rune

Yesterday I stumbled upon a post claiming that someone had programmed an artificial intelligence to cast magical spells, and that sent me down a rabbit hole that was way deeper than I'd expected.

The post was this:


So I googled "AI magical spells," and that was the last anyone saw of me for about four hours.

The first thing I ran into was an article in Vice about a GPT-3-powered AI interface called "Norn" that co-authored (so to speak) a book called A Chaos Magick Butoh Grimoire.  Butoh is a Japanese dance form that incorporates "playful and grotesque imagery, taboo topics, and extreme or absurd environments," but I haven't found any indication that it's connected with magic or the occult, so that's peculiar right from the get-go.  Alley Wurds, the human half of the co-authorship of the Grimoire, apparently tied it in by practicing Butoh moves until exhaustion and then going to the computer to see what Norn could create about the experience.  "GPT-3 has read a huge amount of stuff that it cuts and pastes together in a stream of consciousness free association manner, due to its lack of long term memory," Wurds said.  "So GPT-3 is like the subconscious mind of the internet expressing itself through cut-ups. I’m using my experience in the occult to direct this subconscious mind, rather than just my own."

This resulted in disquieting shit like the following:

The knowledge ritual involves chanting a mantra.  The mantra must be repeated while walking deosil [clockwise] around the boundary of the circle, and then stopping at each of the four cardinal points to meditate upon the knowledge you seek...  You sit down, repeat the mantra, and visualize a pentagram glowing with light.  Once this pentagram is fully visualized, you must carve the pentagram upon your flesh, and repeat the mantra yet again.  You carve the pentagram upon your chest, and feel the blood trickling down.  Repeat the mantra.  Once you are ready, you then pick up your ceremonial blade and stab it through your chest, killing yourself.  You repeat the mantra as you die.  This causes your soul to be released from your body.  The knowledge you seek will appear in your mind.

So, upside: you get to find out the knowledge you're looking for!  Downside: you're dead.

Unsurprisingly, the hyper-religious and hyper-paranoid (the Venn diagram for those two sets would have considerable overlap) are seriously freaking out about stuff like this.  Allowing soulless machines to learn how to cast magic spells and summon demons and whatnot is going to open a portal to hell and release Cthulhu and activate the sigils of evil and the gods of the underworld alone know what else.  But I was having a hell of a time trying to find out where the original post -- the one about the AI casting runes -- came from.  There was a lot of shrieking hyperbole about how evil it was and how we were all doomed, but no one seemed to know for sure where it had originated.

It took me an inordinate amount of time to figure out that the runes weren't created by an AI at all, and had, in fact, zero to do with AI.

They were a set of rune designs developed by author Brandon Sanderson for his Cosmere fictional universe.

Emphasis, of course, on the fictional part.

I shoulda known.  Just last week a woman in Texas became internet-famous for posting on Facebook that parents shouldn't let their kids watch Hocus Pocus 2 because the characters in the movie "could be casting any type of spell [that]… could be coming through that TV screen into your home," and that it could "unleash hell on your kids."

Ignoring the fact that once again, Hocus Pocus 2 is a work of fiction.

So yeah.  That's how I spent my afternoon yesterday.  You'd think I'd have learned by this time: run into something like that, immediately say, "It's the conspiracy nuts freaking out over fiction again," and forthwith move on to some more productive enterprise, like attempting to explain quantum physics to my dog.  But I guess I should look on it as a public service.  I delve into these things so you don't have to.

You're welcome.

****************************************


Monday, October 10, 2022

Head hunters

Today's post combines archaeology, mythology, and an etymological mystery -- surely a recipe for something fascinating.

I first ran into the Blemmyes in Umberto Eco's tour-de-force medieval murder mystery The Name of the Rose, where they are described as a race of people living in Africa who have no heads; their faces are in the middle of their torsos.  The topic comes up because of a manuscript illuminator, Brother Adelmo, who has a habit of adorning his manuscripts with fanciful creatures -- not only familiar ones like centaurs and unicorns and dragons, but Cynocephali (dog-headed men), Sciapodes (people with one leg and a huge foot, the inspiration for the Monopods in C. S. Lewis's The Voyage of the Dawn Treader), and... the Blemmyes.

One of the Blemmyes (from a 1556 map by Guillaume Le Testu) [Image is in the Public Domain]

So naturally I thought that the Blemmyes were a complete fiction.  (Actually, given the illustration, I hoped they were a complete fiction, because they're freakin' creepy-looking.)

That's why I was pretty surprised when I ran into a story on Science Daily yesterday, about some research out of the Universitat Autònoma de Barcelona that was published last week in the American Journal of Archaeology.  The paper was about the Blemmyes -- who were apparently a real nomadic people who lived in what is now southern and central Egypt during Roman times, and who had their faces on their heads as per the usual human specifications.

What's weirdest about this is that the sources that mention the mythological headless Blemmyes and the ordinary human Blemmyes have almost no overlap; it's as if the authors of one didn't even talk to the authors of the other.  This might be understandable if it were some kind of linguistic coincidence, where two groups of people just happened to use similar-sounding words to describe two entirely different things; but I'm sorry, "Blemmyes" -- not only an identical-sounding word, but an identical spelling -- is just too weird for me to accept that they're unrelated homophones.  Add to that the fact that the alleged territory of the mythological Blemmyes and the homeland of the real Blemmyes both lay in what is now southern Egypt and northern Sudan, and I can't swallow it as some bizarre coincidence.

But the medievalists don't seem to have a good idea of how it happened.  The real Blemmyes, according to the third-century B.C.E. historian and writer Eratosthenes, were named after one of their ancient kings, King Blemys, but he is unattested elsewhere.  Other linguists have traced the name of the actual people to the Coptic word Ⲃⲁⲗⲛⲉⲙⲙⲱⲟⲩⲓ, Balnemmōui, but tracking the word earlier than that has proven impossible.  What seems certain is that the real Blemmyes are the ancestors of the people who today call themselves the Beja, who live in southern Egypt, Sudan, and Eritrea.

The mythological Blemmyes are even more of a mystery to linguists.  Seventeenth-century French antiquarian Samuel Bochart thought their name came from the Hebrew bly (בלי) "without" and moach (מוח) "brain;" linguist Louis Morié believed it was from the Greek blemma (βλέμμα) "look, glance" and muō (μύω) "close the eyes;" Egyptologist Hans Wolfgang Helck drew its descent from a Coptic word for "blind."

The truth is, no one knows for sure.

Oh, but if you want an even stranger coincidence: the paper in The American Journal of Archaeology about the real Blemmyes concerns the discovery, in the Egyptian town of Berenike, of one of their shrines, within which were entombed fifteen mummified falcons...

... all of which were headless.

You can't make this shit up.

In any case, we're left with a mystery.  The fictional Blemmyes and the real Blemmyes -- and the descendants of the latter, the Beja -- seem to have nothing whatsoever to do with one another, except for a common name and living in approximately the same place.  But there has to be some connection, right?  I dunno, maybe we should be out there looking for real Cynocephali and Sciapodes.  

****************************************


Saturday, October 8, 2022

A cataclysmic pirouette

Hamlet famously states to his friend, "There are more things in heaven and earth, Horatio, than are dreamt of in your philosophy," and every time we look into the night sky, we're reminded how true that is.

In the last hundred years astronomers have discovered deadly gamma-ray bursters and black holes; neutron stars so dense that a teaspoon of their material would weigh as much as a mountain; planets made of stormy swirls of ammonia, methane, and hydrogen; others made of super-hot molten metal; water-worlds completely covered with deep oceans.  We've seen newborn stars and stars in their violent death throes, and looked out in space and back in time to the very beginning, when the universe itself was in its infancy.

Even with all these wonders, new and bizarre phenomena are still being discovered every time our technology improves.  Take, for example, the "cataclysmic variable" that was the subject of a paper in Nature this week, a pair of stars locked in such a tight dance that they whirl around their common center of gravity in only fifty-one minutes.

Given the euphonious name ZTF J1813+4251, this pair of stars is composed of a white dwarf -- the burnt-out core of a low-mass star like the Sun -- and an even more lightweight star not much bigger than the planet Jupiter.  The white dwarf has been swallowing (the astronomical term is "accreting") the hydrogen fuel from its partner, and the two are drawing closer together, meaning that the process will speed up.  Eventually all that will be left of the partner star will be its core, and astronomers predict that at that point, the pair will have an orbital period of eighteen minutes.  But once the accretion process ends, drag in the pair's movement will rob energy from the system, the wild stellar pirouette will slow down, and they will gradually start to move apart again.
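To get a feel for just how tight a fifty-one-minute orbit is, you can plug the period into Kepler's third law, a³ = G M P² / 4π².  Here's a minimal back-of-the-envelope sketch in Python; the masses are illustrative assumptions on my part, not figures from the Nature paper:

import math

G     = 6.674e-11                # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30                 # solar mass, kg
P     = 51 * 60                  # orbital period, seconds
M_tot = (0.5 + 0.1) * M_SUN      # assumed white dwarf + stripped companion

# Kepler's third law: a^3 = G * M_tot * P^2 / (4 * pi^2)
a = (G * M_tot * P**2 / (4 * math.pi**2)) ** (1 / 3)
print(f"separation: ~{a / 1e3:,.0f} km")
print(f"that's {a / 3.844e8:.2f} of the Earth-Moon distance")

That comes out to roughly 270,000 kilometers -- the two stars are closer to each other than the Earth is to the Moon.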

It's fortunate that the partner star is as light as it is; if it had more mass, it would be headed toward one of the most violent fates a star can have -- a Type Ia supernova.  White dwarfs are the remnants of stars that have exhausted all their fuel, and they shrink until the inward pull of gravity is counterbalanced by electron degeneracy pressure -- the quantum mechanical refusal of electrons to be squeezed into the same state.  There's a limit, though, to how much this pressure can withstand; it's called the Chandrasekhar limit, after its discoverer Subrahmanyan Chandrasekhar, and is equal to about 1.44 solar masses.  For a lone white dwarf -- as our Sun will one day be -- this is not a problem, as there won't be anything substantial adding to its mass after it reaches that point.
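In equation form -- this is a standard textbook scaling, not anything specific to the new paper -- the limit depends on the mean molecular weight per electron, \mu_e, the average number of nucleons per electron in the star's matter:

M_{\mathrm{Ch}} \approx \frac{5.76}{\mu_e^{2}}\, M_\odot

For the carbon-oxygen interior of a typical white dwarf, \mu_e = 2, which is where the familiar figure of 1.44 solar masses comes from.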

The situation is different when a low-mass star is in a binary system with a giant star.  When the low-mass star burns out and becomes a white dwarf, it begins to rob its partner of matter -- just as ZTF J1813+4251 is doing.  But in this case, there is a lot more mass there to rob.  Eventually, the white dwarf steals enough matter from its companion to go past the Chandrasekhar limit, and at that point, the degeneracy pressure of the electrons in the star's atoms loses its contest with the inward pull of gravity.  The white dwarf's core collapses completely, making the temperature skyrocket so high that its carbon and oxygen ash can fuse into heavier elements, suddenly releasing catastrophic amounts of energy.  The result is...

... boom.

In the process, the matter from the exploded dwarf star is scattered around the cosmos, and becomes parent material for new stars and planets.  Supernovae like this, along with other dying stars, are in fact where most of the heavy elements in our bodies were forged -- Type Ia explosions in particular supply much of the iron in our blood.

As Carl Sagan famously said, "We are made of starstuff."

A Type Ia supernova remnant [Image is in the Public Domain courtesy of NASA/JPL]

But ZTF J1813+4251 isn't headed for such a dramatic exit -- eventually the white dwarf will pull away the outer layers of the partner star's atmosphere, and after that the two will just spiral around each other wildly for a few million years, gradually cooling and slowing from their current frenetic pace.  So maybe "cataclysmic" isn't the right word for this pair; their crazy tarantella will simply wind down, leaving two cold clumps of stellar ash behind.

Honestly, if I were a star, I think I'd rather go out with a bang.

****************************************


Friday, October 7, 2022

The face of evil

I just finished a book that I'm going to be thinking about for a very long time: Alice Oseman's wonderful, devastating, beautiful, heartbreaking, and ultimately triumphant novel Radio Silence.

What has kept my mind coming back to the story over and over since closing the last page is not the pair of main characters, Frances Janvier and Aled Last, as well-drawn and engaging as they are; it's a minor character -- at least judging by the number of scenes in which she actually appears -- Aled's mother, Carol Last, whose influence pervades the entire story like some kind of awful miasma.

She's not what I would call "big evil."  Mrs. Last is no Sauron, no Darth Vader, no Jadis the White Witch.  She has no desire to rule the world and mow down thousands.  Her evil is so small as to be almost banal.  She "redecorates" Aled's room while he's away at school, destroying all of his posters and adornments, even painting over the mural of a galaxy he'd created on his ceiling, replacing it with a blank white surface.  She has his old dog put down without his knowledge, without even a chance to say goodbye.  She sends him a saccharine text every single time he makes a new episode of his beloved podcast, about spending his time in more productive pursuits instead of his "silly little show."  She takes her daughter's "inappropriate" clothing and burns it in the back yard, right in front of her.

And each and every time, she has an unshakable justification for why she does what she does.  There's always a reason, and any objections have about as much effect on her as an ocean wave striking a cliff face.  In the most chilling scene in the whole book, Mrs. Last proudly shows Frances what she's done to Aled's room while he's away, saying with a tight little smile, "It's just a few little rearrangements here and there.  I'm sure he'll appreciate a change...  Feels very fresh, don't you think?  A cleaner, emptier space makes a cleaner, sharper mind."

She doesn't even listen for Frances's response; of course the answer is yes.

For the Mrs. Lasts of the world, the answer is always yes.

It's a tribute to Alice Oseman's skill as a novelist that my response to Mrs. Last was as strong as it was.  But why we all feel revulsion at such a character is telling.  It's like an analysis I read a while back of why the most hated character in the Harry Potter universe isn't Lord Voldemort -- far and away, it's Dolores Umbridge.  

Very few of us, fortunately, ever meet a Lord Voldemort.

But all of us know a Dolores Umbridge.  A teacher, a boss, a family member, a significant other, an acquaintance who, given a little power, uses it to tear down the souls of the vulnerable or dependent, and remodel them to suit.  A person who couches it all with a sweet smile that never reaches the eyes and a declaration of, "You know it's all for your own good, dear."


This, for most of us who have read the Potter series, is the real face of evil, not the grotesque, distorted visage of Lord Voldemort.

I know a lot of the reason that both Dolores Umbridge and Mrs. Last made me as sick at heart as they did is that my own childhood was laced through with this sort of thing.  Nothing as overt as what Dolores did to Harry or what Mrs. Last did to her children, perhaps; but the message I got was nothing if not consistent.  "You can't possibly like that music/television show/book/movie, can you?"  "Why are you wasting your time with that?"  "Mrs. So-and-So's son has accomplished so much, she must be so proud of him.  Maybe you should try following his example."  "Why bother with that?  You'll just give up in three weeks when you find out how hard it is."  And, most pervasively, over and over again, "No one wants to hear about that," whenever I talked about what I cared most deeply about, what I was passionate about.

My response was much like Aled's in Radio Silence: hide.  Protect what I loved so it wouldn't be destroyed.  It came out in uglier ways, sometimes; I did my own share of mistreating those who were vulnerable, to my everlasting shame, living up to my grandma's wise if tragic words that "hurt people hurt people."  I became secretive, angry, and deeply despondent.

And it took me years to admit that this subversive attempt to demolish who I actually was and rebuild some new, improved version was nothing short of emotional abuse.

That the Mrs. Lasts in my own life didn't win was more due to luck than anything I did to stop them.  For the past twenty years especially I have been fortunate enough to have people in my life who are determined to nurture rather than destroy, and I can say truly that they saved my life, both figuratively and literally.  

I'll end this post with an exhortation to be that kind of person for the people around you; do not ever underestimate the power of simply appreciating and loving those you meet for who they are, embracing their weird, unique, wonderful selves without feeling any need to change them.  Drop the desperate need to hem people in, to make them conform to some arbitrary standards of how they dress, what they eat, what music and books and shows they love.  Thank heavens we don't all feel passionate about the same stuff, right?  How boring would it be if every last person had exactly identical tastes, loves, opinions, and obsessions?

And I'll leave you with a quote from someone I've quoted here many times before: journalist Kathryn Schulz, whose astonishing TED Talk "On Being Wrong" should be required listening for everyone.  Toward the end, she has an observation about why different perspectives don't imply that one person is right and the other is wrong -- and how sterile the world would be if that were true:
But to me, what's most baffling and most tragic about this is that it misses the whole point of being human.  It's like we want to imagine that our minds are these perfectly transparent windows and we just gaze out of them and describe the world as it unfolds.  And we want everybody else to gaze out of the same window and see the exact same thing.  That is not true, and if it were, life would be incredibly boring.  The miracle of your mind isn't that you can see the world as it is.  It's that you can see the world as it isn't.  We can remember the past, and we can think about the future, and we can imagine what it's like to be some other person in some other place.  And the most beautiful part is that we all do this a little differently.
****************************************