Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Saturday, April 1, 2023

The muzzle

The first people targeted by political ideologues are almost always the artists, authors, poets, and other creatives.

No other group has a way of striking at the soul the way these people do; often with one single image or turn of phrase they point out with blinding clarity the hypocrisy and ugliness of the people in power.  No wonder they're suppressed -- sometimes violently.  Faced with depictions of nudity or sexuality, one man said:
It is not the mission of art to wallow in filth for filth's sake, to paint the human being only in a state of putrefaction, to draw cretins as symbols of motherhood, or to present deformed idiots as representatives of manly strength...  [We will see to it that] works of art which cannot be understood in themselves but need some pretentious instruction book to justify their existence will never again find their way to the people.

Another commented, "Degenerates are not always criminals, prostitutes, anarchists and pronounced lunatics; they are often authors and artists."

Make no mistake; book bans and book burnings, shutting down or defunding libraries and art exhibits, are not about protecting children from age-inappropriate material.  There is an honest discussion to be had about what is appropriate for children to learn about at what age, and no one -- liberal or conservative -- disputes that point.  This, however, goes way beyond that.

The people doing this don't want anyone, anywhere, to have access to books or art that runs against the straight White Christian agenda.  So the first to go are creative works by or about minorities, anything dealing openly with sexuality, and anything that even mentions LGBTQ+ people; i.e., anything labeled "degenerate."  It's not like the goal isn't obvious, especially with regards to sexuality.  "All things which take place in the sexual sphere are not the private affair of the individual," said one government official, "but signify the life and death of the nation."

And once that kind of thing gets started, it gets whipped into a frenzy, because the people doing it honestly believe they're fighting evil.  One witness to a book burning said the following:

I held my breath while he hurled the first volume into the flames: it was like burning something alive.  Then students followed with whole armfuls of books, while schoolboys screamed into the microphone their condemnations of this and that author, and as each name was mentioned the crowd booed and hissed.  You felt the venom behind their denunciations.  Children of fourteen mouthing abuse.

Creative people can fight back, but once the works are destroyed, in some sense it's too late.  One author, more optimistic than I am, said, "History has taught you nothing if you think you can kill ideas.  Tyrants have tried to do that often before, and the ideas have risen up in their might and destroyed them.  You can burn my books and the books of the best minds... but the ideas in them have seeped through a million channels and will continue to quicken other minds."

Perhaps so, but once access is stopped, you don't even have to burn the physical copies.  This is something fascists have learned all too well.  Control what people find out -- place a stranglehold on the media, and muzzle the people who dissent, especially the artists and writers -- and you're ninety percent of the way to victory.  "Those who don't read good books," said another famous author, "have no advantage over those who can't."

[Image licensed under the Creative Commons Alan Levine from Strawberry, United States, Book burning (3), CC BY 2.0]

The only acceptable response is to fight back.  Hard.  Especially us creative types, who are so frequently in the bullseye of the hatred.  If, as an adult, you find something offensive -- fine, don't read it.  However, passing legislation to prevent anyone else from reading it is the road to ceding control to the state over what people are allowed to see, hear, and think.  And if you don't think this is one short step from denying the personhood and right to exist of people who have an ethnicity, religion, political ideology, or sexual orientation different from the short list of ones accepted by the powers-that-be, you are being willfully blind to history.

Because -- oh, sorry, forgot to mention -- everything in this post comes from the rise of Nazi Germany in the 1930s.  Who did you think I was talking about?

[Nota bene: the quotes are, in order, from Adolf Hitler (1937); physician and author Max Nordau (1892); Heinrich Himmler (1937); American journalist Lilian T. Mowrer (1933); Helen Keller (1933); and Mark Twain (1895)]

****************************************



Friday, March 31, 2023

The global melting pot

One of the shakiest concepts in biological anthropology is race.

Pretty much all biologists agree that race, as usually defined, has very little genetic basis.  Note that I'm not saying race doesn't exist; just that it's primarily a cultural, not a biological, phenomenon.  Given the fact that race has been used as the basis for systematic oppression for millennia, it would be somewhere beyond disingenuous to claim that it isn't real.

The problem is, determination of race has usually been based upon a handful of physical characteristics -- most often skin, eye, and hair pigmentation, and the presence or absence of an epicanthal fold across the inner corner of the eye.  Not only are these traits superficial and not necessarily indicative of an underlying relationship, but the pigment-related ones are highly subject to natural selection.  Back in the nineteenth and early twentieth centuries, however, this highly oversimplified and drastically inaccurate set of criteria was used to develop maps like this one:

The "three great races" according to the 1885 Meyers Konversations-Lexikon 

This subdivides all humanity into three groups -- "Caucasoid" (shown in various shades of blue), "Negroid" (shown in brown), and "Mongoloid" (shown in yellow and orange).  (The people of India and Sri Lanka, shown in green, are said to be "of uncertain affinities.")  If you're jumping up and down saying, "Wait, but... but..." -- well, you should be.  The lumping together of people like Indigenous Australians and all sub-Saharan Africans (based mainly on skin color) is only the most glaring error.  (Another is that any classification putting the Finns, Polynesians, Koreans, and Mayans into a single group has something seriously amiss.)

The worst part of all of this is that this sort of map was used to justify colonialism.  If you believed that there really was a qualitative difference (for that, read genetic) between the "three great races," it was only one step away from deciding which one was the best and shrugging your shoulders at the subjugation by that one of the other two. 

The truth is way more complicated, and way more interesting.  By far the highest amount of genetic diversity in the world is in sub-Saharan Africa; a 2009 study by Jeffrey Long found more genetic differences between individuals from two different ethnic groups in central Africa than between a typical White American and a typical person from Japan.  To quote a paper by Long, Keith Hunley, and Graciela Cabana that appeared in The American Journal of Physical Anthropology in 2015: "Western-based racial classifications have no taxonomic significance."

The reason all this comes up -- besides, of course, the continuing relevance of this discussion to the aforementioned systematic oppression based on race that is still happening in many parts of the world, including the United States -- is a paper that appeared last week in Nature looking at the genetics of the Swahili people of east Africa, a large ethnic group extending from southern Somalia down to northern Mozambique.  While usually thought to be a quintessentially sub-Saharan African population, the Swahili were found to have only around half of their genetic ancestry from known African roots; the other half came from southwestern Asia, primarily Persia, India, and Arabia.

The authors write:

[We analyzed] ancient DNA data for 80 individuals from 6 medieval and early modern (AD 1250–1800) coastal towns and an inland town after AD 1650.  More than half of the DNA of many of the individuals from coastal towns originates from primarily female ancestors from Africa, with a large proportion—and occasionally more than half—of the DNA coming from Asian ancestors.  The Asian ancestry includes components associated with Persia and India, with 80–90% of the Asian DNA originating from Persian men.  Peoples of African and Asian origins began to mix by about AD 1000, coinciding with the large-scale adoption of Islam.  Before about AD 1500, the Southwest Asian ancestry was mainly Persian-related, consistent with the narrative of the Kilwa Chronicle, the oldest history told by people of the Swahili coast.  After this time, the sources of DNA became increasingly Arabian, consistent with evidence of growing interactions with southern Arabia.  Subsequent interactions with Asian and African people further changed the ancestry of present-day people of the Swahili coast in relation to the medieval individuals whose DNA we sequenced.

Note that on the Meyers Konversations-Lexikon map, the Arabians and Persians are considered "Caucasoid," the Indians are "uncertain," while the Swahili are definitely "Negroid."

A bit awkward, that.

It's appalling that we still use an outmoded and scientifically unsound concept to justify bigotry, prejudice, and discrimination, despite the mountains of evidence showing that there's no biological basis whatsoever to the way race is usually defined.  Easy, I suppose, to hang on to your biases like grim death rather than questioning them when new data comes along.  Not even all that new; the Long study I referenced above was from fourteen years ago.  And hell, the Italian geneticist Luigi Luca Cavalli-Sforza was researching all this back in the 1960s.  Okay, it takes time for people's minds to catch up with scientific discovery, but how much damn time do you need?

The truth is that (1) ultimately, we all come from Africa, (2) since then, we've continued to move around all over the place, and therefore (3) the world is just a huge single melting pot.  Oh, and (4), the result is that we're all of (very) mixed ancestry.  I'm sorry if that makes some people feel squinky, but as I've pointed out before, the universe is under no obligation to align with your preconceived notions about how the world should work.

Time to accept the beauty and complexity of our shared humanity, and stop looking for further ways to divide us.

****************************************



Thursday, March 30, 2023

Dark days

I'm going to propose a new law, in the vein of Murphy's Law ("If it can go wrong, it will"), Betteridge's Law ("If a headline ends in a question mark, the answer is 'no'"), and Poe's Law ("A sufficiently well-done satire is indistinguishable from the real thing"): "If a statement begins with, 'Scientists claim...' without mentioning any specific scientists, it's completely made up."

I ran into an excellent (by which I mean "ridiculous") example of that over at the site Anomalien just yesterday, called "The Mysterious Phenomenon of the Onset of Sudden Darkness."  The article, which is (as advertised) about times when darkness suddenly fell during the day for no apparent reason, gets off to a great start by citing the Bible (specifically the darkness sent by God in the Book of Exodus to punish the Egyptians for keeping Moses et al. in slavery), because that's clearly admissible as hard evidence.  "Scientists," we are told, "are seriously concerned about this phenomenon."

I have spoken with a great many scientists over the years, and not a single one of them has voiced any concern about sudden-onset darkness.  Maybe they're keeping it secret because they don't want us laypeople getting scared, or something.

That being said, and even excluding the Pharaonic Plagues, the claim has been around for a while.  One of my favorite books growing up -- I still have my rather battered copy, in fact -- was Strangely Enough, by C. B. Colby, which deals with dozens of weird "Strange But True!" tales.  One of them, called "New England's Darkest Day," describes an event that allegedly occurred on May 19, 1780, in which pitch darkness fell on a sunny day.  Colby writes:

May 19 dawned as bright and clear as usual, except that there appeared to be a haze in the southwest.  (One town history reports that it was raining.)  This haze grew darker, and soon the whole sky was covered with a thick cloud which was traveling northeast rapidly.  It reached the Canadian border by midmorning.  Meanwhile the eastern part of New York, as well as Maine, New Hampshire, Rhode Island, Massachusetts, and Connecticut were becoming darker.

By one o'clock some sections were so dark that white paper held a few inches from the eyes couldn't be seen.  It was as dark as a starless night.  Apprehension soon turned to panic.  Schools were dismissed, and lanterns and candles were lighted in homes and along the streets...

That night the darkness continued, and it was noted that by the light of lanterns everything seemed to have a greenish hue.  A full moon, due to rise at nine, did not show until after 1 AM, when it appeared high in the sky and blood-red.  Shortly afterward stars began to appear, and the following morning the sun was as bright as ever, after fourteen hours of the strangest darkness ever to panic staunch New Englanders.

Surprisingly, there's no doubt this actually happened; as Colby states, it's recorded in dozens of town histories.  However, the actual cause isn't anything paranormal.  It was most likely a combination of dense fog and the smoke from a massive forest fire in what is now Algonquin Provincial Park in Ontario, which left evidence in the form of tree ring scars from the late spring of that year, precisely when the "Dark Day" occurred.  And, in fact, Colby conveniently doesn't mention that there are also reports in town histories that "the air smelled like soot" and after the sky cleared, some places (especially in New Hampshire) had layers of ash on the ground up to fifteen centimeters deep.

Kind of blows away the mystery, doesn't it?

Artist's depiction of the "Dark Day" [Image is in the Public Domain, courtesy of the New England Historical Society]

The Anomalien article isn't even on ground as firm as Colby's.  The majority of their accounts are single-person anecdotes; even the ones that aren't have very little going for them.  Take, for example, the case in Louisville, Kentucky, which they say is so certain "it's almost become a textbook" [sic].  On March 7, 1911, they say, a "viscous darkness" fell upon the entire city, lasting for an hour and resulting in massive panic.

Funny that such a strange, widespread, and terrifying event merited zero mention in the Louisville newspaper that came out only four days later.  You'd think it'd have been headline news.

That doesn't stop the folks at Anomalien from attributing the phenomenon to you-know-who:

Is it all aliens to be blamed?  Researchers... believe that unexpected pitch darkness occurs in the event of a violation of the integrity of space.  At such moments, it is possible to penetrate both into different dimensions and worlds, and out of them...  

Some researchers believe that the phenomenon of sudden pitch darkness is associated with the presence on earth of creatures, unknown to science, with supernatural abilities.  All these cryptids and other strange creatures enter our world through the corridors of pitch darkness.  And they seem to be more familiar with this phenomenon than we are.  They know when this passage will open, and they use it.  Only they do not immediately disappear along with the darkness, but wait for the next opportunity to return to their world.

Oh?  "Researchers believe that," do they?  I'll be waiting for the paper in Science.

Anyhow, there you have it.  Bonnet's Law in action.  I'm just as happy that the claim is nonsense; the sun's out right now, and I'm hoping it stays that way.  It's gloomy enough around here in early spring without aliens and cryptids and whatnot opening dimensional portals and creating "corridors of pitch darkness."  Plus, having creatures ("unknown to science, with supernatural abilities") bumbling about in the dark would freak out my dog, who is -- no offense to him intended, he's a Very Good Boy -- a great big coward.

So let's just keep the lights on, shall we?  Thanks.

****************************************



Wednesday, March 29, 2023

The biochemical symphony

Sometimes I run into a piece of scientific research that's so odd and charming that I just have to tell you about it.

Take, for example, the paper that appeared in ACS Nano that ties together two of my favorite things -- biology and music.  It has the imposing title, "A Self-Consistent Sonification Method to Translate Amino Acid Sequences into Musical Compositions and Application in Protein Design Using Artificial Intelligence," and was authored by Chi-Hua Yu, Zhao Qin, Francisco J. Martin-Martinez, and Markus J. Buehler, all of the Massachusetts Institute of Technology.  Their research uses a fascinating lens to study protein structure: converting the amino acid sequence and structure of a protein into music, then having AI software study the resulting musical pattern as a way of learning more about how proteins function -- and how that function might be altered.

What's cool is that the musical note that represents each amino acid isn't randomly chosen.  It's based on the amino acid's actual quantum vibrational frequency.  So when you listen to it, you're not just hearing a whimsical combination of notes based on something from nature; you're actually hearing the protein itself.
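If you want to see the shape of the idea in code, here's a toy sketch of my own (to be clear, this is an illustration, not the authors' actual method -- the frequencies below are arbitrary placeholders, whereas the real ones are derived from each amino acid's vibrational spectrum):

```python
# Toy sonification sketch: map each amino acid (one-letter code) to a pitch.
# NOTE: these frequencies are illustrative placeholders, NOT the actual
# vibrational frequencies used in the Yu et al. paper.
TOY_PITCHES = {
    "G": 261.63,  # glycine  -> C4 (placeholder)
    "A": 293.66,  # alanine  -> D4 (placeholder)
    "S": 329.63,  # serine   -> E4 (placeholder)
    "P": 349.23,  # proline  -> F4 (placeholder)
    "L": 392.00,  # leucine  -> G4 (placeholder)
}

def sonify(sequence: str) -> list[float]:
    """Turn an amino acid sequence into a list of pitches (Hz),
    skipping any residue we haven't assigned a placeholder pitch."""
    return [TOY_PITCHES[aa] for aa in sequence if aa in TOY_PITCHES]

# A glycine/alanine-rich fragment, the sort of repetitive motif found
# in spider silk proteins:
print(sonify("GAGAGS"))  # -> [261.63, 293.66, 261.63, 293.66, 261.63, 329.63]
```

The real method also folds in structural information (secondary structure, chord-like overlays), which is where the "orchestra" effect comes from; this sketch captures only the sequence-to-melody step.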

[Image licensed under the Creative Commons © Nevit Dilmen, Music 01754, CC BY-SA 3.0]

In an article about the research in MIT News, written by David L. Chandler, you can hear clips from the Yu et al. study.  I recommend the second one especially -- the one titled "An Orchestra of Amino Acids" -- which is a "sonification" of spider silk protein.  The strange, percussive rhythm is kind of mesmerizing, and if someone had told me that it was a composition by an avant-garde modern composer -- Philip Glass, perhaps, or Steve Reich -- I would have believed it without question.  But what's coolest about this is that the music actually means something beyond the sound.  The AI is now able to discern the difference between some basic protein structures, including two of the most common -- the alpha-helix (shaped like a spring) and the beta-pleated sheet (shaped like the pleats on a kilt) -- because they sound different.  This gives us a lens into protein function that we didn't have before.  "[Proteins] have their own language, and we don’t know how it works," said Markus Buehler, who co-authored the study.  "We don’t know what makes a silk protein a silk protein or what patterns reflect the functions found in an enzyme.  We don’t know the code."

But this is exactly what the AI, and the scientists running it, hope to find out.  "When you look at a molecule in a textbook, it’s static," Buehler said.  "But it’s not static at all.  It’s moving and vibrating.  Every bit of matter is a set of vibrations.  And we can use this concept as a way of describing matter."

This new approach has impressed a lot of people, not only for its potential applications but also for how amazingly creative it is.  This is why it drives me nuts when people say that science isn't a creative process.  They apparently have the impression that science is pure grunt work: inoculating petri dishes, looking at data from particle accelerators, analyzing rock layers.  But at its heart, the best science is about making connections between disparate ideas -- just like this research does -- and is as deeply creative as writing a symphony.

"Markus Buehler has been gifted with a most creative soul, and his explorations into the inner workings of biomolecules are advancing our understanding of the mechanical response of biological materials in a most significant manner," said Marc Meyers, professor of materials science at the University of California at San Diego, who was not involved in this work.  "The focusing of this imagination to music is a novel and intriguing direction.  This is experimental music at its best.  The rhythms of life, including the pulsations of our heart, were the initial sources of repetitive sounds that engendered the marvelous world of music.  Markus has descended into the nanospace to extract the rhythms of the amino acids, the building blocks of life."

What is most amazing about this is the potential for the AI, once trained, to run in reverse -- to be given an altered musical pattern, and from that to predict the function of a protein engineered from that music.  Proteins are perhaps the most fundamental pieces of living things; the majority of genes do what they do by making proteins, which then guide processes within the organism (including frequently affecting other genes).  The idea that we could use music as a lens into how our biochemistry works is kind of stunning.

So that's your science-is-so-freaking-cool moment for the day.  I peruse the science news pretty much daily, looking for intriguing new research, but this one's gonna be hard to top.  Now I think I'm going to go back to the paper and click on the sound links -- and listen to the proteins sing.

****************************************



Tuesday, March 28, 2023

Escaping the bottle

Two years ago, I wrote a post about the work of Nick Bostrom (of Oxford University) and David Kipping (of Columbia University) regarding the unsettling possibility that we -- and by "we," I mean the entire observable universe -- might be a giant computer simulation.

There are a lot of other scientists who take this possibility seriously.  In fact, back in 2016 there was a fascinating panel discussion (well worth watching in its entirety), moderated by astrophysicist Neil deGrasse Tyson, considering the question.  Interestingly, Tyson -- whom I consider to be a skeptic's skeptic -- was himself very accepting of the claim, and said at the end that if hard evidence is ever found that we are living in a simulation, he'll "be the only one in the room who's not surprised."

Other participants brought up some mind-boggling points.  The brilliant Swedish-American cosmologist Max Tegmark, of MIT, asked the question of why the fundamental rules of physics are mathematical.  He went on to point out that if you were a character inside a computer game (even a simple one), and you started to analyze the behavior of things in the game from within the game -- i.e., to do science -- you'd see the same thing.  Okay, in our universe the math is more complicated than the rules governing a computer game, but when you get down to the most basic levels, it still is just math.  "Everything is mathematical," he said.  "And if everything is mathematical, then it's programmable."

One of the most interesting approaches came from Zohreh Davoudi, also of MIT.  Davoudi is studying high-energy cosmic rays -- orders of magnitude more energetic than anything we can create in the lab -- as a way of probing the universe for what amount to glitches in the simulation.  It's analogous to the screen-door effect, a well-known phenomenon in visual displays, where (because there isn't sufficient resolution or computing power to give an infinitely smooth picture) images pixelate if you zoom in too far.  The same thing, Davoudi says, could happen at extremely high energies; since you'd need an infinite amount of information to simulate the behavior of particles on those scales, glitchiness in extreme conditions could be a hint that we're inside a simulation.  "We're looking for evidence of cutting corners to make the simulation run with less demand on memory," she said.  "It's one way to test the claim empirically."

The reason this comes up is a recent paper by Roman Yampolskiy (of the University of Louisville) called, simply, "How to Hack the Simulation?"  Yampolskiy springboards from the arguments of Bostrom, Kipping, and others -- if you accept that it's possible, or even likely, that we're in a simulation, is there a way to hack our way out of it?

The open question, of course, is whether we should.  As I recall from The Matrix, the world inside the Matrix was a hell of a lot more pleasant than the apocalyptic hellscape outside it.

Be that as it may, Yampolskiy presents a detailed argument about whether it's even possible to hack ourselves out of a simulation (and answers the question "yes").  Not only does he, like Tegmark, use examples from computer games, but he also describes an astonishing experiment I'd never heard of, in which the connectome (map of neural connections in the brain) of a roundworm, Caenorhabditis elegans, was uploaded into a robot body, which then was able to navigate its environment exactly as the real, living worm does.  (The more I think about this experiment, the more freaked out I become.  Did the robotic worm know it was in a simulated body?)

Evaluating the strength of Yampolskiy's technical arguments is a bit beyond me, but where it becomes really interesting, to me, is when he gets into concrete suggestions of how we could get a glimpse of the world outside the simulation.  One method, he says, is to get enormous numbers of people to do something identical and (presumably) easy to simulate, and then have them all simultaneously do something different.  He writes:

If, say, 100 million of us do nothing (maybe by closing our eyes and meditating and thinking nothing), then the forecasting load-balancing algorithms will pack more and more of us in the same machine.  The next step is, then, for all of us to get very active very quickly (doing something that requires intense processing and I/O) all at the same time.  This has a chance to overload some machines, making them run short of resources, being unable to meet the computation/communication needed for the simulation.  Upon being overloaded, some basic checks will start to be dropped, and the system will be open for exploitation in this period...  The system may not be able to perform all those checks in an overloaded state...  We can... try to break causality.  Maybe by catching a ball before someone throws it to you.  Or we can try to attack this by playing with the timing, trying to make things asynchronous.

Of course, the problem here is that it's damn near impossible to get a hundred people to cooperate and follow directions, much less a hundred million.

Another suggestion is to increase the demand on the system by creating our own simulation -- a possibility Bostrom and Kipping considered, that we could be in a near-infinite nesting of universes within universes.  Yampolskiy says the problem is computing power; even if we're positing a simulator way smarter than we are, there's a limit, and we might be able to exploit that:

The most obvious strategy would be to try to cause the equivalent of a stack overflow—asking for more space in the active memory of a program than is available—by creating an infinitely, or at least excessively, recursive process.  And the way to do that would be to build our own simulated realities, designed so that within those virtual worlds are entities creating their version of a simulated reality, which is in turn doing the same, and so on all the way down the rabbit hole.  If all of this worked, the universe as we know it might crash, revealing itself as a mirage just as we winked out of existence.

In which case the triumph of being right would be cancelled out rather spectacularly by the fact that we'd immediately afterward cease to exist.
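The failure mode he's describing, by the way, is one any programmer can reproduce in miniature.  Here's a toy sketch of my own (an illustration of the analogy, not anything from the paper): each "universe" spins up exactly one child simulation, recursing until the host runs out of stack:

```python
import sys

def nested_simulation(depth: int = 0) -> int:
    """Each 'universe' launches one child simulation, recursing forever.
    Returns the nesting depth at which the host runs out of resources."""
    try:
        return nested_simulation(depth + 1)
    except RecursionError:
        # The host's stack (Python's recursion limit) has been exhausted;
        # this innermost reachable universe reports how deep we got.
        return depth

sys.setrecursionlimit(1000)  # a small 'memory budget' for the host universe
depth_reached = nested_simulation()
print(f"Simulation stack gave out at depth {depth_reached}")
```

Python politely raises a `RecursionError` instead of crashing outright, but the point stands: the nesting can never exceed the resources of the outermost machine, which is exactly the limit Yampolskiy proposes to push against.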

The whole question is as fascinating as it is unsettling, and Yampolskiy's analysis is at least a start (along with more technical approaches like Davoudi's cosmic ray experiments) toward putting this on firmer scientific ground.  Until we can do that, I tend to agree with theoretical physicist Sylvester James Gates, of the University of Maryland, who criticizes the simulator argument as not being science at all.  "The simulator hypothesis is equivalent to God," Gates said.  "At its heart, it is a theological argument -- that there's a programmer who lives outside our universe and is controlling things here from out there.  The fact is, if the simulator's universe is inaccessible to us, it puts the claim outside the realm of science entirely."

So despite Bostrom and Kipping's mathematical argument and Tyson's statement that he won't be surprised to find evidence, I'm still dubious -- not because I don't think it's possible we're in a simulation, but because I don't believe that it's going to turn out to be testable.  I doubt very much that Mario knows he's a two-dimensional image on a computer monitor, for example; even though he actually is, I don't see how he could figure that out from inside the program.  (That particular problem was dealt with in brilliant fashion in the Star Trek: The Next Generation episode "Ship in a Bottle" -- where in the end even the brilliant Professor Moriarty never did figure out that he was still trapped on the Holodeck.)


So those are our unsettling thoughts for the day.  Me, I have to wonder why, if we are in a simulation, the Great Simulators chose to make this place so freakin' weird.  Maybe it's just for the entertainment value.  As Max Tegmark put it, "If you're unsure at the end of the day if you live in a simulation, go out there and live really interesting lives and do unexpected things so the simulators don't get bored and shut you down." 

Which seems like good advice whether we're in a simulation or not.

****************************************



Monday, March 27, 2023

The avalanche

I always give a grim chuckle whenever someone on the far right calls us liberals "snowflakes," because when it comes to taking offense over absolutely everything, there's nothing like a MAGA Republican.

If you think I'm overstating my case, you have only to look at what's currently happening in the state of Florida to see that if anything, I'm being generous.  The right-wing elected officials in Florida are so pants-wettingly terrified of any viewpoints other than their own Christofascist agenda that they don't even want anyone finding out there are people who think differently.

Take, for example, the school principal in Tallahassee who was forced to resign because she had the temerity to show students in the sixth grade a photograph of Michelangelo's David.

[Image licensed under the Creative Commons Jörg Bittner Unna, 'David' by Michelangelo Fir JBU005 denoised, CC BY-SA 3.0]

David was originally commissioned to be placed in Florence Cathedral.  In, to make it abundantly clear, a Christian house of worship.  But it was soon considered such a masterpiece of art that it was taken out -- and placed in the public square outside the Palazzo Vecchio, so it could be seen by everyone.

But now?  According to the elected officials of Florida, whose sensibilities haven't even caught up to the sixteenth century, we can't have sixth graders see a world-renowned piece of sculpture, evidently because then they'll find out that people have genitals.

Then there are the book bans.  Clay County School District just announced a new list of books that are officially banned from any school in the district, bringing the total up to 355.  Here are the new additions:


It doesn't take a genius to notice a pattern here.  Anything dealing with LGBTQ+ themes (Heartstopper, Radio Silence, One Man Guy), anything to do with the Black experience (Americanah, Notes from a Young Black Chef, Punching the Air, and Black Brother, Black Brother, among many others), anything criticizing Republicans (Russian Hacking in American Elections), and anything written by an outspoken liberal (The Fault in Our Stars, Slaughterhouse Five).

Apparently we can't have anyone finding out there's a world out there besides those who are straight, white, Christian conservatives.

You'd think if these people were as confident in the self-evident righteousness of their own beliefs as they claim to be, they wouldn't be so fucking scared of the rest of us.

I think the problem here is that we've allowed the purveyors of this narrow-minded, bigoted bullshit to portray themselves as the valiant defenders of the cause, instead of calling them what they are: craven cowards.  They are constantly, deeply fearful, afraid that any exposure to a view beyond their own tiny, terrified world will cause the entire thing to come crashing down like a house of cards.

It's pathetic, really.  No wonder so many of them carry assault rifles when they go to Walmart.

When it comes down to it, though, isn't all fascism about fear?  Why would you be so desperate to build an autocracy if you weren't afraid of dissent?  Yeah, there's the attraction of power and its perks, I get that; but really, the desperation to crush all opposing views is born from a deep-seated and terrified knowledge that if people find out there are other ways, they'll realize they've been lied to and start demanding scary stuff like free speech and free access to information.

So to Ron DeSantis and his cronies who are so determined to erase those of us who aren't like them: I'm sorry you're so bone-shakingly terrified.  I do feel bad for you, because it must be a horrible way to live.  But just because I pity you doesn't mean that I and the others like me are going to stand silent and let you erase us.  You want to fight?  Well, battle joined.

I think you're about to find out that a bunch of snowflakes together create an avalanche.

****************************************



Saturday, March 25, 2023

Myth come to life

While I've been known to make fun of the cryptid hunters, there's something to be said for their persistence.

Not only do we have people working hard to prove the continued existence of animals thought by science to be extinct -- most notably, the thylacine (Thylacinus cynocephalus) of southern Australia, which actually has a Facebook page devoted to sightings -- there are the devotees of animals science has never admitted in the first place, like Bigfoot, Nessie, and dozens of lesser-known denizens of myth and legend.

Despite my skepticism, no one would be more delighted than me if one of these elusive beasties turned out to be real.  Which is why I was so tickled when a friend and loyal reader of Skeptophilia sent me a link about a cryptid I'd never heard of -- the Corsican cat-fox -- which was just proven to be very real indeed.

The legend has been around for centuries; a wildcat in Corsica that is larger than your typical house cat, has rusty brown fur, and a long, ringed tail, notorious for raiding chicken coops.  Called in the Corsican language ghjattu-volpe -- "cat-fox" -- it was thought to be a myth.

It's not.  In an intensive effort to establish the legend's veracity, the ghjattu-volpe was found -- not only photographed, but captured for DNA sampling.


Genetic analysis has shown that its DNA is distinct from that of domestic cats, of wildcats in mainland Europe, and of wildcats on the neighboring island of Sardinia.

The fact that this animal stayed undetected for so long has left the locals saying "see, we told you so," and encouraged the absolute hell out of the proponents of other elusive animal claims.  Even so, I think some cryptids are unlikely in the extreme -- the Loch Ness Monster topping that list.  The idea that there is a breeding population of plesiosaurs in Loch Ness, which somehow survived the last ice age (during which that region of Scotland was under a thirty-meter-thick sheet of ice) and has gone undetected despite years of searching with sonar and other high-tech telemetry devices, strikes me as a little ridiculous.

However, I don't find anything inherently implausible about there being a large, elusive proto-hominid in the Pacific Northwest.  I lived in Seattle for ten years and spent my summers camping in the Cascades and Olympics, and man, that is some trackless wilderness up there.  Neither do I doubt the possibility of the survival of thylacines, ivory-billed woodpeckers, and various other thought-to-be-extinct species.

But "possible" and "not inherently implausible" don't equal "real."  I remain very much a "show me the money" type.  And that means more than just blurred photos and videos.  (To borrow a phrase from Neil deGrasse Tyson, Photoshop probably has an "add Bigfoot" button.)  Until there's hard evidence, I'm not going to be in the True Believer column.

Even so, I have to admit that the Corsican cat-fox certainly is encouraging to those of us who want to believe.

****************************************