Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Wednesday, May 24, 2023

Nerds FTW

There's a stereotype that science nerds, and especially science fiction nerds, are hopeless in the romance department.

I'd sort of accepted this without question despite being one myself, and happily married to a wonderful woman.  Of course, truth be told, said wonderful woman pretty much had to tackle me to get me to realize she was, in fact, interested in me, because I'm just that clueless when someone is flirting with me.  But still.  Eventually the light bulb appeared over my head, and we've been a couple ever since.

Good thing for me, because not only am I a science nerd and a science fiction nerd, I write science fiction.  Which has to rank me even higher on the romantically-challenged scale.

Or so I thought, till I read a study by Stephanie C. Stern, Brianne Robbins, Jessica E. Black, and Jennifer L. Barnes that appeared in the journal Psychology of Aesthetics, Creativity, and the Arts, entitled, "What You Read and What You Believe: Genre Exposure and Beliefs About Relationships."  And therein we find a surprising result.

Exactly the opposite is true.  We sci-fi/fantasy nerds make better lovers.

Who knew?  Not me, for sure, because I still think I'm kind of clueless, frankly.  But here's what the authors have to say:
Research has shown that exposure to specific fiction genres is associated with theory of mind and attitudes toward gender roles and sexual behavior; however, relatively little research has investigated the relationship between exposure to written fiction and beliefs about relationships, a variable known to relate to relationship quality in the real world.  Here, participants were asked to complete both the Genre Familiarity Test, an author recognition test that assesses prior exposure to seven different written fiction genres, and the Relationship Belief Inventory, a measure that assesses the degree to which participants hold five unrealistic and destructive beliefs about the way that romantic relationships should work.  After controlling for personality, gender, age, and exposure to other genres, three genres were found to be significantly correlated with different relationship beliefs. Individuals who scored higher on exposure to classics were less likely to believe that disagreement is destructive.  Science fiction/fantasy readers were also less likely to support the belief that disagreement is destructive, as well as the belief that partners cannot change, the belief that sexes are different, and the belief that mindreading is expected in relationships.  In contrast, prior exposure to the romance genre was positively correlated with the belief that the sexes are different, but not with any other subscale of the Relationships Belief Inventory.
Get that?  Of the genres tested, the sci-fi/fantasy readers score the best on metrics that predict good relationship quality.  So yeah: go nerds.

As Tom Jacobs wrote about the research in The Pacific Standard, "[T]he cliché of fans of these genres being lonely geeks is clearly mistaken.  No doubt they have difficulties with relationships like everyone else.  But it apparently helps to have J. R. R. Tolkien or George R. R. Martin as your unofficial couples counselor."

Tolkien?  Okay.  Aragorn and Arwen, Celeborn and Galadriel, even Sam Gamgee and Rose Cotton -- all romances to warm the heart.  But George R. R. Martin?  Not so sure if I want the guy who crafted Joffrey Baratheon's family tree to give me advice about who to hook up with.

One other thing I've always wondered, though, is how book covers affect our expectations.  I mean, look at your typical romance cover, which shows a gorgeous woman wearing a dress from the Merciful-Heavens-How-Does-That-Stay-Up school of haute couture, being seduced by a gorgeous shirtless guy with a smoldering expression who exudes so much testosterone that small children go through puberty just by walking past him.  Now, I don't know about you, but no one I know actually looks like that.  I mean, I think the people I know are nice enough looking, but Sir Trevor Hotbody and Lady Viola de Cleevauge they're not.

Of course, high fantasy isn't much better.  There, the hero always has abs you could crack a walnut against, and is raising the Magic Sword of Wizardry aloft with arms that give you the impression he works out by bench pressing Subarus.  The female protagonists usually are equally well-endowed, sometimes hiding the fact that they have bodily proportions that are anatomically impossible by being portrayed with pointed ears and slanted eyes, informing us that they're actually Elves, so all bets are off, extreme-sexiness-wise.

And being chased by a horde of Amazon Space Women in Togas isn't exactly realistic, either.  [Image is in the Public Domain]

So even if we sci-fi nerds have a better grasp on reality as it pertains to relationships in general, you have to wonder what those covers do to our body image.  Like we need more to feel bad about in that regard; between Victoria's Secret and Abercrombie & Fitch, it's a wonder that any of us are willing to go to the mall without wearing a burqa.

But anyhow, that's the news from the world of psychology.  Me, I find it fairly encouraging that the scientifically-minded are successful at romance.  It means we have a higher likelihood of procreating, and heaven knows we need more smart people in the world these days.  It's also nice to see a stereotype shattered.  After all, as Oliver Wendell Holmes said, "No generalization is worth a damn.  Including this one."

****************************************



Tuesday, May 23, 2023

Discarded genius

Way back in 1952, British mathematician and computer scientist Alan Turing proposed a mathematical model to account for pattern formation that results in (seemingly) random patches -- something observed in manifestations as disparate as leopard spots and the growth patterns of desert plants.

Proving that this model accurately reflected what was going on, however, was more difficult.  It wasn't until three months ago that an elegant experiment using thinly-spread chia seeds on a moisture-poor growth medium showed that Turing's model predicted the patterns perfectly.

"In previous studies,” said study co-author Brendan D’Aquino, who presented the research at the March meeting of the American Physical Society, "people kind of retroactively fit models to observe Turing patterns that they found in the world.  But here we were actually able to show that changing the relevant parameters in the model produces experimental results that we would expect."

Honestly, it shouldn't have been surprising.  Turing's genius was unparalleled; the "Turing pattern" model is hardly the only brainchild of his that is still bearing fruit, almost seventy years after his death.  His research on the halting problem -- figuring out if it is possible to determine ahead of time whether a computer program designed to prove the truth or falsity of mathematical theorems will reach a conclusion in a finite number of steps -- generated an answer of "no" and a paper that mathematician Avi Wigderson called "easily the most influential math paper in history."  Turing's work in cryptography is nothing short of mind-blowing; he led the research that allowed the deciphering of the incredibly complex code produced by Nazi Germany's Enigma machine, a feat that was a major contribution to Germany's defeat in 1945.
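If you want to see why the answer is "no," by the way, the heart of the argument fits in a few lines.  What follows is the standard textbook sketch of the contradiction, written as modern Python rather than anything resembling Turing's own formalism:

```python
# The classic sketch of why a general halting-checker can't exist.
# Standard textbook diagonal argument -- not code from Turing's paper.

def halts(program_source: str, program_input: str) -> bool:
    """Hypothetical oracle: True if the program eventually stops on the
    given input, False if it runs forever.  Turing proved no such
    function can exist."""
    raise NotImplementedError

def paradox(program_source: str) -> None:
    """Feed a program its own source, then do the opposite of whatever
    the oracle predicts."""
    if halts(program_source, program_source):
        while True:      # oracle said "halts," so loop forever
            pass
    return               # oracle said "loops forever," so halt immediately

# Now ask whether paradox halts when handed its own source code.  If the
# oracle says yes, paradox loops forever; if it says no, paradox halts.
# Either way the oracle is wrong -- so the oracle cannot exist.
```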

A monument to Alan Turing at Bletchley Park, where the cryptographic team worked during World War II [Image licensed under the Creative Commons Antoine Taveneaux, Turing-statue-Bletchley 14, CC BY-SA 3.0]

Turing's colleague, mathematician and cryptographer Peter Hilton, wrote the following about him:
It is a rare experience to meet an authentic genius.  Those of us privileged to inhabit the world of scholarship are familiar with the intellectual stimulation furnished by talented colleagues.  We can admire the ideas they share with us and are usually able to understand their source; we may even often believe that we ourselves could have created such concepts and originated such thoughts.  However, the experience of sharing the intellectual life of a genius is entirely different; one realizes that one is in the presence of an intelligence, a sensibility of such profundity and originality that one is filled with wonder and excitement.  Alan Turing was such a genius, and those, like myself, who had the astonishing and unexpected opportunity, created by the strange exigencies of the Second World War, to be able to count Turing as colleague and friend will never forget that experience, nor can we ever lose its immense benefit to us.

Hilton's words are all the more darkly ironic when you find out that two years after the research into pattern formation, Turing committed suicide at the age of 41.

His slide into depression started in January 1952, when his house was burgled.  The police, while investigating the burglary, found evidence that Turing was in a relationship with another man, something that was illegal in the United Kingdom at the time.  In short order Turing and his lover were both arrested and charged with gross indecency.  After a short trial in which Turing refused to argue against the charges, he was found guilty, and avoided jail time only by agreeing to a hormonal treatment nicknamed "chemical castration," designed to destroy his libido.

It worked.  It also destroyed his spirit.  The "authentic genius" who helped Britain win the Second World War, whose contributions to mathematics and computer science are still the subject of fruitful research today, poisoned himself to death in June of 1954 because of the actions taken against him by his own government.

How little we've progressed in seven decades.

Here in the United States, state after state is passing laws discriminating against queer people, denying gender-affirming care to trans people, legislating what is and is not allowable based not upon any real concrete harm done, but on thinly-veiled biblical moralism.  The result is yet another generation growing up having to hide who they are lest they face the same kind of soul-killing consequences Alan Turing did back in the early 1950s.

People like Florida governor Ron DeSantis and Texas governor Greg Abbott, who have championed this sort of legislation, seem blind to the consequences.  Or, more likely, they know the consequences and simply don't give a damn how many lives this will cost.  Worse, some of their allies actually embrace the potential death toll.  At the Conservative Political Action Conference in March, Daily Wire host Michael Knowles said, "For the good of society… transgenderism must be eradicated from public life entirely.  The whole preposterous ideology, at every level."

No, Michael, there is no "ism" here.  It's not an "ideology;" it's not a political belief or a religion.  What you are saying is "eradicate transgender people."  You are advocating genocide, pure and simple.

And so, tacitly, are the other people who are pushing anti-LGBTQ+ laws.  Not as blatantly, perhaps, but that's the underlying message.  They don't want queer people to be quiet; they want us erased.

I can speak first-hand to how devastating it is to be terrified to have anyone discover who you are.  I was in the closet for four decades out of shame, not to mention fear of the consequences of being out.  When I was 54 I finally said "fuck it" and came out to friends and family; I came out publicly -- here at Skeptophilia, in fact -- five years after that.  

I'm one of the lucky ones.  I had nearly uniform positive responses.

But if I lived in Florida or Texas?  Or in my home state of Louisiana?  I doubt very much whether I'd have had the courage to speak my truth.  The possibility of dire consequences would have very likely kept me silent.  In Florida, especially -- I honestly don't know how any queer people or allies are still willing to live there.  I get that upping stakes and moving simply isn't possible for a lot of people, and that even if they could all relocate, that's tantamount to surrender.  But still.  Given the direction things are going, it's a monumental act of courage simply to stay there and continue to fight.

It's sickening that we are still facing these same battles.  Haven't we learned anything from the example of a country that discarded the very genius who helped them to defeat the Nazis, in the name of some warped puritanical moralism? 

This is no time to give up out of exhaustion, however, tempting though it is.  Remember Turing, and others like him who suffered (and are still suffering) simply because of who they are.  Keep speaking up, keep voting, and keep fighting.  And remember the quote -- of uncertain origin, though often misattributed to Edmund Burke -- "All that is necessary for the triumph of evil is that good people do nothing."

****************************************



Monday, May 22, 2023

Dawn life

Currently I'm working my way through Mark McMenamin's book The Garden of Ediacara, an analysis of the fossil evidence from the Vendian Period, the last bit of the Precambrian (650-543 million years ago).

The subject of McMenamin's book is undeniably fascinating -- more about that in a moment -- but it's uneven reading.  Part of it is a travelogue of his work in Namibia, Mexico, and Australia, places where there are significant outcrops of late Precambrian sedimentary rocks, but it's obvious from page one that most of what he does is write papers for scholarly journals.  As a result, it's halfway between an introduction to the topic for laypeople and an extended academic paper, and I've been glad as I worked my way through it that I have at least a passing background in paleontology.

Something that struck me right away, however, was that I've been laboring under a serious misunderstanding about the Ediacaran biota: that it overlapped significantly with the Cambrian explosion fauna, the bizarre creatures like Anomalocaris and Opabinia and the aptly-named Hallucigenia.  In reality, there was almost no overlap, and the Ediacaran organisms such as Cloudina and Dickinsonia were almost certainly driven to extinction and replaced by the large predatory forms of the early Cambrian.

A fossil of Dickinsonia costata from Australia [Image licensed under the Creative Commons Verisimilus at English Wikipedia, DickinsoniaCostata, CC BY-SA 3.0]

While the early Cambrians (best known from the Burgess Shale formation of British Columbia) are clearly animals, the bizarre Ediacarans are of completely uncertain affinities.  When McMenamin wrote his book (1998) there was considerable contention about what they were, with various paleontologists arguing vehemently that they were early animals, fungi, algae, or even giant protists (or protist colonies).  Despite the passage of twenty-five years, the issue is still far from settled.  Some make persuasive arguments that the Vendian biota doesn't belong to any of the five modern kingdoms of life (animals, plants, fungi, bacteria, and archaea), but instead represents a completely different lineage, or more than one, that left no descendants at all.

So I'm grateful to McMenamin and his book for clearing up something I'd misunderstood for years.

I was in the middle of reading The Garden of Ediacara when, coincidentally, a friend and frequent contributor of topics for Skeptophilia sent me a link to an article in Smithsonian magazine about the evolutionary origin of animals.  Another point of contention amongst biologists is determining, out of the entire kingdom Animalia, which group branched off first.  (This is sometimes phrased as which is the "oldest" or "most primitive" -- both terms I don't like, because every living animal on Earth has an exactly equal length of evolutionary history.  It's just that during that time, some branches have changed a great deal faster than others, and some groups share more recent common ancestry than others do.)

In any case, the argument is about which group of modern animals is the outgroup -- the one that split off first, and therefore is the most distantly related to all other animals.  When I took zoology (many, many years ago) the conventional wisdom was that it was sponges (Phylum Porifera).  And there's certainly a good case to be made there; sponges are weird animals, with no differentiated organs, skeletons made of either protein fibers, bits of calcium carbonate, or slivers of glass, and no nerves, muscles, or digestive tracts.  But genetic analysis has shown unequivocally that there's an even more distantly-related group -- the comb jellies (Phylum Ctenophora).

They look superficially like jellyfish, and that similarity led scientists to put them on the same branch as Phylum Cnidaria (which not only contains jellyfish, but sea anemones and corals).  The genetic studies, though, show that there's only a distant relationship between comb jellies and jellyfish.  The comb jellies, in fact, show more of a genetic similarity to certain species of protists than they do to other animals.

"That was the smoking gun," said Daniel Rokhsar, of the University of California - Berkeley, who co-authored the paper.

So this goes to show that there's a lot we still have to learn about the earliest life on our planet.  And I'm sure that as definitive as this study seems to be, it won't be the last word.  As more evidence surfaces, expect the arrangement to change.  This, after all, is how science works; it has a built-in mechanism for self-correction.  And far from the reaction I've seen people have -- that the shifting understanding means "it could all be proven wrong tomorrow" -- that capacity for change is science's main strength.

After all, isn't it a good thing to have your model shift to accommodate new information?  Seems like standing firm on what you believe despite strong evidence to the contrary is the cause of a lot of the problems in the world.

****************************************



Saturday, May 20, 2023

Raw nonsense

Despite the fact that our modern lifestyle has increased our life expectancy to longer than it's ever been in the history of humanity, romanticizing the practices of the past is still ridiculously widespread.

People who claim that "everything causes cancer" conveniently ignore two things: first, that a good many forms of cancer would decline dramatically if we'd do things doctors recommend, like cutting out tobacco and getting vaccinated against HPV; and second, that one of the reasons cancer rates have climbed is that we're no longer dying of other stuff, like diphtheria, typhoid, measles, and smallpox.

But that kind of thinking seldom makes any inroads into the minds of people committed to anti-vaxx (or completely anti-medical) propaganda.  The levels of irrationality some of this thinking reaches are truly staggering.  I had one person comment on one of my posts -- in all apparent seriousness -- "my great-grandma never got vaccinated against anything, and she survived."

Well, of course she did.  If she'd died at age three of diphtheria, she wouldn't have been your great-grandma, now would she?

How about asking great-grandma how many of her siblings and cousins died of childhood infectious diseases -- like my grandfather's two oldest sisters, Marie-Aimée and Anne-Désée, who died of measles five days apart, at the ages of 22 and 16?

The person who posted that comment should win some sort of award for compressing the greatest number of fallacies into the shortest possible space.  Confirmation bias, cherry-picking, anecdotal evidence, and the post hoc fallacy, all in nine words.  Kind of impressive, actually.

Despite all this, there are huge numbers of people who want to return to what our distant ancestors did, claiming that it's "healthier" or "more natural," conveniently neglecting the fact that back then, as Thomas Hobbes so trenchantly put it, "life was solitary, poor, nasty, brutish, and short."

The result is the kind of thing I ran into in an article in Ars Technica last week about a trend I hadn't heard of, which is to drink "raw water."  "Raw water," as you might guess from the name, is water that hasn't been filtered or treated, but is collected (or even bottled and sold) right from a spring or river or whatnot.  And predictably, nineteen people fell ill with a diarrheal disease (specifically, infection with the bacterium Campylobacter jejuni) when their trendy "natural spring water" turned out to be just ordinary runoff from a creek drainage that had been contaminated by bacteria from bird nests.


The amount of pseudoscience you run into with this stuff is astonishing.  In researching this topic, I found people who claim that "industrially-processed water" (i.e. most tap water) has "mind-control drugs" in it, designed to turn us all into Koolaid-drinkin' sheeple, and even one that said treatment plants deliberately "alter the molecular structure of water, turning it into a toxin."

Making me wonder how, or if, these people passed high school chemistry.

I spent the summers during my twenties and thirties back-country camping in the Cascades and Olympics, and I know how careful you have to be.  The clearest bubbling mountain brook can be contaminated with nasty stuff like Giardia and Salmonella, two pathogens that should be high on the list of germs you never want to have inside you.  I used iodine sterilizing tablets for all the water I drank -- and I never got sick.  But I knew people who did, and as one of them vividly described it, "Having Giardia means that for three weeks you're going to be on a first-name basis with your toilet."

Which is funny until you find out that in the process, he lost twenty pounds and spent three days in the hospital hooked up to an IV so he could stay hydrated.

Look, I know our high-tech world isn't perfect.  I know about pesticides and herbicides and industrial contamination and coverups and food additives with dubious health effects.  My wife and I try as hard as we can to eat locally-sourced organic meat and produce, not to mention growing our own vegetables.  But the admittedly true statement that technology and the pharmaceuticals industry have created some problems does not equate to "therefore we should jettison everything they provide and return to the Stone Age."

Speaking of fallacies, there's another one for you: the package-deal fallacy.  You get into this stuff, it reads like the "what not to do" section of a critical thinking textbook.

So if you're inclined to switch over to "raw water," just don't.  Drinking water is treated for a reason.  Our Stone Age ancestors didn't have such great lives, and idealizing their world as some kind of idyllic Garden of Eden is complete horse shit.

Horse shit ironically being one of the things that might well be in your "raw water."

****************************************



Friday, May 19, 2023

Mapping our world

My novel The Scattering Winds is the second of a trilogy, of which the first book (In the Midst of Lions) is scheduled to be out this summer.  The setting of the trilogy is the Pacific Northwest.  In the first book, there's a worldwide collapse of civilization.  In the second, set six hundred years later, what's left of humanity has reverted to a new Dark Ages, mostly non-literate and non-technological.  In the third (The Chains of the Pleiades), six hundred years after that, technology and space flight have been re-invented -- along with all the problems that brings.

The main character of the second book, Kallian Dorn, comes from a people who have lost the knowledge of reading, committing all of their culture's memory to the mind of one person, called the Guardian of the Word.  But when they find a girl from a distant town, a refugee, who knows the rudiments of reading and writing, they recognize what's been lost, and struggle, slowly, to reclaim it.  Kallian undertakes a voyage, on foot, to the girl's home town -- and finds there a mostly-intact library from what he calls "the Before Times."

The following takes place when Kallian, who by this time has learned the basics of how to read, discovers a room full of maps in the library:

He went into the first room he encountered. It was labeled “Maps.”  Holding the lamp aloft, he passed into a room filled with odd cabinets, most of which had very wide, shallow drawers.  The nearest one said, “North America,” and he set the lamp down to open the top drawer.

Sitting on top was a yellowed piece of paper, about an arm’s length wide and tall, with a drawing of… what was it?  He peered closer, and read the inscription at the top, written in an ornate, curly script he could barely decipher.  It said, “United States of America, The Year of Our Lord 1882.”  There were names written in smaller, but equally frilly, lettering, and they gave him enough information to conclude that it was a drawing of a land, as if seen from above.  The faded blue bits were bodies of water: Lake Ontario.  The Caribbean Sea.  The Atlantic Ocean.  The green parts—well, they were only green in splotches, mostly they had faded to a yellowish-brown—were land.  He saw features like “Appalachian Mountains” and “Great Plains” and “Mississippi Delta.”  The land was divided by oddly artificial-looking black lines, some dead straight, others following natural features such as the course of rivers.  Each of the blocks thus delineated had a strange and unfamiliar name: Massachusetts.  New York.  Georgia.  Kentucky.

Had these been kingdoms of the Before Time?

1882—if he was correct about what the date-numbers signified, this would have been about a century and a half before the collapse, before the floods and plagues that had ended the old world.  And a full 750 years before now.

But where was this United States of America, with its bizarrely-named mountains and lakes and kingdoms?  Without a referent, without having an arrow on the map saying “You are here,” he had no way to know if it was a day’s march away or on the other side of the world.

He flipped through the maps in those and other cabinets, handling them carefully to keep the age-worn paper from crumbling in his hands.  His mind was overwhelmed with how many different lands there were—whole cabinets devoted to maps from places called Europe, Africa, Asia, Australia.  But even looking at them, as fascinating as it was, was not like reading the books he’d found, where meaning provided an anchor to keep him fastened to reality as he knew it.  Without a key, the maps gave him no way to tell scale or location of anything.  Learning to read had unlocked one type of cipher; here was an entirely different kind, one where even though he could read the words, they didn’t make sense.

I was reminded of this scene when I read an article yesterday in Science News about archaeologists who believe they've discovered the oldest-ever aerial-view scale drawings -- in other words, maps.  There are structures in the Middle East nicknamed "kites" that were huge stone-walled enclosures used to trap animals like gazelles, funneling their movements toward waiting hunters.  And a team of archaeologists working in Jordan and Saudi Arabia have found nine-thousand-year-old engravings on stones that appear to be maps of nearby kites -- perhaps made by people strategizing how best to use them in their game-harvesting efforts.

Map-making, when you think about it, is kind of an amazing accomplishment.  It requires changing your perspective, picturing what some thing -- a city, a body of water, a country, an entire continent -- would look like from above.  And even if old maps look pretty inaccurate to our modern eyes, now that we can see what things actually look like from the air, it's important to remember that the mapmakers did it all by surveying from ground (or sea) level.

And given that, they did pretty damn well, I think.

A map of the world, ca. 1565 [Image is in the Public Domain]

The fact that we were doing this nine thousand years ago is kind of astonishing.  Intrepid folks, our ancestors.

So many of the things we do today, and consider "modern," have far deeper roots than we realize.  And this ability to shift perspective, to consider what things would look like from another angle, is something we've had for a very long time -- even if to someone like Kallian Dorn, the results look very like magic.

****************************************



Thursday, May 18, 2023

Not magic

Can I beg scientists to please please puhleeeez stop giving names to scientific phenomena that induce the woo-woos to have multiple orgasms?

It's bad enough that the woo-woos already take perfectly reasonable terms like "frequency" and "vibration" and "resonance" and "quantum" and define them any damn way they want, and interpret the Heisenberg Uncertainty Principle as meaning "we're not certain about anything, therefore my noodling around is just as likely to be true as whatever Stephen Hawking came up with."  But some of the names the scientists themselves have come up with are just egging the woo-woos on, with the same result as my putting a pair of juicy pork chops on the floor in front of my dogs and turning my back.

Then acting all surprised when 3.8 milliseconds later, there's nothing but contented chewing noises.

This sort of thing happens way too often.  I suspect that a lot of it has to do with an honest desire to give laypeople some sort of simple, catchy phrase to hang onto.  After all, a lot of physics -- and the problem does seem to occur most often in physics -- is hard and math-y, so the real models themselves are often relatively inaccessible to people who haven't been trained in the field.  (And even some who have.  Despite my bachelor's degree in physics, 95% of academic papers in physics lose me after the first paragraph.  If I even get that far.)

But for fuck's sake, let's learn from our mistakes, okay?  Look at what happened when Murray Gell-Mann introduced a new quantum characteristic, and dubbed it "strangeness."  There's nothing especially strange about strangeness, or at least, it's no stranger than the rest of the quantum model.  Actually, it's just a quantum number, one that's conserved in strong nuclear and electromagnetic interactions but not in weak ones, which is why certain "strange" particles decay more slowly than you'd otherwise expect.  But the name stuck, and the "strange quark" has gotten the woo-woos going, lo unto this very day.

There's no better example, however, than the infamous "God Particle."  This moniker was given to the Higgs boson by physicist Leon Lederman because of its ability to interact with and influence any particle with mass, but Lederman himself quickly realized what a misstep this was.  He later said he regretted not calling it the "goddamn particle," but admitted this probably wouldn't have gotten past his editors.

The whole annoying subject comes up because of an article this week in Phys.org about a new mathematical model that may account for the chaotic, high-energy behavior and information loss that occur near black holes.  Spacetime around a black hole's event horizon is so strongly warped by the enormous mass that standard formulations tend to break down, and simulating the region in a model has proven to be extremely complex.

[Image licensed under the Creative Commons Event Horizon Telescope, Black hole - Messier 87 crop max res, CC BY 4.0]

So Kanato Goto, Tomoki Nosaka, and Masahiro Nozaki, of RIKEN Interdisciplinary Theoretical and Mathematical Sciences, have come up with a new model to account for the complicated behavior at the boundary of a black hole.  It uses a mathematical measure of how difficult a quantum system is to simulate on a classical (non-quantum) computer.  Figuring it into their calculations, Goto et al. were able to show that near the boundary of a black hole, systems will evolve to maximize this measure -- i.e., to become maximally complex and chaotic.

So far, so good, right?  But I haven't told you what this mathematical measure is called.

It's called magic.

When I saw this, I said, and I quote, "Oh, fuck."

In fact, the Phys.org article is titled, "Quantum 'Magic' Could Help Explain the Origin of Spacetime."  At least they had the good sense to put Magic in quotation marks.

Not that it'll help.  I predict that there will be articles on woo-woo websites popping up all over the place claiming that scientists are finally admitting that the whole universe is magical.  Citing the headline, probably without the relevant quotation marks, and conveniently ignoring the contents of the actual article.  "See?  We were right!  It is all magic!  So crystals and homeopathy and astrology and quantum vibrations of love and everything else we've been babbling about for decades!"

Confirmation Bias "R" Us, baby. 
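For the record, the "magic" in question is just a number you can calculate.  One standard way to quantify it is something called the stabilizer Rényi entropy -- I won't swear that's precisely the measure Goto et al. use, but here's a toy single-qubit version, just to show there's nothing occult about it:

```python
# Toy illustration: "magic" (nonstabilizerness) of a single qubit, measured
# by the stabilizer 2-Renyi entropy.  Zero for easy-to-simulate stabilizer
# states, positive otherwise.  Illustrative only -- not the authors' actual
# calculation.
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]])

def magic_m2(psi):
    """Stabilizer 2-Renyi entropy of a pure single-qubit state."""
    d = len(psi)
    total = sum(np.abs(np.vdot(psi, P @ psi)) ** 4 for P in (I, X, Y, Z))
    return -np.log2(total / d)

zero_state = np.array([1, 0], dtype=complex)                    # a stabilizer state
t_state = np.array([1, np.exp(1j * np.pi / 4)]) / np.sqrt(2)    # the "T state"

print(magic_m2(zero_state))   # ~0.0   -- no magic, classically easy
print(magic_m2(t_state))      # ~0.415 -- nonzero magic
```

No incense, no crystals; just linear algebra.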

Anyhow, I guess anything I say isn't going to offset this trend to give purely scientific stuff goofy and/or eye-catching names.  But as far as quantum magic -- without the quotation marks -- goes, let me end this rant with a quote from the inimitable Tim Minchin: "Throughout history, every mystery ever solved has turned out to be... not magic."

****************************************



Wednesday, May 17, 2023

Likes attract

A bit over 23 years ago, a friend introduced me to a woman she'd known since they were toddlers together.  I was recovering from an unpleasant divorce, trying to adjust to being a single dad, and (honestly) was pretty lonely.  The friend told me we'd get along great -- mutual interests in music, birdwatching, gardening, and travel.

"Two such similar people should definitely get to know each other," she said.

Despite the fact that even on a good day, I raise social awkwardness to the level of performance art, I got up my gumption, called her up, and asked her out.  Sure enough, we hit it off brilliantly.  That summer, we went with some friends on a three-week trip to Iceland.  After a few more adventures big and small, we decided to make it permanent.  We're still together.

Carol and me in Cornwall in 2015

And our friend was right; we are really similar.  We practically finish each other's sentences sometimes.  And I can't keep track of the number of times one of us has said something random, and the other has responded in shock, "I was just going to say that."

Some new research out of Boston University suggests a reason why the old adage of "opposites attract" might not be that accurate.  We're attracted to people who are like us, usually (at least at first) on the basis of one or two standout traits -- like birdwatching and gardening -- because of self-essentialist reasoning.  This is the idea that our core being is shaped by our passions and our dislikes, and when we find someone who resonates with us on some of those, we assume they'll share our other personality traits, as well.

That we'll be "soulmates."

"If we had to come up with an image of our sense of self, it would be this nugget, an almost magical core inside that emanates out and causes what we can see and observe about people and ourselves," said Charles Chu, who co-authored the study.  "We argue that believing people have an underlying essence allows us to assume or infer that when we see someone who shares a single characteristic, they must share my entire deeply rooted essence, as well."

The problem is, that thinking has a flaw.  You can share one or two deep connections, and still be different on a whole lot of other things, including some important ones -- maybe even some that are deal-breakers.  "We are all so complex," Chu said.  "But we only have full insight into our own thoughts and feelings, and the minds of others are often a mystery to us.  What this work suggests is that we often fill in the blanks of others' minds with our own sense of self and that can sometimes lead us into some unwarranted assumptions."

With Carol and me, for example, there's the still-baffling disconnect we have over books.  With a very few exceptions -- The Hitchhiker's Guide to the Galaxy is one, and the Discworld novels of Terry Pratchett -- I can nearly guarantee that if I love a book, she won't, and vice-versa.  Even with authors we both like (for example, Christopher Moore), we don't resonate with the same books.  She loved Fluke and I thought it was too weird and implausible, even by Moore's standards, to work; I found Coyote Blue brilliant and it's probably her least favorite of Moore's books.  (At least we agree on A Dirty Job and its sequel, Secondhand Souls, which are flat-out genius.)

Fortunately, the fact that Carol thinks my all-time favorite book, Umberto Eco's Foucault's Pendulum, is a total snooze-fest wasn't enough to make either of us reconsider our choice in a partner.  I can't imagine how hard it must be to click with someone over one thing, and then find that there are deep and irreconcilable differences in something potentially divisive, such as politics, religion, or morality.  But even so, it's worth getting past our tendency to self-essentialist reasoning.  After all, it's when we encounter, and stay connected with, people who aren't like us that we tend to learn the most.  That applies to friends as well as romantic liaisons; one of my best friends, the wonderful author Gil Miller (speaking of books you definitely need to read, you should check his out as soon as you're done reading this) is pretty different from me in a lot of ways, but we've formed a close friendship founded on a deep mutual respect and an understanding that both of us base our beliefs on thoughtful consideration -- and are willing to entertain the possibility of changing our minds.

And maybe that's what it boils down to: respect, willingness to listen, and an understanding that we might actually not be right about everything.  As author Robert Fulghum put it, "Don't believe everything you think."

In any case, the recent research does shed some light about how connections form in the first place.  The mutual friend who introduced Carol and me certainly got it spot-on.  And even if we can't agree about what books to read, it's good to know we still have lots in common, 23 years later.

****************************************