Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Tuesday, August 2, 2022

Death, with big nasty pointy teeth

One of the biggest mysteries in paleontology is what caused the Cambrian Explosion.

You probably know that the Cambrian Explosion is when, around 538.8 million years ago, all of the basic body plans of modern animals appeared in a relative flash.  Before that, there were various simple and soft-bodied forms; afterward, there were animals that were clearly arthropods, annelids (segmented worms), mollusks, echinoderms, corals, nematodes, and proto-vertebrates.

There were also a number of groups whose relationships to better-known lineages are uncertain, and which went extinct by the end of the Cambrian Period.  One of the weirdest is Opabinia:

[Image licensed under the Creative Commons Nobu Tamura (http://spinops.blogspot.com), Opabinia BW2, CC BY 3.0]

This kind of rapid diversification is usually an indication that something drastic has changed.  One event that sometimes causes this is a large extinction -- leaving behind open niches that the survivors can adapt to fill.  But there appears to have been no major extinction immediately prior to the Cambrian Explosion.

One of the most plausible explanations has its basis in the observation that a lot of the new forms had fossilizable parts -- shells, exoskeletons, teeth, stiff fins and tails adapted for rapid swimming, and so on.  Most of these durable body parts are either defensive or offensive in nature.  So perhaps the Cambrian Explosion was triggered when formerly scavenging species realized they didn't have to wait for their friends and neighbors to die to have dinner, and predation was invented.  At that point, there's a hell of a selective pressure for said friends and neighbors to develop structures that protect them from being on the day's menu -- or that turn them into predators themselves.

That theory about the origins of the Cambrian Explosion got a significant boost with the recent discovery of a fossil in Charnwood Forest, near Leicester, England -- the oldest clearly predatory animal known.  It dates to 560 million years ago, about twenty million years before the spike in biodiversity.

It's a relative of modern sea anemones, and was christened Auroralumina attenboroughii, "Attenborough's dawn lantern," after naturalist David Attenborough, who said he was "truly delighted" by the honor.  


[Image licensed under the Creative Commons F. S. Dunn, C. G. Kenchington, L. A. Parry, J. W. Clark, R. S. Kendall & P. R. Wilby, Auroralumina attenboroughii reconstruction, CC BY-SA 4.0]

It's wildly inaccurate to say that "this is the species that caused the Cambrian Explosion," but it certainly is suggestive that predators evolved not that long before the burst in biodiversity began.  "It’s generally held that modern animal groups like jellyfish appeared 540 million years ago, in the Cambrian Explosion, but this predator predates that by twenty million years," said Phil Wilby of the British Geological Survey, who co-authored the study.  "It’s the earliest creature we know of to have a skeleton.  So far we’ve only found one, but it’s massively exciting to know there must be others out there, holding the key to when complex life began on Earth."

It's amazing to think of what the Earth was like back then.  The only life was in the sea, and the vast continents were nothing but bare rock and sand without a single living thing anywhere.  Into that world was born an animal that was one of the first of its kind, a predatory beast with a protective skeleton to make sure that it wouldn't get turned into lunch itself -- launching the evolution of a dizzying array of structures that allowed for fleeing, attacking, and self-protection, including all of the big, nasty, pointy teeth we see in predatory animals today.

****************************************


Monday, August 1, 2022

The thoughtographer

Twice a year, a nearby town has a Friends of the Library used book sale that has become justly famous all over the region.  It features a quarter of a million books, runs for three weeks, and raises tens of thousands of dollars.  On the first day -- when the true rarities and collectibles are available -- the line to enter starts to form four hours before the doors open, and stretches all the way around the block.

I'm not quite such a fanatic, but it is still one of the high points of my year.  I've picked up some real gems there.  This year's take included the "cult bestseller" (says so right on the cover), Ghosts: True Encounters With the World Beyond by Hans Holzer, which is massive both in popularity and in actual weight.

If you're at all familiar with the field of parapsychology, you've probably heard of Holzer.  He was one of the principal investigators into the famous Amityville Horror (alleged) haunting.  He wrote over a hundred books, mostly on the supernatural and the occult, and for years taught courses in parapsychology at the New York Institute of Technology.  Throughout his life -- and it was a long one; he died in 2009 at age 89 -- he was a vociferous believer in the paranormal, and an equally strident denouncer of skeptics and scoffers.

Still, given my interest in beliefs in the supernatural, picking up a copy of this book for a couple of bucks was irresistible.  I'm glad to say it does not disappoint.  Besides collecting hundreds of "true tales of ghosts and hauntings," Holzer isn't shy about saying what he thinks about the doubters:
To the materialist and the professional skeptic -- that is to say, people who do not wish their belief that death is the end of life as we know it to be disturbed -- the notion of ghosts is unacceptable.  No matter how much evidence is presented to support the reality of the phenomena, these people will argue against it and ascribe it to any of several "natural" causes.  Delusion or hallucination must be the explanation, or perhaps a mirage, if not outright trickery.  Entire professional groups that deal in the manufacturing of illusions have taken it upon themselves to label anything that defies their ability to reproduce it artificially through trickery or manipulation as false or nonexistent.  Especially among photographers and magicians, the notion that ghosts exist has never been popular.
There's a reason for that last bit, of course.  Photographers and magicians know how easy it is to fool people and create effects that look absolutely real.  It's not a coincidence that perhaps the most famous debunker, James Randi, was a professional stage magician before he dedicated his life to going after people like Sylvia Browne, Peter Popoff, and Uri Geller.

This paragraph (and the many others like it scattered throughout the book) shows that Holzer didn't really understand the definition of the word "skeptic."  Skeptics have the highest regard for evidence; in fact, it's the only thing that really convinces us.  But once it does, that's that.  Skeptics are able to say, "Well, I guess I was wrong, then," and turn on a dime if presented with reliable evidence.  However, that word "reliable" is usually the sticking point.  Holzer's compendium is chock-full of what he considers evidence, but it consists either of anecdotal accounts by people like "Mary G." and "John S.", or else of demonstrations of the supernatural that are clearly explainable by the "natural causes" Holzer scoffs at.

The result is that he uncritically fell for people who were clearly frauds, and afterward staunchly stood by his assessment, a practice that was criticized by an article in the Journal of the Society for Psychical Research as "cast(ing) considerable doubt on the objectivity and reliability of his work as a whole."  One of the most egregious examples is his endorsement of the alleged abilities of the man who became known as "The Thoughtographer," Ted Serios.

Serios claimed to be able to use an ordinary camera outfitted with something he called a "gizmo" -- effectively, nothing more than a cardboard tube -- which was then aimed at his forehead.  He then (he said) sent his "thought energy" into the camera, and when the film was developed, it would have an image of what he was thinking about.

Ted Serios in 1967 [Image was released into the Public Domain by photographer Jule Eisenbud]

First, let's see what Holzer has to say about Serios:
A few years ago, Dr. Jules [sic] Eisenbud of the University of Colorado at Denver startled the world with his disclosures of the peculiar talents of a certain Ted Serios, a Chicago bellhop gifted with psychic photography talents.  This man could project images into a camera or television tube, some of which were from the so-called future.  Others were from distant places Mr. Serios had never been to.  The experiments were undertaken under the most rigid test conditions.  They were repeated, which was something the old-line scientists in parapsychology stressed over and over again.  Despite the abundant amount of evidence, produced in the glaring limelight of public attention and under strictest scientific test conditions, some of Dr. Eisenbud's colleagues at the University of Colorado turned away from him whenever he asked them to witness the experiments he was conducting.  So great was the prejudice against anything Eisenbud and his colleagues might find that might oppose existing concepts that men of scientists couldn't bear to find out for themselves.  They were afraid they would have to unlearn a great deal.
What Holzer conveniently fails to mention is that Serios required another "gizmo" -- a second, smaller tube with a lens at one end.  The other end contained a piece of an old 35-mm film slide, and when the flash went off, the image from the slide was projected right into the camera aperture.  It was small enough to be concealed in the palm of Serios's hand.

A magic trick, in other words.  Sleight-of-hand.

Serios's claims came to the attention of none other than the aforementioned James Randi, who invited Jule Eisenbud, Serios himself, and any other interested parties to come watch him up on stage -- where he replicated Serios's trick flawlessly.  Eisenbud afterward said he was "flabbergasted"; Serios gave a "wan smile" and wouldn't comment.

No mention of that in Holzer's book, either.

Look, I don't really blame Eisenbud for getting suckered; it's not like I wouldn't have been taken in, either.  We've all watched talented stage magicians do their thing and said, in bafflement, "How in the hell...?"  What I do blame Eisenbud for, though, is not pursuing it further -- telling Serios, "Okay, you need a 'gizmo'?  Tell me how it's made, and I'll make one for you -- show me you can do your trick without any props of your own construction."  Now, I also have to admit that working with Serios can't have been easy.  He was clearly mentally ill.  In Nile Root's book Thoughtography, about the Serios case, the author writes:
Ted Serios exhibits a behavior pathology with many character disorders.  He does not abide by the laws and customs of our society.  He ignores social amenities and has been arrested many times.  His psychopathic and sociopathic personality manifests itself in many other ways.  He does not exhibit self-control and will blubber, wail and bang his head on the floor when things are not going his way.

He exhibits strong hostility toward figures of authority, such as policemen and scientists.  He is an alcoholic and in psychic experiments he has been encouraged toward the excessive use of alcohol.  He has demonstrated the symptoms of a manic-depressive with manic episodes.  In one hypermaniacal period he acted like a violent madman and could not be restrained.

He often becomes profane and raging, completely reckless.  While depressed he ignores other people, has a far-away look and is disenchanted with everything.  He is always bored with talk unless it is about him. He often imagines himself a hero, and sometimes identifies with a violent known personality.  He also exhibits sadistic behavior, for example by embarrassing Dr. Eisenbud once, giving as his own Dr. Eisenbud's name and his profession (a psychiatrist) when arrested.

In spite of the questionable research methods and the personality quirks of Serios, a number of Denver professional men believed Ted Serios was a psychic, with a unique power to record his thoughts with a Polaroid camera.
So I can see that it wouldn't have been any fun to try and force Serios to conform to adequate scientific control protocols.  Not that this excuses Eisenbud, though; he made the claim, so saying "Serios is impossible to control" doesn't release him from the duty to observe proper experimental procedure before publishing any results.

Holzer, though?  He ignored the overwhelming evidence that Serios was a fraud, claiming instead that there was "abundant amount of evidence, produced in the glaring limelight of public attention and under strictest scientific test conditions."  Which is not so much a dodge as it is a flat-out falsehood.  And that, to me, is inexcusable.

And another thing -- Holzer mischaracterizes skeptics and scientists in another way, one that shows he didn't understand the scientific process at all.  He describes scientists as clinging to their preconceived notions even in the face of evidence, as if the entire scientific edifice were threatened by new data, and the researchers themselves determined to sit back and keep the same understanding of the universe they'd had all along.  The truth is, science depends on finding new and puzzling information; that's how science progresses.  Now, scientists are human, and you can find many examples of people clutching their favorite model with both hands even when the contradictory evidence comes rolling in.  (A good example is how long it took the plate tectonics/continental drift model to be accepted.)  But then it's incumbent upon the scientist making the extraordinary claim to produce such incontrovertible evidence that the opposition has no choice but to acquiesce -- which is exactly what happened when Drummond Matthews and Frederick Vine proved seafloor spreading and plate movement beyond a shadow of a doubt.

The truth is that finding new evidence that modifies or overturns a previous model is how careers are made in science.  As astrophysicist Neil deGrasse Tyson put it, "Journalists are always writing articles with headlines that say, 'Scientists have to go back to the drawing board.'  As if we scientists are sitting in our offices, our feet up on the desk, masters of the universe, then suddenly... oops!  Somebody discovered something!  No, we're always back at the drawing board.  If you're not at the drawing board, you're not making discoveries.  You're not doing science."

In my own case, I'm certainly a skeptic, even if I'm not a scientist but only a humble layperson.  And I can say without any hesitation that I would love it if there was hard evidence for the paranormal, and of life after death in particular.  Can you imagine how that would change our understanding of the world, and of ourselves?  Plus the added benefit of knowing that death wasn't the end of us.  Me, I'm not particularly fond of the idea of nonexistence; an afterlife would be awesome, especially if it involved a tropical climate, hammocks, and drinks with little umbrellas.

But be that as it may.  I still find Holzer's book entertaining, at least the parts with the actual ghost stories.  The diatribes about the evil skeptics and narrow-minded scientists, not so much.  It'd be nice to see more of the collaborative efforts to investigate paranormal claims, such as the ones done by the Society for Psychical Research.

But just saying "science is ignoring the evidence," and then presenting evidence that is clearly spurious, is not helping the parapsychologists' claims at all.

****************************************


Saturday, July 30, 2022

First, do no harm

I keep thinking I'm not going to need to write any more about LGBTQ issues, that I've said all I need to say, and yet... here we are.

I'm going to start with a question directed at the people who are so stridently against queer rights, queer visibility, even queer existence.  I doubt many people of that ilk read Skeptophilia, but you never know.  So here goes:

How does guaranteeing that LGBTQ people are treated equally, fairly, and kindly, and are given the same rights as straight people, affect you at all?

It costs you absolutely nothing to say, "I'm not like you, and maybe I don't even understand this part of you, but even so, I respect your right to be who you are without shame or fear."  For example, I'm not trans; I have always felt unequivocally that I am one hundred percent male.  But when I had trans students in my classes, all it required was my crossing out a first name on the roster and writing in the name they'd prefer to be known by, and remembering to use the appropriate pronouns.  A minuscule bit of effort on my part; hugely, and positively, significant on theirs.

What possible justification could I have for refusing?

The reason this whole topic comes up once again is a link sent to me by a loyal reader of Skeptophilia about a rugby team in Australia, the Manly Sea Eagles, which had seven of its players refuse to play in an important match because the owner wanted the team to wear jerseys with a rainbow design meant to promote inclusivity.


Note that the jerseys are pretty subtle.  There's not even any text explaining, or calling attention to, the rainbow bands.  But even that level of support was a bridge too far for seven homophobic bigots, who chose to stand down from the game instead.

The whole incident is made even more outrageous by the fact that the owner didn't want the jersey design changed permanently; it was for one damn game, as a sign of solidarity with LGBTQ players and fans.  But no, seven players refused to wear them, saying it violated their "religious beliefs."

Retired Sea Eagles player Ian Roberts, who is the first rugby league player to come out publicly as gay, was devastated by the players' actions.  "I try to see it from all perspectives, but this breaks my heart,” Roberts said.  "It’s sad and uncomfortable.  As an older gay man, this isn’t unfamiliar.  I did wonder whether there would be any religious pushback.  That’s why I think the NRL have never had a Pride round.  I can promise you every young kid on the northern beaches who is dealing with their sexuality would have heard about this."

Matt Bungard, of Wide World of Sports, was blunter still.  "I don’t want to hear one single thing about ‘respecting other people’s opinions’ or using religion as a crutch to hide behind while being homophobic.  No issues playing at a stadium covered in alcohol and gambling sponsors, which is also a sin.  What a joke."

Which I agree with completely, but it brings me back to my initial question: how did wearing the jerseys, for one night, harm those seven players?  The jerseys didn't say, "Hi, I'm Gay."  They were just a sign of support and inclusivity, of treating others the way you'd like to be treated.

Hmm, now where did I hear about that last bit?  Seems like I remember someone famous saying that.  Give me a moment, I'm sure it'll come to me.

A Christian baker creating a wedding cake for a gay couple is saying, "I may not be gay, but I'm happy you've found someone you love and want to spend your life with."  Straight parents who give unconditional support to their trans child are saying, "I love you no matter what, no matter who you are and what you'd like to be called."  A straight teacher having books with queer representation is saying, "Even if I don't experience sexuality like you do, I want you to understand yourself and be happy and confident enough to express your own truth openly."


I remember thinking, when I first saw this tweet, "How about creating a world where if Billy did wake up and go ask Brad to the prom, it would be no big deal?"  It costs cis-hetero people nothing, zilch, to say, "I'm fine with who you are."  And to queer kids, it would be life-changing.  Heaven knows, my life would have been different if I'd been able to ask Brad to the prom.

Not you specifically, Brad.  I'm just making a point.

Really, all it requires is the ability to say (1) "Your experience is not the same as my experience, and that's okay," and (2) "I'm committed to treating everyone with kindness, respect, and love."

Instead, far too many people are still choosing bigotry, exclusion, and oppression.  And here in the United States, there is an increasing push to codify all that hatred into law.

If you're against same-sex marriage, if you bristle at Pride events, if you refuse to use a person's chosen name and pronouns, if you think businesses should be able to deny services to queer people, I want you to stop, just for one moment, and ask yourself: how is any of this harming me?  Maybe it's time to pay more attention to the "love thy neighbor" parts of the Bible than to the Book of Leviticus, of which (face it) 99% is ignored by most Christians anyway.  Maybe it's time to put more emphasis on compassion, understanding, and acceptance than on condemning anyone who doesn't think, act, or believe like you do.

After all, Jesus said it himself, in the Gospel of John, chapter thirteen: "By this all people will know that you are my disciples: if you have love for one another."

****************************************


Friday, July 29, 2022

The cost of fraud

My Aunt Florence, my mother's older sister, died of Alzheimer's disease.

Her children, especially my cousin Linda, took care of her as she slowly declined during the last fifteen years of her life.  She finally died in 2008 at the age of ninety, and by that time there was little left of her but a physical shell.  She was unresponsive, the higher parts of her brain destroyed by the agonizing progression of this horrible illness.  She went from being a bright, inquisitive, vital woman, an avid reader who did crossword puzzles in ink and could beat the hell out of me in Scrabble, to being... gone.

My Aunt and Uncle in better times

During this ordeal I lived fifteen hundred miles away, so I wasn't confronted every day by the terrible reality of what Alzheimer's does, both to the people suffering it and to their families.  Even so, it was my aunt's face I kept picturing while I was reading an article in Neoscope sent to me by a friend -- all the while getting angrier and angrier.

If you've kept up at all with the research on Alzheimer's, you're probably familiar with the words beta amyloid.  It's a short-chain protein, whose function is unknown, which allegedly is directly toxic to nerve cells (and can cause other proteins to misfold, suggesting an etiology similar to that of Creutzfeldt-Jakob syndrome, better known as "mad cow disease").  A great deal of money and time has been spent investigating the role of beta amyloid in Alzheimer's, and in developing drugs that interfere with its production -- significantly, not a single one of which has been shown to slow down the progression of the disease, much less reverse it.

It turns out this is no coincidence.  There is good evidence that the often-cited papers on the topic by Sylvain Lesné -- who wrote convincingly that a specific beta amyloid species, Aβ*56, was the culprit in the devastating destruction you see in Alzheimer's sufferers -- were based on faked data.

Not even well-faked, either.  The images Lesné included from "Western blot" experiments -- a commonly used separation technique for detecting specific proteins in mixtures -- were cut-and-pasted, something that can be seen not only in faint cut lines in the images, but in the fact that the bands in the photographs have clearly been duplicated and moved (i.e., if you look at the edges of the bands, several of them have identically shaped edges -- something that would be next to impossible in an actual Western blot).
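If you're curious how that kind of duplication gets caught, here's a minimal sketch of the underlying idea -- slide a suspect band's pixel patch across the image and flag any other spot that matches almost perfectly.  The code is mine (Python with NumPy); the function name and the 0.999 threshold are arbitrary choices for illustration, not anything from the actual investigation:

import numpy as np

def find_near_duplicates(image, patch, threshold=0.999):
    """Scan a grayscale image (2-D array) for regions that correlate
    almost perfectly with a given patch.  Real blot bands are noisy,
    so a correlation of essentially 1.0 between two different spots
    is a red flag for cut-and-paste."""
    ph, pw = patch.shape
    p = (patch - patch.mean()) / (patch.std() + 1e-9)
    matches = []
    for i in range(image.shape[0] - ph + 1):
        for j in range(image.shape[1] - pw + 1):
            window = image[i:i + ph, j:j + pw]
            w = (window - window.mean()) / (window.std() + 1e-9)
            ncc = (p * w).mean()  # normalized cross-correlation, -1 to 1
            if ncc >= threshold:
                matches.append((i, j, ncc))
    return matches

A real analysis would skip the patch's own location and use something like scikit-image's match_template, which computes the same correlation far more efficiently -- but the principle is just this.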

It's a devastating finding.  As to how the hell fraud like this got past peer review, biochemist Derek Lowe writes in Science:

The Lesné stuff should have been caught at the publication stage, but you can say that about every faked paper and every jiggered Western blot.  When I review a paper, I freely admit that I am generally not thinking “What if all of this is based on lies and fakery?”  It’s not the way that we tend to approach scientific manuscripts.  Rather, you ask whether the hypothesis is a sound one and if it was tested in a useful way: were the procedures used sufficient to trust the results and were these results good enough to draw conclusions that can in turn be built upon by further research?  Are there other experiments that would make things stronger?  Other explanations that the authors didn’t consider and should address?  Are there any parts where the story doesn’t hang together?  If so, how would these best be fixed?

There is a good-faith assumption behind all these questions: you are starting by accepting the results as shown.  But if someone comes in with data that have in fact been outright faked, and what’s more, faked in such a way as to answer or forestall just those sorts of reviewing tasks, there’s a good chance that these things will go through, unfortunately.
Lesné's apparently fraudulent research doesn't invalidate the whole beta amyloid hypothesis; other, independent studies support the toxic effects of beta amyloid on nerve cells, and have shown there's beta amyloid present in damaged cells.  But Lesné's contention that Aβ*56 was causative of Alzheimer's was apparently a blind alley -- and the presence of the protein in the neurons of Alzheimer's sufferers could as well be a result of the disease as a cause.

What concerns me about this kind of thing, though, is the damage it does to the scientific enterprise as a whole.  Here in the United States, in the last twenty years, we've been dealing with the effects of a surge of anti-science propaganda on a number of fronts, most notably anthropogenic climate change and the efficacy of vaccines.  When highly cited, widely publicized work like Lesné's is shown to be based on fraudulent data, it gives more ammunition to the crowd who is already shrieking about how the scientists are making it all up to get grant money and are fundamentally untrustworthy.

And as my friend who sent me the link pointed out, there is a profit motive involved in science.  The publish-or-perish model of scientific research means that the competition for grant money is intense and often cutthroat.  Producing publishable results doesn't just get you funded, it is also the key to tenure-track research positions and the stability and prestige that come with them.  Don't get me wrong, I still strongly believe that 99% of scientists are rigorously ethical, and the vast majority of research is reliable; but when you set the system up to punish negative results, it gives the unscrupulous a hell of an incentive to cheat, and the naysayers yet another opportunity to tar all scientists with one brush.

But what haunts me most in this case is the human cost.  This disease destroys lives, and does it in a slow, agonizing, dehumanizing way.  The idea that falsified data may have led researchers down a fruitless rabbit hole, wasting huge amounts of time and money while people suffered and died, is horrifying.  I keep picturing my Aunt Florence's face, as she languished for years, her brain decaying while her body lived on, and wonder how the people who perpetrate such fraud can sleep at night.

****************************************


Thursday, July 28, 2022

The scent of memory

There are two very specific scents that will always remind me of my beloved grandma, with whom I lived for a year and a half when I was eight or nine years old.

One is the flowers of the sweet olive tree.  Sweet olive is a small tree with glossy leaves and little cream-colored flowers whose fruity, spicy scent is a bit reminiscent of fresh peaches and vanilla.  My grandma had a beautiful sweet olive, and when it flowered in early summer, it perfumed the entire yard.

The other is the smell of old books.  When I lived with my grandma my bedroom was in the attic, a maze-like twist of rooms and alcoves with sloped ceilings, a wooden floor, and shelves laden with what seemed to my young eyes like thousands of books.  That dusty, dry smell when you turn the cover of a book that hasn't been opened in decades immediately brings me back to that happy time that was, in a lot of ways, like an oasis of calm and happiness in an otherwise troubled and turbulent childhood.

[Image licensed under the Creative Commons Lin Kristensen from New Jersey, USA, Timeless Books, CC BY 2.0]

I suspect a lot of you can relate to my experience of having scents trigger memories, but it's hardly a new observation.  This phenomenon was the impetus for Marcel Proust's Remembrance of Things Past, which begins with the protagonist having a memory triggered by the smell and taste of madeleine cake and tea.  What's been less obvious is why smell has such a strong link to memory -- for many people, stronger than any of the other senses.

Two recent papers, one in the journal Cell and the other in Nature, have elucidated why that might be.  One reason is that the sense of smell is so specific -- we have over four hundred different types of olfactory sensors, each responsive to different molecules, and those sensors are connected through the olfactory bulb of the brain directly to the hippocampus (a major memory center) and the parts of the cerebral cortex involved in storage of memories.

There's a lot we still don't understand, however.  In part, our responsiveness to scents seems to have an epigenetic component -- the phenomenon wherein traits can be heritable without involving alterations in the DNA.  For example, the grandchildren of mice who were given mild electric shocks when exposed to the scent of cherry blossoms have an aversive reaction to the odor even though they were never shocked themselves.  (This may sound like a Lamarckian "inheritance of acquired characteristics" model, and in a way, it is; but epigenetic effects usually happen because the trait involves hormonal alteration of the rate of gene expression, which can affect the children -- and grandchildren -- without the actual DNA being changed.)

What it immediately made me wonder is how this affects the experience of animals with way more sensitive noses -- like dogs.  Dogs' big snouts have fifty times more olfactory receptors than our puny little noses do.

"'Big Nose'?  Who you callin' 'Big Nose,' bub?"

Not only that, a dog's olfactory processing center is forty times bigger relative to the rest of its brain than yours is.  So how does that affect what neuroscientist David Eagleman calls its umwelt -- the sensory world it inhabits?  Imagine if the olfactory landscape you were immersed in every time you took a breath was as vivid as the visual and auditory landscapes most of us experience without even thinking about it.

No wonder my dogs both sniff me thoroughly whenever I come home after being away.  Who knows what information they're gleaning about where I've been, who I've come into contact with, and *gasp* what other dogs I may have said hi to along the way?

We're just beginning to parse how our sensory processing centers integrate with the other parts of the brain, and solve the old question of why (and how) the olfactory sense is such a powerful trigger to sometimes long-buried memories.  Scientists are even considering the possible use of scents as a way to decondition traumatic memories in PTSD sufferers -- if there are smells with positive, reassuring connections, being exposed to those while suffering from the resurgence of trauma might blunt the edge of the pain.  In one study, smelling an odor with pleasant associations (in this case, fresh coffee) while recounting a memory of a traumatic experience significantly lowered the emotional distress of the test subjects.

It will be fascinating to see where this research goes, as we untangle layer after layer of the complex network that allows us to perceive our world and recall what we experience.  And, perhaps, explain how after fifty years, there's still a link in my brain between sweet olive flowers, old books, and my grandma's face.

****************************************


Wednesday, July 27, 2022

Redefining the primitive

One of the most interesting, and persistent, misconceptions about evolutionary biology revolves around the use of the word "primitive" to describe certain life forms.

The misunderstanding goes back to Aristotle, really.  The great philosopher proposed a concept usually known by its Latin name, scala naturae -- the "scale of nature" -- and often called "the great chain of being."  The idea is that life has progressed up some sort of ladder of complexity, starting with something like bacteria, then upwards through jellyfish and worms and bugs and fish and amphibians and reptiles and "lower" mammals, finally arriving at us, who (of course) being the pinnacle of creation, stand proudly at the top of the ladder.

The problem with this, as with many misconceptions, is that in some ways it's kinda sorta almost true.  Something like today's bacteria were the first life forms, and the progression of fish > amphibian > reptile > mammal is pretty well established.  The problems start when you look at life forms earlier than fish; during the famous Cambrian Explosion, most of the phyla of animals branched off and diversified in a relative flash, including not only the ones we have around today but also a number of oddball groups that didn't survive.  So with respect to most modern groups of species, that smooth progression up the ladder didn't actually happen.

The problem gets worse when you try to apply the word "primitive" to current life forms.  Is a bug more primitive than a human?  Both are alive right now; each of them has exactly the same length of ancestral history.  It doesn't even work if you link "primitiveness" with "complexity."  Humans and bugs are both complex organisms; they're just complex in different ways.  Certainly, there's a difference in intelligence; most humans are smarter than most bugs.  But intelligence doesn't equal evolutionary success.  By just about any measure, insects are by far the most successful animals on the Earth.

It recalls the famous anecdote about the illustrious biologist J. B. S. Haldane.  Haldane was a zoologist but also rather notorious as an outspoken atheist, and religious people used to go to his talks to heckle him about it.  At one, during the question-and-answer period, a woman asked, "Professor Haldane, what have your studies in biology told you about the nature of God?"

Haldane thought for a moment, and finally said, "All I can say, ma'am, is that he must have an inordinate fondness for beetles."

In any case, you have to be extraordinarily careful how you apply the word "primitive."  In biology it's now used to describe traits (not entire organisms), with a very specific, restricted meaning, defined as "a trait that is shared with the ancestral form."  An example is the vascular tissue -- the internal plumbing -- in plants, which is "advanced" as compared to the trait of "lacking vascular tissue."  Vascular plants evolved from non-vascular ones, so apropos of that trait, mosses (which lack vascular tissue) are primitive as compared to ferns (which have vascular tissue). 

But it still requires caution, because it's all too easy to assume that "primitive traits are less complex" or (worse) that "if an organism is like humans, that means it's advanced," neither of which is true.  For example, take a look at the paper published last week in The American Naturalist by a team from the University of Washington, which questions the notion of primitiveness with respect to something most of us take for granted -- reproduction in mammals.

There are three basic modes of reproduction in Class Mammalia.  A couple of modern species are oviparous -- egg-laying (the monotremes, namely the echidna and the platypus).  A second group is the marsupials, which give birth to extremely altricial (undeveloped) young, because the mothers have no placentas to interface between themselves and their babies.  Once the offspring get too big (which isn't very big at all; kangaroos are about two centimeters long at birth), they are born, and develop the rest of the way in the mother's pouch.  The third group is the placentals, such as ourselves (and every mammal you've ever heard of other than the monotremes and marsupials).

Egg-laying certainly is a primitive trait; it's pretty clear that the reptilian ancestors of the earliest mammals were oviparous.  But what about the presence of a placenta?  Once again, the danger is in assuming that it's the "advanced trait" because (1) humans are placentals, (2) there are currently more placentals than marsupials, and (3) somehow the placental method "seems more complicated."  These are all more like smug self-congratulation than they are science, and none are reliable indicators of the primitiveness of a trait.

The current study, in fact, suggests that the placental mode of reproduction may predate the marsupial method -- by a lot.  The researchers studied the odd multituberculates, a group of mammals that were amongst the first to diversify significantly, way back in the Jurassic Period around 170 million years ago.  They were some of the most common mammals for a very long time, finally going extinct about 35 million years ago (for reasons unknown). 

The multituberculate Sunnyodon notleyi [Image licensed under the Creative Commons FunkMonk (Michael B. H.), Sunnyodon, CC BY-SA 3.0]

The salient point here is that the marsupial mammals, including the extinct ones that have been studied, have a very distinctive pattern of bone growth that is connected to their being born so incredibly undeveloped.  A careful analysis of multituberculate bones shows they're a great deal more similar to today's placentals -- despite the fact that they branched off from the rest of Class Mammalia way earlier than the marsupials did.

So it looks like the little multituberculates had placentas and long gestation periods, and our mode of reproduction is actually the primitive one.  Meaning the marsupial lineage lost the ability to form a placenta, rather than our lineage gaining it.  Why that happened isn't known; but as we've seen, a trait doesn't need to be complex to give its owner a selective advantage.  Perhaps in marsupials, the draw on the mother's resources is lowered enough by giving birth early that it allows her a better shot at surviving -- but that's pure speculation.

Whatever it is, both modes function perfectly well.  "Evolution," as biologist Richard Dawkins put it, "is the law of 'whatever works.'"

And it all reinforces the notion that there is no "great chain of being" -- there's just an enormous tangled web, of which we are a single strand.

****************************************


Tuesday, July 26, 2022

Seeing through the fog

There's something a little unsettling about the idea that when you're looking outward in space, you're looking backward in time.

If it seems like we're seeing things as they actually are, right now, it's only because (1) the speed of light is so fast, and (2) most of the objects we look at and interact with are relatively close by.  Even the Sun, though, which in astronomical terms is right on top of us, is eight light-minutes away, meaning that the light leaving its surface takes eight minutes to cross the 150 million kilometers between it and us.  If the Sun were suddenly to go dark -- not, mind you, a very likely occurrence -- we would have no way of knowing it for eight minutes.
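For the record, that eight-minute figure is just distance divided by speed.  Here's the back-of-the-envelope arithmetic as a quick Python check, using the round numbers above:

# Light travel time from the Sun to the Earth.
distance_km = 150_000_000        # mean Earth-Sun distance, ~1 AU
c_km_per_s = 299_792.458         # speed of light
t_seconds = distance_km / c_km_per_s
print(f"{t_seconds:.0f} s = {t_seconds / 60:.1f} minutes")  # ~500 s, ~8.3 min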

The farther out you go, the worse it gets.  The nearest star to the Solar System, Proxima Centauri, is about 4.2 light years away.  So the awe-inspiring panorama of stars in a clear night sky is a snapshot of the past.  Some of the stars you're looking at (especially the red supergiants like Antares and Betelgeuse) might actually already have gone supernova, and that information simply hasn't gotten here yet.  None of the stars we see are in exactly the positions they appear to occupy right now.

Worst of all is when you look way out, as the James Webb Space Telescope is currently doing, because then you have to account not only for distance, but for the fact that the universe is expanding.  And it hasn't expanded at a uniform rate.  Current observations support the inflationary model, which says that between 10^-36 and 10^-32 seconds after the Big Bang the universe expanded by a factor of around 10^26.  This seems like a crazy conjecture, but it immediately solves two perplexing problems in observational astronomy.
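To put that factor in the units cosmologists usually quote -- "e-folds," or doublings by a factor of e -- the conversion is one line of arithmetic.  This little Python check is mine, using only the round numbers above:

import math

expansion_factor = 1e26            # rough inflationary expansion factor
duration_s = 1e-32 - 1e-36         # roughly 10^-32 seconds, all told
e_folds = math.log(expansion_factor)
print(f"about {e_folds:.0f} e-folds in ~{duration_s:.0e} seconds")  # ~60 e-folds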

The Carina Nebula, as photographed by the James Webb Space Telescope [Image is in the Public Domain courtesy of NASA/JPL]

The first one, the horizon problem, has to do with the homogeneity of space.  Look as far out into space as you can in one direction, then do the same thing in the opposite direction, and what you'll see is essentially the same -- the same distribution of matter and energy.  The difficulty is that those two points are causally disconnected; they're far enough apart that light hasn't had time to travel from one to the other, and therefore no mechanism of communication can exist between them.  By our current understanding of information transfer, once causally disconnected, always causally disconnected.  So if something set the initial conditions in point A, how did point B end up with identical conditions if they've never been in contact with each other?  It seems like a ridiculous coincidence.

The other one is the flatness problem, which has to do with the geometry of space-time.  This subject gets complicated fast, and I'm a layperson myself, but as far as I understand it, the gist is this.  The presence of matter warps the fabric of space locally (as per the General Theory of Relativity), but what is its overall geometry?  From studies of such phenomena as the cosmic microwave background radiation, it seems like the basic geometry of the universe as a whole is perfectly flat.  Once again, there seems to be no particular reason to expect that could occur by accident.
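If you want the standard textbook version of why that's so improbable (this is the conventional Friedmann-equation argument, not anything from the paper discussed below), it comes down to one relation:

\[ \Omega - 1 = \frac{k}{(aH)^2} \]

Here Ω is the ratio of the universe's actual density to the critical density, k is the curvature constant, a is the scale factor, and H is the Hubble parameter.  In a radiation- or matter-dominated universe, aH shrinks with time, so any early deviation from Ω = 1 grows relentlessly; for the universe to look flat today, Ω must have started out equal to 1 to dozens of decimal places.  Inflation turns the argument around: it multiplies aH by that enormous factor, crushing Ω - 1 toward zero with no fine-tuning required.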

Both these problems are taken care of simultaneously by the inflationary model.  The horizon problem disappears if you assume that in the first tiny fraction of a second after the Big Bang, the entire universe was small enough to be causally connected, but during inflation the space itself expanded so fast that it carried pieces of it away faster than light can travel.  (This is not forbidden by the Theories of Relativity; matter and energy can't exceed the speed of light, but space-time itself is under no such stricture.)  The flatness problem is solved because the inflationary stretching smoothed out any wrinkles and folds that were in space-time at the moment of the Big Bang, just as taking a bunched-up bedsheet and pulling on all four corners flattens it out.

All of this will be facing some serious tests over the next few years as we get better and better at looking out into the far reaches.  Just last week a team at the University of Cambridge published a paper in Nature Astronomy about a new technique to look out so far that what you're seeing is only 378,000 years after the Big Bang.  (I know that may seem like a long time, but it's only 0.003% of the current age of the universe.)  The problem is that prior to this, the universe was filled with a fog of glowing hydrogen atoms, so it was close to opaque.  The new technique involves filtering out the "white noise" from the hydrogen haze, much the way you can still see the shadows and contours of the landscape on a foggy day.  It's not going to be easy; the signal emitted by the actual objects that were there in the early universe is estimated to be a hundred thousand times weaker than the interference from the glowing fog.

It's mind-blowing.  I've been learning about this stuff for years, but I'm still boggled by it.  If I think about it too hard I'm a little like the poor woman in a video with science vlogger Hank Green, who is trying to wrap her brain around the idea that anywhere you look, if you go out far enough, you're seeing the same point in space (i.e. all spots currently 13.8 billion light years from us were condensed into a single location at the moment of the Big Bang), and seems to be about to have a nervous breakdown from the implications.  (Hat tip to my friend, the amazing author Robert Chazz Chute, for throwing the video my way.)

So think about all this next time you're looking up into a clear night sky.  It's not a bad thing to be reminded periodically how small we are.  The universe is a grand, beautiful, amazing, weird place, and how fortunate we are to be living at a time when we are finally beginning to understand how it works.

****************************************