Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Monday, February 28, 2022

"Gotcha" proselytizing

A frequent reader and commenter on Skeptophilia sent me a note a few days ago, with a link and the cryptic comment, "Gordon, I think you need to take a look at this."  At first, I thought the link was to my own website -- but underneath the link was an explanation that the individual had discovered the link by accidentally mistyping the website address as skeptophilia.blogpsot.com.  (Bet it took you a while to figure out the misspelling, didn't it?  It did me.)

So, anyway, I clicked on the link, and was brought here.

To say that I found this a little alarming would be an understatement.  Had someone gone to the lengths of purchasing a website name one letter off from mine, to catch off guard the unwary (and possibly uneasy) skeptics and agnostics who thought they were going to visit a site devoted to rationalism?  I've been the target of negative comments before, from angry believers in everything from homeopathy to hauntings, and certainly have gotten my share of hate mail from the vehemently religious contingent who are bothered by the fact that I am an atheist who is completely, and confidently, "out," and am an unapologetic defender of the evolutionary model, Big Bang cosmology, and so on.  But this seemed kind of out there even for those folks.

Fortunately, my wife, who is blessed with a better-than-her-fair-share amount of common sense and a good grounding in technology, suggested that I try to type in SomethingElse.blogpsot.com.  So I did.  I first tried the address for a friend's blog, but put in the deliberate misspelling for "blogspot."  It brought me to the same place.  Then I tried "CreationismIsNonsense.blogpsot.com."  Same thing.

So apparently, the owner of this ultra-fundamentalist website, with its babble about the Rapture and Armageddon and the literal truth of the Bible, had just bought the domain name "blogpsot.com," so that any time anyone makes that particular misspelling in heading to their favorite blog, it takes them to that site.  I was relieved, actually; the thought that someone would go to all that trouble to target me in particular was a little unnerving.  (And evidently the fact that on the homepage of the "blogpsot" site, there is a link for "The World's Biggest Skeptic" is just a coincidence.)

However, you have to wonder if the person who owns the site really is laboring under the mistaken impression that this is an effective proselytizing tool.  Can you really imagine someone trying to check out the latest post on his/her favorite blog on, say, sewing, who lands here -- and then suddenly goes all glassy-eyed and says, "Good heavens.  I get it now.  The Bible is true, the Rapture is coming, and I'd better repent right now"?

No, neither can I.

And when you think about it, the door-to-door religion salesmen that periodically show up in our neighborhoods are the same kind of thing, aren't they?  A little less covert and sneaky, that's all.  But they're trying to accomplish the same goal -- catching you off guard, getting a foot in the door, spreading the message.

Of course, that approach sometimes backfires.  A couple of years ago a pair of missionaries (Jehovah's Witnesses, if I recall correctly) came up to my front door.  They were both women, the older maybe forty and the younger looking like she was in her twenties.  Both of them were immaculately attired in modest dresses and starched white blouses.  They didn't see that I was working in the garden; I was kind of hidden behind a bush I was pruning.  It was a blistering hot day, and when I heard the knock I walked over to them -- shirtless, covered in grime and sweat.  I acted completely nonchalant, but they were clearly uncomfortable.  The usual spiel was seriously truncated, and they made an excuse to leave after five minutes or so despite my rather over-the-top friendliness.

I gave them a big wave when they left and told them to drop in any time.

Never saw them again.  I guess God's only interested in converts who are clean and fully clothed.

Franciscan missionaries in California (woodcut from Zephyrin Engelhardt's Mission San Juan Capistrano: A Pocket Guide, 1922) [Image is in the Public Domain]

In any case, my previous comment about this sort of thing being an ineffective proselytizing tool is irrelevant, really.  It's like spam emails.  If you send out a million emails, and your success rate is 0.1%, you've still made money, because of the extremely low overhead.  Same here; you get volunteers (in the case of the door-to-door folks) or unsuspecting drop-ins (in the case of the website).  Most of the target individuals say no, or hit the "Back" button -- but the fraction of a percent that don't are your payoff.
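
Just to put some rough numbers on that, here's a quick sketch in Python, with figures I made up purely for illustration -- the point is the ratio, not the specifics:

# Back-of-the-envelope economics of mass, low-cost solicitation
# (all numbers below are invented for illustration)
emails_sent = 1_000_000
cost_per_email = 0.0001      # assumed cost in dollars; nearly free to send
success_rate = 0.001         # 0.1% of recipients "convert"
payoff_per_success = 50.00   # assumed value of each success, in dollars

total_cost = emails_sent * cost_per_email
total_gain = emails_sent * success_rate * payoff_per_success
print(f"Cost: ${total_cost:,.2f}  Gain: ${total_gain:,.2f}")
# Cost: $100.00  Gain: $50,000.00 -- a 99.9% failure rate still pays off

Change any of those assumptions and the details shift, but as long as the cost per contact stays near zero, the conclusion doesn't.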

The whole thing pisses me off, frankly, because it's so sneaky.  Even if it wasn't targeted at me specifically, it just seems like a skeevy way to get converts.  But to a lot of these folks, how you convert people is unimportant.  The essential thing is to convert them in the first place.  If you can grab people when their rational faculties are not expecting it, all the better -- because, after all, rationality is the last thing they want to engage.

**************************************

Saturday, February 26, 2022

Unity in diversity

It was in my evolutionary biology class in college that I ran into a concept that blew my mind, and in many ways still does.

It was the idea that race is primarily a cultural feature, not a biological or genetic one.  There is more genetic diversity amongst the people of sub-Saharan Africa -- people who many of us would lump together as "Black" -- than there is in the rest of the world combined.  A typical person of western European descent is, our professor told us, closer genetically to a person from Japan than a Tswana man is to the !Kung woman he lives right next door to in Botswana, even though both have dark skin and generally "African features."


To reiterate: I'm not saying race doesn't exist.  It certainly does, and the social, cultural, and political ramifications are abundantly clear.  It's just that what we often think of as race has very close to zero genetic support; we base our racial classifications on a handful of characteristics like skin and eye color, the shape of the nose and mouth, and the color and texture of the hair, all of which can so easily undergo convergent evolution that we end up lumping together very distantly-related groups and splitting ones that lie much closer together on the family tree.

The reason this comes up today is a couple of bits of recent research highlighting the fact that the subject is way more complicated than it seems at first.  The first looks at the fragmentation that happened in Africa, on the order of twenty thousand years ago, that resulted in the enormous genetic diversity still to be found in sub-Saharan Africa today.  By analyzing DNA from both living individuals and the remains of people from long ago, researchers at Harvard University found that this was about the time that our ancestors stopped (for the most part) making extended walkabouts to find mates, and settled into being homebodies.  What triggered this is a matter of conjecture; one possibility is that, since this was in the middle of the last ice age, the colder and drier conditions (even in equatorial regions) made food scarcer, so long trips into unknown territory were fraught with more danger than usual.

Whatever the cause, the isolation led to genetic drift.  A general rule of evolutionary biology is that if you prevent genetic mixing, populations will diverge because of the accrual of random mutations, and that seems to be what happened here.  The fact that a Tswana person and a !Kung person (to use my earlier example) are so distinct is because they've been genetically isolated for a very long time -- something facilitated by a tendency to stay at home and partner with the people you've known all your life.

Interestingly, some research last year suggested that there are "ghost lineages" in the human ancestry -- groups that are ancestral to at least some modern humans, but are as yet unidentified from the fossil record.  The one studied in last year's paper was ancestral to the Yoruba and Mende people of west Africa, in whom between two and nineteen percent of the genome comes from this ghost lineage -- but the phenomenon isn't limited to them.  The authors found analogous (but different) traces of ghost lineages in people of northern and western European and Han Chinese descent, and the guess is that all human groups have mysterious, unidentified ancestral groups.

The other bit of research that was published last week was an exhaustive study of the genetics of people around the world, with an ambitious goal -- coming up with a genetic family tree for every group of people on Earth. "We have basically built a huge family tree, a genealogy for all of humanity that models as exactly as we can the history that generated all the genetic variation we find in humans today," said Yan Wong of the University of Oxford, who co-authored the study.  "This genealogy allows us to see how every person's genetic sequence relates to every other, along all the points of the genome."

The researchers analyzed 3,609 individual DNA samples representing 215 different ethnic groups, and used software to compare various stretches of the DNA and assemble them using the technique called parsimony -- basically, creating a family tree that requires the fewest random coincidences and ad hoc assumptions.  The result was an enormous genealogy containing 27 million reconstructed common ancestors.  They then linked location data to the DNA samples -- and the program identified not only when the common ancestors probably lived, but where they lived.
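
If you're curious what "parsimony" actually looks like in practice, here's a minimal sketch in Python -- a toy version of the Fitch algorithm on made-up data, not the researchers' actual software or pipeline.  It counts the minimum number of mutations a proposed family tree needs to explain a single DNA position; the tree that needs the fewest wins.

# Minimal sketch of Fitch parsimony scoring: count the fewest mutations a tree
# requires to explain one DNA site.  Toy data, not the study's actual pipeline.

def fitch_score(tree, states):
    """tree: nested tuples of leaf names; states: leaf name -> observed base."""
    mutations = 0

    def walk(node):
        nonlocal mutations
        if isinstance(node, str):                 # leaf: its state is observed
            return {states[node]}
        left, right = (walk(child) for child in node)
        common = left & right
        if common:                                # ancestor can agree with both sides
            return common
        mutations += 1                            # otherwise, one mutation is needed
        return left | right

    walk(tree)
    return mutations

# Two hypotheses about how four (hypothetical) people are related:
observed = {"A": "G", "B": "G", "C": "T", "D": "T"}
tree1 = (("A", "B"), ("C", "D"))   # groups the G's together: 1 mutation
tree2 = (("A", "C"), ("B", "D"))   # splits them up: 2 mutations
print(fitch_score(tree1, observed), fitch_score(tree2, observed))
# Parsimony prefers tree1 -- the genealogy requiring the fewest coincidences.

The real analysis does this across millions of positions and thousands of genomes, but the underlying logic is the same: prefer the genealogy that demands the fewest coincidences and ad hoc assumptions.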

I find this absolutely amazing.  Using modern genetic analysis techniques, we can assemble our own family tree, with roots extending backwards tens of thousands of years and encompassing lineages for which we have no archaeological or paleontological records.  With the number of connections the research generated, I have no doubt we'll be studying it for years to come, and have only started to uncover the surprises it contains.

But all part of living up to the maxim inscribed in the Temple of Apollo at Delphi -- γνῶθι σεαυτόν.

"Know thyself."

**************************************

Friday, February 25, 2022

Out of sight, out of mind

Humans have amazingly short memories.

I suppose that there's at least some benefit to this.  Unpleasant events in our lives would be far, far worse if the distress we experienced over them was as fresh every single day as it was the moment it happened.  That's the horror of PTSD; the trauma gets locked in, triggered by anything that is even remotely similar, and is re-experienced over and over again.

So it's probably better that negative emotions lose their punch over time, that we simply don't remember a lot of what happens to us.  But even so, I kind of wish people would keep important stuff more in mind, so we don't repeat the same idiotic mistakes.  Santayana's quote has almost become a cliché -- "Those who don't remember the past are doomed to repeat it" -- but part of the saying's sticking power is its tragic accuracy.

The reason this comes up is because of some research out of Oxford University that appeared in the journal Trends in Ecology and Evolution this week.  A team led by Ivan Jarić looked at the phenomenon of extinction -- but framed it a bit differently than you may have seen it, and in doing so, turned the spotlight on our own unfortunate capacity for forgetting.

There are various kinds of extinction.  Extirpation is when a species is lost from a region, but still exists elsewhere; mountain lions, for example, used to live here in the northeastern United States, but were eradicated in the late nineteenth and early twentieth century (the last confirmed sighting was in Maine in 1938).  They're still holding their own in western North America, however.  Functional extinction is when the population is reduced so much that it either no longer has much impact on the ecosystem, or else would not survive in the wild without significant conservation measures, or both.  Sadly, the northern white rhinoceros, the northern right whale, and the south China tiger are all considered functionally extinct.

Extinct in the wild is exactly what it sounds like; relict populations may exist in captivity, but the species is gone from its original range.  Examples include the beautiful scimitar oryx, the Hawaiian crow, and the franklinia tree (collected in the Altamaha River basin in Georgia in 1803 and never seen in the wild since).  Such species may be reintroduced from captive breeding, but doing so tends to be difficult and expensive, and is often unsuccessful.

Then there's global extinction.  Gone forever.  There has been some talk about trying to resuscitate species for which we have remains that have intact DNA, Jurassic Park-style, but the hurdles to overcome before that could be a reality are enormous -- and there's an ongoing debate about the ethics of bringing back an extinct species into a changed modern world.

The new research, however, considers yet another form of extinction: societal extinction.  This occurs when a population is reduced to the point that people basically forget it ever existed.  It's amazing both how fast, and how completely, this can happen.  Consider two bird species from North America -- the passenger pigeon (Ectopistes migratorius) and the Carolina parakeet (Conuropsis carolinensis) -- both of which were common in the wild, and both of which went completely extinct, in 1914 and 1918 respectively.

Illustration of the passenger pigeon by naturalist Mark Catesby (1731) [Image is in the Public Domain]

Actually, "common" is a significant understatement.  Up until the mid-nineteenth century, passenger pigeons were the most common bird in North America, with an estimated population of five billion individuals.  Flocks were so huge that a single migratory group could take hours to pass overhead.  Carolina parakeets, though not quite that common, were abundant enough to earn the ire of fruit-growers because of their taste for ripe fruit of various kinds.  Both species were hunted to extinction, something that only fifty years earlier would have been considered inconceivable -- as absurd-sounding as if someone told you that fifty years from now, gray squirrels, robins, house sparrows, and white-tailed deer were going to be gone completely.

What is even more astounding, though, is how quickly those ubiquitous species were almost entirely forgotten.  In my biology classes, a few (very few) students had heard of passenger pigeons; just about no one knew that only 150 years ago, there was a species of parrot that lived from the Gulf of Mexico north to southern New England, and west into the eastern part of Colorado.  As a species, we're amazingly good at living the "out of sight, out of mind" principle.

The scariest part of this collective amnesia is that it makes us unaware of how much things have changed -- and are continuing to change.  Efforts to conserve the biodiversity we still have sometimes don't even get off the ground if, when the species is named, the average layperson just shrugs and says, "What's that?"  Consider the snail darter (Percina tanasi), a drab little fish found in freshwater streams in the eastern United States, that became the center of a firestorm of controversy when ecologists found that its survival was jeopardized by the Tellico Dam Hydroelectric Project.  No one but the zoologists seemed to be able to work up much sympathy for it -- it wasn't wiped out only because a population of the fish was moved to neighboring streams that weren't at risk from the dam, where it survived.  (It's currently considered "threatened but stable.")

"It is important to note that the majority of species actually cannot become societally extinct, simply because they never had a societal presence to begin with," said study lead author Ivan Jarić, in an interview with Science Daily.  "This is common in uncharismatic, small, cryptic, or inaccessible species, especially among invertebrates, plants, fungi and microorganisms -- many of which are not yet formally described by scientists or known by humankind.  Their declines and extinctions remain silent and unseen by the people and societies."

Which is honestly kind of terrifying.  It's bad enough to lose species that are, as it were, right in front of our eyes; how many more are we losing that are familiar names only to biologists, or aren't even yet known to science?  And keep in mind that little-known doesn't mean unimportant.  There are plenty of "uncharismatic, small, cryptic, or inaccessible species" that are pretty damn critical.  One group that springs to mind immediately is the mycorrhizae, underground fungi that form a symbiotic relationship with plant roots.  The relationship is mutually beneficial; the plant has its capacity to absorb minerals and water greatly increased, and the fungus gets a home and a source of food.  By some estimates, 95% of plant species have a mycorrhizal partner, and some -- notably orchids -- are completely dependent on it, and die if they are separated from their fungal symbiont.  Even plants that aren't entirely reliant on them benefit from the relationship; there is increasing evidence that adding mycorrhizal spores to an ordinary vegetable garden can decrease dependence on chemical fertilizers, improve drought resistance, and increase crop yield (some experiments have seen it as much as double).

Incredibly cool.  But not what most of us would consider "charismatic."  I doubt, for example, that micrographs of mycorrhizae will ever usurp the wolves and eagles and elephants on the pages of the calendars we hang on our walls.  I mean, I would buy one, but I suspect I'm in the minority.

What this highlights to me is that we need to fight this tendency to overlook or forget about the organisms in our world that aren't obvious -- the rare, the small, the hidden.  The fact that their plight is not as obvious as that of the whales and the elephants and the tigers doesn't mean they're unimportant.  We need to become conscious of what's around us, and committed to protecting it.  Another comparison that's become almost a cliché is likening biodiversity to a tapestry, but the symbolism is apt.

Pull out enough threads, and the entire thing comes to pieces.

**************************************

Thursday, February 24, 2022

Continental mashup

Today's topic comes to us not because it's some earth-shattering discovery that overturns what we've understood, but solely because it's really cool.

You probably know the general rule that isolated ecosystems -- islands, especially -- tend to evolve in their own direction, resulting in a flora and fauna that is completely unique.  Two of the most common places cited as illustrations of this general rule are Australia and Madagascar, home to two of the oddest collections of species on Earth.  Australia's species are so different from those of the (relatively) nearby biomes of southeast Asia that the contrast was noticed over 150 years ago by British naturalist Alfred Russel Wallace, and the boundary was named "Wallace's Line" in his honor.  It's an amazingly sharp edge.  Wallace's Line runs between Borneo (to the west) and Sulawesi (to the east), and between Bali (to the west) and Lombok (to the east); the distance between Bali and Lombok is only 35 kilometers, but their flora and fauna are so different it was apparent to Wallace immediately.  North and west of Wallace's Line, the animals and plants are the typical assemblage you see in all of southeast Asia.  South and east of it, you get the families you find in Australia.

Another case of this was the linkup of North and South America, forming the Isthmus of Panama three million years ago.  Prior to this, South America had been isolated for 150 million years, resulting in the evolution of a completely unique group of living things, including the giant ground sloth (Megatherium) and the armored-tank glyptodons.  When the connection formed, this allowed the North American carnivores (especially dogs, cats, and weasels) to migrate south.  Humans eventually followed.  The result -- extinction of most of the South American megafauna.  (One of the only species to make the return trip successfully is the armadillo.)

The research that brings this topic up is a study showing that this kind of thing has been going on throughout the Earth's history.  In this case, a team of paleontologists and geologists has shown that a similar scenario unfolded forty million years ago, during the Eocene Epoch, when three land masses collided -- what are now Europe and western Asia, and a low-lying island in between that has been named Balkanatolia (because what's left of it now forms the Balkans and Anatolia).  Here is the layout prior to the collision, and where those land masses are today:


So here, we have not two but three assemblages of species suddenly finding themselves being stirred together.  The result has been named the Grande Coupure (the "great break"), when most of the endemic European fauna -- groups like the paleotheres, distantly related to horses, and the European primate family Omomyidae -- vanished completely.  The winners were the ancestors of what you see now, pretty much across the entire region: canids, true perissodactyls and artiodactyls (for example, horses and pigs respectively), squirrels, hamsters, beavers, and hedgehogs.

Sometimes, who wins and who loses is due as much to luck as it is to fitness.  The species that became extinct in these continental fusion events were doing just fine before the land masses linked together; had North and South America not joined, we might well have giant ground sloths and glyptodons today.  (Of course, that kind of counterfactual speculation is probably pointless.  Any number of things besides predation by North American mammals could have led to the extinctions.  After all, there have been a lot of changes, climatic and otherwise, since then, and megafauna always seem to get hit hardest by rapidly-shifting conditions.)

But it's cool to find another example of this effect.  And it also gives us a hint of what's to come.  The Australian Plate is moving generally northward at about seven centimeters per year, and will inevitably collide with Asia in a hundred million years or so.  This fusion will erase Wallace's Line and allow for mixing of the two faunal and floral assemblages.
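
That timescale is easy to sanity-check with a couple of lines of Python; the seven-centimeters-per-year figure is the one above, and the rest is just unit conversion (the width of the ocean gap is my own rough estimate, not a measured figure):

# How far does a plate moving at 7 cm/year travel in 100 million years?
speed_cm_per_year = 7
years = 100_000_000
distance_km = speed_cm_per_year * years / 100_000   # 100,000 cm in a kilometer
print(f"{distance_km:,.0f} km")                      # 7,000 km

Seven thousand kilometers is comfortably more than the few thousand kilometers of ocean currently separating northern Australia from mainland southeast Asia.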

Who will win?  No way to tell, but considering how badass some of the Australian animals are (saltwater crocodiles, cassowaries, brown snakes and taipans, and funnel-web spiders, to name just a few), I'm putting my money on the Land Down Under.

**************************************

Wednesday, February 23, 2022

Yanking open the closet door

If you needed another reason to be outraged at the direction the United States is going, a bill currently moving through the Florida state legislature -- and 100% supported by Governor DeSantis -- would not only prohibit teachers from mentioning anything about sexual orientation (their own or anyone else's), but would require them to out LGBTQ students to their parents.

Further support of journalist Adam Serwer's statement that with the GOP, the cruelty is the point.

Nicknamed the "Don't Say Gay" bill, Florida's House Bill 1557 initially was intended to prevent any discussion of queerness in the classroom -- up to and including teachers revealing, even in passing, that they are queer themselves.  So this would, in effect, prevent a gay teacher (for example) from mentioning his partner's name, or even having a photograph of the two of them on his desk.  So what happens when he's seen holding hands with his partner in public, and a student asks him point-blank, "Are you gay?"  Is he supposed to say, "I can't answer that?"  Or "None of your business?"

Joe Harding, a Republican (surprise!) in the state House of Representatives, proposed an amendment to the bill on Friday that made it even worse.  If the bill passes -- and it looks like it will -- teachers who find out a student is LGBTQ will be required to tell the parents.  Schools would be compelled to "develop a plan, using all available governmental resources" to out children to their parents "through an open dialogue in a safe, supportive, and judgment-free environment that respects the parent-child relationship and protects the mental, emotional, and physical well-being of the student."

Originally there was a clause providing an exemption "if a reasonably prudent person would believe" that outing the student might cause "abuse, abandonment, or neglect," but Harding took that bit out.

The cruelty is the point.

I'm going to say this as plainly as I know how.  I doubt any Florida Republicans are listening, and even if they are, I doubt even more that they'd care, but despite that:

No one ever, ever, ever has the right to out a person to anyone, except the person him/herself.  Ever.

[Image licensed under the Creative Commons Benson Kua, Rainbow flag breeze, CC BY-SA 2.0]

While I often have wished that I'd had the courage to come out as bisexual much earlier in my life, I can't even imagine what my life would have been like if one of my high school teachers had outed me to my parents without my consent.  I wouldn't have been physically abused; neither of my parents ever laid a hand on me.  However, I was already enduring so much emotional abuse that now, almost fifty years later, I'm still dealing with the damage.  I shudder to think of what my life would have been like if my conservative, traditional Roman Catholic parents had found out I was bi when I figured it out myself at age fifteen.

Even without this, I was already told enough times what a crashing disappointment I was.  Add this on...  Well, to put things in perspective, as it was I attempted suicide twice, ages seventeen and twenty.  That I didn't succeed was honestly just dumb luck.

Had someone told my parents I was bi, I have little doubt that I wouldn't be here today.

Oh, and the clause that outs the kid in a "safe, supportive, and judgment-free environment that respects the parent-child relationship and protects the mental, emotional, and physical well-being of the student" is unadulterated bullshit.  I can vouch for this from my own experience.  No one -- no one -- knew about my suicide attempts.  Not family members, not friends, not teachers.  From the outside, my parents looked like they were straight out of The Brady Bunch.  My mom, especially, was very good at being a chameleon, and the way she treated me in public was 180 degrees from the way she treated me at home.  There is no way that anyone would have known that I wasn't in an environment that supported my mental, emotional, and physical well-being.

Once again, let me put this plainly: teachers don't know what students' home life is like.  Not even if they've met the parents, not even if they've talked to the student.  And I can say with complete assurance that if I were a teacher in Florida, they would have to fire me, because no way in hell would I comply with the proposed law.  Putting teachers -- even well-meaning ones -- in charge of revealing a student's sexual orientation isn't just irresponsible, it's actively dangerous.  Queer teenagers already have a four times higher risk of self-harm or suicide than straight teens do; this bill, if it passes, will make it much, much worse.

But I suspect that won't make a difference.

The cruelty is the point.

The only thing that might stop this is if people in Florida contact their representatives and senators and say, "No.  This is unacceptable."  It's all well and good to say, "The blood of every queer teen in Florida who comes to harm after this is on your hands," but by that time, it's too fucking late.  This bill needs to be stopped, and it needs to be stopped now.  Somehow, the most unfeeling, unkind, bigoted people have become the ones who are making the laws, and while there's no easy way to get them out of office until the next election, they sure as hell can get buried by angry letters and emails.

Please.  Do it now.

Lives are at stake, here.

**************************************

Tuesday, February 22, 2022

Splitting the difference

One of the most misunderstood aspects of the evolutionary model is that natural selection is almost always a compromise.

Very few changes that could occur in an organism's genes (and thus in its physical makeup) are unequivocally good.  (Plenty of them are unequivocally bad, of course.)  Take, for example, our upright posture, which is usually explained as having been selected for by (1) allowing us to see farther over tall grass and thus spot predators, (2) leaving our hands free for tool use, (3) making it easier to carry our offspring before they can walk on their own, or (4) all of the above.  At the same time, remodeling our spines to accommodate walking upright -- basically, taking a vertebral column that evolved in an animal that supported itself on all fours, and just kind of bending it upwards -- has given us a proneness to lower back injury unmatched in the natural world.  The weakening of the rotator cuff, due to the upper body no longer having to support part of our weight, has predisposed us to shoulder dislocations.

Then there are the bad changes that have beneficial features.  One common question I was asked when teaching evolutionary biology is this: if selection favors beneficial traits and weeds out maladaptive ones, why do negative traits hang around in populations?  One answer is that a lot of maladaptive gene changes are recessive -- you can carry them without showing an effect, and if you and your partner are both carriers, your child can inherit both copies (and thus the ill effect).  But it's even more interesting than that.  It was recently discovered that being a carrier for the gene for the devastating disease cystic fibrosis gives you resistance to one of the biggest killers of babies in places without medical care -- cholera.  It's well known that being heterozygous for the gene for sickle-cell anemia makes you resistant to malaria.  Weirdest of all, the (dominant) gene for the horrible neurodegenerative disorder Huntington's disease gives you an eighty percent lower likelihood of developing cancer -- offset, of course, by the fact that all it takes is one copy of the gene to doom you by age 55 or so to progressive debility, coma, and death.
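
If you want to see how that balance works quantitatively, here's a minimal sketch in Python of heterozygote advantage, using made-up fitness values rather than anything measured: even though the disease allele is devastating in a double dose, the carrier benefit keeps it at a stable intermediate frequency instead of letting selection weed it out.

# Minimal sketch of heterozygote advantage (overdominance).
# Fitness values are invented for illustration, not measured ones:
#   AA = non-carrier (no protection from the infection),
#   Aa = carrier (protected), aa = has the genetic disease itself.
w_AA, w_Aa, w_aa = 0.9, 1.0, 0.2

p = 0.99                      # starting frequency of the normal allele A
for _ in range(200):
    q = 1.0 - p
    mean_w = p*p*w_AA + 2*p*q*w_Aa + q*q*w_aa
    p = (p*p*w_AA + p*q*w_Aa) / mean_w     # standard one-locus selection recursion

print(f"Equilibrium frequency of the disease allele: {1.0 - p:.3f}")
# Settles near 0.111, matching the analytic balance point
# q* = s_AA / (s_AA + s_aa) = 0.1 / (0.1 + 0.8)

The take-home is that the harmful allele isn't hanging around by accident; under these conditions, selection itself maintains it.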

So the idea of "selective advantage" is more complex than it seems at first.  The simplest way to put it is that if an inheritable change on balance gives you a greater chance of survival and reproduction, it will be selected for even if it gives you disadvantages in other respects, even some serious ones.

The reason the topic comes up is because of a cool piece of research out of the University of California - Santa Barbara into a curious genetic change in the charming little Colorado blue columbine (Aquilegia caerulea), familiar to anyone who's spent much time in the Rocky Mountains.

Colorado blue columbine (Aquilegia caerulea) [Image licensed under the Creative Commons Rob Duval, Heavycolumbinebloom, CC BY-SA 3.0]

Both the common name and scientific name have to do with birds; columba is Latin for dove, aquila Latin for eagle.  The reason is the graceful, backwards-curved tubular petals, which (viewed from the side) look a little like a bird's foot.  The tubes end in nectar glands, and are there to lure in pollinators -- mostly hummingbirds and butterflies -- whose mouthparts can fit all the way down the long, narrow tubes.

Well, the researchers found that not all of them have these.  In fact, there's a group of them that don't have the central petals and nectar spurs at all.  The loss is due to a single gene, APETALA3-3, which simply halts complete flower development.  So far, nothing too odd; there are a lot of cases where some defective gene or another causes the individual to be missing a structure.  What is more puzzling is that in the study region (an alpine meadow in central Colorado), a quarter of the plants have the defective flowers.

You would think that a plant without its prime method of attracting pollinators would be at a serious disadvantage.  How could this gene be selected strongly enough to result in 25% of the plants having the change?  The answer turned out to be entirely unexpected.  The plants with the defective gene don't get visited by butterflies and hummingbirds as much -- but they are also, for some reason, much less attractive to herbivores, including aphids, caterpillars, rabbits, and deer.  So it may be that the flowers don't get pollinated as readily as those of their petal-ful kin, but they are much less likely to sustain energy-depleting damage to the plant itself (in the case of deer, sometimes chomping the entire plant down to ground level). 

If fewer flowers get pollinated, but the ones that do come from plants that are undamaged and vigorous and able to throw all their energy into seed production, on balance the trait is still advantageous.

Even cooler is that the two different morphs rely on different pollinators.  Species of butterfly with a shorter proboscis tend to favor the spurless variant, while the original spurred morph attracts butterflies and hummingbirds with the ability to reach all the way down into the spur.  What the researchers found is that there is much less cross-pollination between the two morphs than there is between plants of the same morph.

For speciation to occur, there need to be two things at work: (1) a genetic change that acts as a selecting mechanism, and (2) reproductive isolation between the two different morphs.  This trait checks both boxes.

So it looks like the Colorado blue columbine may be on the way to splitting into two species.

Once again, we have an example from the real world demonstrating the power and depth of the evolutionary model -- and one that's kind of hard to explain if you don't buy it.  This time, it's a pretty little flower that has vindicated Darwin, and shown that right in front of our eyes, evolution is still generating "endless forms most beautiful and most wonderful."

**************************************

Monday, February 21, 2022

The lenses of language

When we think of the word "endangered," usually what comes to mind isn't "languages," but there are a staggering number of languages for which the last native speakers will be gone in the next few decades.  Of the seven-thousand-odd languages currently spoken in the world, ten -- a little over a tenth of a percent of the total -- serve as the main languages of 4.9 billion people, about sixty percent of the Earth's population.

It's easy to see why biological diversity is critical to an ecosystem; species can evolve such narrow niches that if they become extinct, that niche vanishes, along with everything that depended on it.  It's a little harder to put a finger on why linguistic diversity is critical.  If some obscure language spoken in the Australian Outback disappears, who (other than linguists) should care?

I choose Australia deliberately.  Since the first major contact between indigenous Australians and Europeans, in 1788 when the "First Fleet" of convicts from England and Wales landed in what is now Sydney Harbor, over half of the 250 or so indigenous languages have vanished completely.  About 110 are still in use, primarily by the older generation, and only twenty are in common usage and still being learned by children as their first language.

Language is such an integral part of cultural identity that this is nothing short of tragic.  But the loss goes even deeper than that.  The language(s) we speak change the way we see the world.  Take, for example, the Guugu Yimithirr language, spoken in one small village in the far north of Queensland, which has 775 native speakers left.  This language has the odd characteristic -- shared, so far as I know, only with a handful of languages in Siberia -- of not having words for left, right, in front of, and behind.  The position of an object is always described in terms of the four cardinal directions.  Right now, for example, my laptop wouldn't be "in front of me;" it would be "southeast of me."

When the Guugu Yimithirr people first came into contact with English speakers, they at first were completely baffled by what left and right even meant.  When it finally sunk in what the English speakers were trying to explain, the Guugu Yimithirr thought it was hilarious.  "Everything in the world depends on the position of your body?" they said.  "And when you turn your body, the entire world changes shape?  What an arrogant people you must be."

Every language lost robs us of a unique lens through which to see the universe.

The reason this rather elegiac topic comes up is because of another place that is a hotspot for endangered languages -- South America.  Last week it was announced that Cristina Calderón, of Puerto Williams in southern Chile, died at the age of 93.  Calderón, known to locals as Abuela Cristina, was the last native speaker of Yaghan, an indigenous language of Tierra del Fuego.  Not only was Yaghan down to a single native speaker, but the language itself is a linguistic isolate -- a language that shows no relationship to any other known language.

So this isn't like losing a single species; it's like losing an entire family of species.

The government of Chile, in a well-meant but too-little-too-late effort, is funding the development of an educational curriculum in Yaghan, as well as a complete (or complete as it can be) Yaghan-Spanish dictionary.  The problem is -- as anyone who has learned a second language can attest -- there's a world of difference between second-language acquisition and learning your native language.  As Calderón put it, "I'm the last speaker of Yaghan.  Others can understand it but don't speak it or know it like I do."

As far as Yaghan's fascinating characteristics go, the one that jumps out at me is the presence of rich sound symbolism.  This isn't onomatopoeia (like the words bang and boom in English), but is when a phonemic feature tends to show up in words with similar meanings.  Sound symbolism of some sort seems to be pretty universal.  The most famous example is the "kiki-bouba effect," discovered in 1929 by psychologist Wolfgang Köhler.  Köhler made two simple drawings:


He then asked people of various linguistic and cultural backgrounds one question: a newly-discovered language has names for these two shapes.  One of them is called kiki and the other is called bouba.  Which is which?

Across the board, people identified the left-hand one as kiki and the right-hand one as bouba.  Something about the /k/ and /i/ phonemes in kiki was associated with sharpness and angularity (and negative or harsh concepts), and the /b/ and /u/ phonemes in bouba with softness and roundness (and positive or pleasant concepts).  It shows up in English in words like screech and scream and creep, and bubble and bless and billow -- but it's an effect that has shown up in just about every language where it's been tested.

In Yaghan, the sound symbolism is much richer.  It's usually connected with the beginnings or ends of the words -- words ending in /m/ often connote something rounded or soft (think of lump and bump in English), while /x/ at the end often connects with something dry or brittle.  An initial /tʃ/ (pronounced like the first sound in the word chip) is frequently associated with objects with spines or thorns or sharp edges.  And so on.

How does this shape how a native Yaghan speaker sees, understands, and classifies the world?

I know that language extinction isn't really preventable, at least not in the larger sense; languages have been splitting and evolving and going extinct for as long as our ancestors have had the capacity for speech.  But I can't help but feel that the primacy a handful of languages have achieved over the thousands of other ways to communicate is robbing us of some of the depth of the human experience.  Especially when you consider that a significant component of that primacy has been the determination by colonizers to eradicate the culture of indigenous groups and replace it with their own.

So in the grand scheme of things, it may not mean all that much that the last native speaker of Yaghan is gone.  But I still feel sad about it.  It's only by looking at the world through a new lens that we find out how limited our own view is -- and how much that view can expand when we look through a different one.

*********************************

In the long tradition of taking something that works and changing it so it doesn't work anymore, Amazon has seen fit to seriously complicate how content creators (i.e. people like me) incorporate affiliate links in their online content.  I'm trying to see if I can figure out how to get it to work, but until that happens, I am unfortunately going to suspend my Skeptophilia book-of-the-week feature.  If I can get it up and running again with the new system, I'll resume.  I'll keep you updated.


Saturday, February 19, 2022

Remembrance of things past

Like many People Of A Certain Age, I'm finding that my memory isn't what it used to be.

I walk into a room, and then say, "Why did I come in here?"  I'll think, "I don't need a grocery list, I'm just going for a few things," and come back with half of them.  We just had our dogs in for their annual checkups and shots, and there were a few things for each of them we wanted to ask the vet about.  My wife and I dutifully sat down and made a list -- and both of us forgot to put something on the list that we'd talked about only the previous day.

It's shown up, too, in more academic pursuits.  For my birthday last year my wife got me an online course through Udemy in beginning Japanese, a language I've always wanted to learn.  My dad had been stationed in Japan in the 1950s, and he learned enough of the language to get by; I grew up around the Japanese art and music my dad brought back with him, and became a Japanophile for life.  So I was thrilled to have the opportunity to study the country's unique and beautiful language.  The course starts out with a brief pronunciation guide, then launches into the hiragana -- one of three scripts used in written Japanese.  Each of the 46 characters stands for either a phoneme or a syllable, and some of them look quite a bit alike, so it's a lot to remember.  I have flash cards I made for all 46, and there are some I consistently miss, every single time I go through them.

When I flip the card over, my response is always, "Damn!  Of course!  Now I remember it!"  I recognize the character immediately, and can often even remember the mnemonic the teacher suggested to use in recalling it.  I'm getting there -- of the 46, there are about ten that I still struggle with -- but I know that twenty years ago, I'd have them all down cold by now.

Kids playing a memory game [Image is in the Public Domain]

Understandably, there's a nasty little thought in the back of my mind about senility and dementia.  My mother's sister had Alzheimer's -- to my knowledge, the only person in my extended family to suffer from that horrific and debilitating disease -- and I watched her slow slide from a smart, funny woman who could wipe the floor with me at Scrabble, did crossword puzzles in ink, and read voraciously, to a hollow, unresponsive shell.  I can think of no more terrifying fate. 

A new piece of research in Trends in Cognitive Sciences has to some extent put my mind at ease.  In "Cluttered Memory Representations Shape Cognition in Old Age," psychologists Tarek Amer (of Columbia University), Jordana Wynn (of Harvard University), and Lynn Hasher (of the University of Toronto) found that the forgetfulness a lot of us experience as we age isn't a simple loss of information; it's a loss of access to information that's still there, triggered by the clutter of memories from the past.

The authors write:
Wisdom and knowledge, cognitive functions that surely depend on being able to access and use memory, grow into old age.  Yet, the literature on memory shows that intentional, episodic memory declines with age.  How are we to account for this paradox?  To do so, we need to understand three aspects of memory differences associated with aging, two of which have received extensive investigation: age differences in memory encoding and in retrieval.  A third aspect, differences in the contents of memory representations, has received relatively little empirical attention.  Here, we argue that this aspect is central to a full understanding of age differences in memory and memory-related cognitive functions.  We propose that, relative to younger adults, healthy older adults (typically between 60 and 85 years of age) process and store too much information, the result of reductions in cognitive control or inhibitory mechanisms.  When efficient, these mechanisms enable a focus on target or goal-relevant information to the exclusion (or suppression) of irrelevant information.  Due to poor control (or reduced efficiency), the mnemonic representations of older adults can include: (i) recently activated but no-longer-relevant information; (ii) task-unrelated thoughts and/or prior knowledge elicited by the target information; and/or (iii) task-irrelevant information cued by the immediate environment.  This information is then automatically bound together with target information, creating cluttered memory representations that contain more information than do those of younger adults.

It's like trying to find something in a cluttered, disorganized attic.  Not only is it hard to locate what you're looking for, you get distracted by the other things you run across.  "Wow, it's been years since I've seen this!  I didn't even know this was up here!... wait, what am I looking for?"

I've noticed this exact problem in the kitchen.  I'm the chief cook in our family, and I love to make complex dinners with lots of ingredients.  I've found that unless I want to make a dozen trips to the fridge or cabinets to retrieve three items, I need to focus on one thing at a time.  Get a green pepper from the vegetable crisper.  Find the bottle of cooking sherry.  Go get the bottle of tabasco sauce from the table.  If I try to keep all three in my mind at once, I'm sure to return to the stove and think, "Okay, what the hell do I need, again?"

I wonder if this mental clutter is at the heart of my struggle with memorizing the hiragana characters in Japanese.  I've done at least a cursory study of about a dozen languages -- I'm truly fluent in only a couple, but my master's degree in historical linguistics required me to learn at least the rudiments of the languages whose history I was studying.  Could my difficulty in connecting the Japanese characters to the syllables they represent be because my Language Module is clogged with Old Norse and Welsh and Scottish Gaelic and Icelandic, and those all get in the way?

In any case, it's kind of a relief that I'm (probably) not suffering from early dementia.  It also gives me an excuse the next time my wife gets annoyed at me for forgetting something.  "I'm sorry, dear," I'll say.  "I'd have remembered it, but my brain is full.  But at least I remembered that the character yo looks like a yo-yo hanging from someone's finger!"

Nah, I doubt that'll work, and the fact that I remembered one of the Japanese characters instead of stopping by the store to pick up milk and eggs will only make it worse.  When I want to be sure not to forget something, I guess I'll have to keep making a list.

The only problem is that then I need to remember where I put the list.

***************************************

People made fun of Donald Rumsfeld for his statement that there are "known unknowns" -- things we know we don't know -- but also a far larger number of "unknown unknowns," which are all the things we aren't even aware that we don't know.

While he certainly could have phrased it a little more clearly -- and understand that I'm not in any way defending Donald Rumsfeld's other actions and statements -- he was right in this case.  It's profoundly humbling to find out how much we don't know, even about subjects in which we consider ourselves experts.  One of the most important things we need to do is to keep in mind not only that we might have things wrong, and that additional evidence may completely overturn what we thought we knew, but also that there are some things so far out of our ken that we may not even know they exist.

These ideas -- the perimeter of human knowledge, and the importance of being able to learn, relearn, change directions, and accept new information -- are the topic of psychologist Adam Grant's book Think Again: The Power of Knowing What You Don't Know.  In it, he explores not only how we are all riding around with blinders on, but how to take steps toward removing them, starting with not surrounding yourself with an echo chamber of like-minded people who might not even recognize that they have things wrong.  We should hold our own beliefs up to the light of scrutiny.  As Grant puts it, we should approach issues like scientists looking for the truth, not like a campaigning politician trying to convince an audience.

It's a book that challenges us to move past our stance of "clearly I'm right about this" to the more reasoned approach of "let me see if the evidence supports this."  In this era of media spin, fake news, and propaganda, it's a critical message -- and Think Again should be on everyone's to-read list.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]


Friday, February 18, 2022

Academic predators

Today's topic, which comes to me via a long-time loyal reader of Skeptophilia, has a funny side and a not-so-funny side.

The link my friend sent me was to a paper called "The Psychometric Measurement of God," by one George Hammond, M.S. Physics.  In it, he claims to have used the methods of physics to prove that God exists, which would be a pretty good feat.  So I eagerly read the paper, which turned out to be an enormous mélange of sciency-sounding terms, evidently using a template something like this: "(big word) (big word) (big word) God (big word) (big word) (big word) (big word) matrix (big word) (big word) scientific measurement (big word) (big word) (big word) God exists q.e.d."

Don't believe me? Here's a representative passage:
I had already published in 1994 a peer-reviewed paper in a prominent journal pointing out that there was a decussation in the Papez Loop in Jeffrey Gray’s fornical septo-hippocampal system indicating that it regulated not only Anxiety as he said it did, but in a diagonal mode of operational so regulated his Impulsivity dimension.  In the brain the septum is located dead center in the “X” formed by the fornix thus regulating information to and from all 8 cubic lobes of the brain via the fornical Papez Loop.  Since then the septal area is also dead center in Thurstone’s Box in the brain I eventually realized that Gray’s septo-hippocampal system controls all 13 personality dimensions of the Structural Model of Personality!...  Meanwhile, factorization of this 4 x 4 matrix yields one, single, final top 4th order eigenvector of Psychology.  What could this factor be?...  [T]he final top factor in Psychology is in fact the God of the Bible.  Since this is a scientific measurement, God can actually be measured to 2 decimal point accuracy.
Please note that I didn't select this passage because it sounds ridiculous; it all sounds like this.

Or maybe, with my mere B.S. in Physics, I'm just not smart enough to understand it.

The fact that this is a wee bit on the spurious side is accentuated by the various self-congratulatory statements scattered through it, like "this is nothing less than awesome!" and "if you think discovering the gods is an amazing scientific turn of events, brace yourself!" and "my personal scientific opinion as a graduate physicist is that the possibility [of my being correct] is better than 1 in 3."  Also, the inadvertently hilarious statement that "evolutionary biology discovered the 'airbag theory' millions of years before General Motors did" might clue you in to the possibility that this paper may not have been peer reviewed.

But so far, this is just some loony guy writing some loony stuff, which should come as no big surprise, because after all, that's what loony guys do.  And there's not much to be gained by simply poking fun at what, honestly, are low-hanging fruit.  But that brings us to the less-than-amusing part.

The site where this "paper" was published is academia.edu.  My general thought has been that most .edu sites are pretty reliable, but that may have to be revised.  "Academia" is not only not peer reviewed -- it's barely even moderated.  Literally anyone can publish almost anything.


Basically, it's not a site for valid scientific research; it's purely a money-making operation.  If you poke around on the site a little, you'll find you're quickly asked to sign up and give them your email, and badgered to subscribe (for a monthly fee, of course).  I probably don't need to say this, but do not give these people your email.  It turns out there's a page on Quora devoted to the topic of academia.edu, and the comments left by people who have actually interacted with them are nothing short of scathing.  Here's a sampler:
  • If you sign up, the people who upload the pdf files will give you exactly what it seemed like they would give you, a paper pdf that makes you sign up using another link, which is also fake!  If you ask to contact the person who wrote it, they will either ignore you or block you. Don’t sign up for Academia, because when you do they just take you to another link, which is ridiculous.  Academia is a public research company, they don’t review anything or enforce rules.
  • I found it very unsettling that the ONLY permission they ask for is to….VIEW AND DOWNLOAD YOUR CONTACTS!  That was a SERIOUS tip-off to me that something wasn’t right.
  • It’s a scam, they try every trick in the book to get you to sign up; according to them I must be one of the most famous people on the planet.
  • I hate this site.  Looks like scammy trash.  I tried to sign up and after receiving my e-mail (use an account you don’t care about), then it looks like I can only proceed if I sign up for a bulk download account, and that costs money.  Fuck 'em.
  • They are scammers trying to get your money.  They told me I was cited in more than 2k papers.  My name is not common and I don't participate in the academic world.
  • Be careful with this.  Academia.edu was flagged by gmail and seems to have full access to my Google Account, not even partial access.  Given some of the other privacy and IP considerations with sharing your content on this site I would steer clear of it in future regardless - it’s basically a LinkedIn with similar commercial ambitions to make VCs a ton of money so there are the common concerns of “you’re the product” and “your content is now their content”.  Regardless this level of access to gmail is unwarranted and an invasion of privacy and was not clearly disclosed when I signed up (quick sign up to download a document).
So, the sad truth is that just because a site has .edu in its address, it's not necessarily reliable.  I always say "check sources, then check them again," but this is becoming harder and harder with pay-to-play sites (often called "predatory journals") that will publish any damn thing people submit.  From what I found, it seems like academia.edu isn't exactly pay-to-play; there's apparently not a fee for uploading your paper, and the money they make is from people naïve enough to sign up for a subscription.  (Of course, I couldn't dig into their actual rules and policies, because then I would have had to sign up, and I'm damned if I'm letting them get anywhere near my email address, much less my money.)  Even so, what this means is that the papers you find there, like the one by the estimable Mr. Hammond (M.S. Physics), have not passed any kind of gatekeeper.  There may be legitimate papers on the site; it's possible some younger researchers, trying to establish their names in their fields, are lured in by the possibility of getting their work in print somewhere.  Those papers are probably okay.

But as Hammond's "(big word) (big word) (big word) I proved that God exists!  I'm awesome!" paper illustrates, it would be decidedly unwise to trust everything on their site.

So once again: check your sources.  Don't just do a search to find out if what you're looking into has been published somewhere; find out where it's been published, and by whom, and then see if you can find out whether the author and the publication are legitimate.

It may seem like a lot of work, but if you want to stem the rising tide of false claims circulating madly about -- and I hope we all do -- it's well worth the time.

***************************************

People made fun of Donald Rumsfeld for his statement that there are "known unknowns" -- things we know we don't know -- but also a far larger number of "unknown unknowns," which are all the things we aren't even aware that we don't know.

While he certainly could have phrased it more clearly (and understand that I'm in no way defending Donald Rumsfeld's other actions and statements), he was right in this case.  It's profoundly humbling to find out how much we don't know, even about subjects in which we consider ourselves experts.  One of the most important things we can do is keep in mind not only that we might have things wrong, and that additional evidence may completely overturn what we thought we knew, but also that there are some things so far out of our ken that we may not even know they exist.

These ideas -- the perimeter of human knowledge, and the importance of being able to learn, relearn, change directions, and accept new information -- are the topic of psychologist Adam Grant's book Think Again: The Power of Knowing What You Don't Know.  In it, he explores not only how we are all riding around with blinders on, but how to take steps toward removing them, starting with not surrounding yourself with an echo chamber of like-minded people who might not even recognize that they have things wrong.  We should hold our own beliefs up to the light of scrutiny.  As Grant puts it, we should approach issues like scientists looking for the truth, not like a campaigning politician trying to convince an audience.

It's a book that challenges us to move past our stance of "clearly I'm right about this" to the more reasoned approach of "let me see if the evidence supports this."  In this era of media spin, fake news, and propaganda, it's a critical message -- and Think Again should be on everyone's to-read list.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]


Thursday, February 17, 2022

Big geology

It's easy to get overwhelmed when you start looking into geology.

Both the size scale and the time scale are so immense that it's hard to wrap your brain around them.  Huge forces that have been at work for billions of years -- and will keep working for another billion.  Makes me feel awfully... insignificant.

The topic comes up because of three recent bits of research into just how powerful geological processes can be.  In the first, scientists were studying a crater field in Wyoming that dates to the Permian Period, around 280 million years ago (28 million years, give or take, before the biggest mass extinction the Earth has ever experienced).  The craters are between ten and seventy meters in diameter, and there are several dozen of them, all dating from right around the same time.  The thought was that they were created when an asteroid exploded in the upper atmosphere, raining debris of various sizes on the impact site.

The recent research, though, shows that what happened was even more dramatic.

"Many of the craters are clustered in groups and are aligned along rays," said Thomas Kenkmann of the University of Freiburg, who led the project.  "Furthermore, several craters are elliptical, allowing the reconstruction of the incoming paths of the impactors.  The reconstructed trajectories have a radial pattern.  The trajectories indicate a single source and show that the craters were formed by ejected blocks from a large primary crater."

So what appears to have happened is this.

A large meteorite hit the Earth -- triangulating from the pattern of impact craters, somewhere between 150 and 200 kilometers away -- and the blast flung pieces of rock (both from the meteorite and from the impact site) into the air, which then arced back down and struck at speeds estimated at up to a thousand meters per second.  The craters were formed by impacts from rocks between four and eight meters across, and the primary impact crater (which has not been found, but is probably buried under sediments somewhere near the Wyoming-Nebraska border) is thought to be fifty kilometers or more across.
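Just to make the method in that quote concrete, here's a minimal sketch (in Python, with invented numbers -- these are not the researchers' data or code) of how a radial pattern of crater trajectories points back to a single source: treat each crater plus its reconstructed trajectory as a ray, and find the point that comes closest to all the rays at once.

```python
# Hypothetical illustration of triangulating a primary impact site from
# secondary craters.  Crater positions and azimuths below are made up, but
# are chosen to converge near (-150, -100) km, i.e. a source ~180 km away.
import numpy as np

def triangulate_source(points, azimuths_deg):
    """Least-squares intersection of 2D rays (crater position + trajectory azimuth)."""
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for p, az in zip(points, azimuths_deg):
        d = np.array([np.cos(np.radians(az)), np.sin(np.radians(az))])
        P = np.eye(2) - np.outer(d, d)   # projector perpendicular to this ray
        A += P
        b += P @ np.asarray(p, dtype=float)
    return np.linalg.solve(A, b)         # point minimizing total squared distance to the rays

craters = [(0.0, 0.0), (15.0, 5.0), (8.0, -12.0), (20.0, 18.0)]   # km, arbitrary origin
azimuths = [213.7, 212.5, 209.1, 214.8]                           # degrees, back toward the source
print("estimated primary impact site (km):", triangulate_source(craters, azimuths))
```

With real data you'd weight each ray by how well its trajectory is constrained, but the principle -- a bundle of rays converging on one spot -- is what tells you the ejecta all came from a single primary crater.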

Imagine it.  A huge rock from space hits a spot two hundred kilometers from where you are, and five minutes later you're bombarded by boulders traveling at a kilometer per second. 

This is called "having a bad day."

[Image licensed under the Creative Commons State Farm, Asteroid falling to Earth, CC BY 2.0]

The second link was to research on the geology of Japan -- second only to Indonesia among the most dangerously active tectonic regions on Earth -- which showed the presence of a pluton (a large underground blob of rock different from the rocks surrounding it) sitting right near the Nankai Subduction Zone.  This pluton is so large that it actually deforms the crust, causing the bit above it to bulge and the bit below it to sag.  This creates cracks down which groundwater can seep.

And groundwater acts as a lubricant.  So this blob of rock is, apparently, acting as a focal point for enormous earthquakes.

The Kumano pluton (the red bulge in the middle of the image).  The Nankai Subduction Zone is immediately to the left.

Slipping in this subduction zone caused two earthquakes above magnitude 8, in 1944 and 1946.  Understanding the structure of this complex region might help predict when and where the next one will come.

If that doesn't make you feel small enough, the third piece of research was into the Missoula Megaflood -- a tremendous flood (thus the name) that occurred 18,000 years ago.

During the last ice age, a glacial ice dam formed in what is now the northern Idaho Rockies, and as the climate warmed, meltwater backed up behind it into an enormous lake -- called Lake Missoula -- that covered a good bit of what is now western Montana.  Further warming eventually caused the ice dam to collapse, and all that water drained out, sweeping across what is now eastern Washington and literally scouring the place down to bedrock.  You can still see the effects today; the area is called the "Channeled Scablands," and is made up of teardrop-shaped pockets of relatively intact topsoil surrounded by gullies floored with bare rock.  (If you've ever seen what a shallow stream does to a sandy beach as it flows into the sea, you can picture exactly what it looks like.)

The recent research has made the story even more interesting.  One thing a lot of laypeople have never heard of is the concept of isostasy -- that the tectonic plates, the chunks of the Earth's crust, are effectively floating on the denser, slowly flowing mantle beneath them, and the level at which they float depends on how heavy they are, just as loading a boat with heavy weights makes it float lower in the water.  Well, as the Cordilleran Ice Sheet melted, that weight was removed, and the flat piece of crust underneath it tilted upward on the eastern edge.

It's like having a full bowl of water on a table, and lifting one end of the table.  The bowl will dump over, spilling out the water, and it will flow downhill and run off the edge -- just as Lake Missoula did.

Interestingly, exactly the same thing is going on right now underneath Great Britain.  During the last ice age, Scotland was completely glaciated; southern England was not.  The melting of those glaciers has resulted in isostatic rebound, lifting the northern edge of the island by ten centimeters per century.  At the same time, the tilt is pushing southern England downward, and it's sinking at about five centimeters per century.  (Fortunately, there's no giant lake waiting to spill across the country.)
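The arithmetic behind the boat analogy is simple enough to sketch.  In the Airy model of isostasy, removing an ice load eventually lets the crust rise by the ice thickness times the ratio of ice density to mantle density.  Here's a minimal version; the densities are standard textbook values, and the ice thicknesses are round numbers I'm assuming purely for illustration, not figures from the studies above.

```python
# Airy isostasy: eventual uplift after an ice load is removed.
RHO_ICE = 917.0      # kg/m^3
RHO_MANTLE = 3300.0  # kg/m^3 (typical upper-mantle value)

def eventual_rebound_m(ice_thickness_m):
    """Total uplift once the crust returns to isostatic equilibrium."""
    return ice_thickness_m * RHO_ICE / RHO_MANTLE

for ice_m in (500, 1000, 1500):   # assumed ice-sheet thicknesses, in meters
    print(f"{ice_m} m of ice removed -> ~{eventual_rebound_m(ice_m):.0f} m of uplift")
# The mantle flows very slowly, so the adjustment plays out over thousands of
# years -- which is why Scotland is still rising (~10 cm per century) more
# than ten thousand years after its ice sheet melted.
```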

We humans get a bit cocky at times, don't we?  We're powerful, masters of the planet.  Well... not really.  We're dwarfed by structures and processes we're only beginning to understand.  Probably a good thing, that.  Arrogance never did anyone any favors.  There's nothing wrong with finding out we're not invincible -- and that there are a lot of things out there way, way bigger than we are, that don't give a rat's ass for our little concerns.

***************************************



Wednesday, February 16, 2022

Goldilocks next door

Springboarding off yesterday's post, about how easy it is to form organic compounds abiotically, today we have: our nearest neighbor might be a decent candidate for the search for extraterrestrial life.

At only 4.24 light years away, Proxima Centauri is the closest star to our own Sun.  It has captured the imagination ever since its proximity was discovered; if you'll recall, the intrepid Robinson family of Lost in Space was heading toward Alpha Centauri, the brightest star in this triple-star system, which is a little farther away (4.37 light years) but still more or less right next door, as these things go.

It was discovered in 2016 that Proxima Centauri has a planet in orbit around it -- and more exciting still, it's only a little larger than Earth (1.17 times Earth's mass, to be precise), and is in the star's "Goldilocks zone," where water can exist in liquid form.  The discovery of this exoplanet (Proxima Centauri b) was followed in 2020 by the discovery of Proxima Centauri c, thought to be a "mini-Neptune" at seven times Earth's mass, so probably not habitable by life as we know it.

And now, a paper in Nature has presented research indicating that Proxima Centauri has a third exoplanet -- somewhere between a quarter and three-quarters of the Earth's mass, and right in the middle of the Goldilocks zone as well.

"It is fascinating to know that our Sun’s nearest stellar neighbor is the host to three small planets," said Elisa Quintana, an astrophysicist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, who co-authored the paper.  "Their proximity make this a prime system for further study, to understand their nature and how they likely formed."

The newly-discovered planet was detected by observing shifts in the light spectrum emitted by the star as the planet's gravity tugged it back and forth -- shifts in wavelength as small as 10^-5 ångströms, roughly one hundred-thousandth of the diameter of a hydrogen atom.  The device that accomplished this is the Echelle Spectrograph for Rocky Exoplanets and Stable Spectroscopic Observations (ESPRESSO -- because you can't have an astronomical instrument without a clever acronym) at the European Southern Observatory's facility on Cerro Paranal, Chile.
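To get a feel for why a shift that small corresponds to a planet's tug on its star: the Doppler shift is just Δλ = λ × v/c, where v is the star's reflex velocity along our line of sight.  Here's a minimal sketch; the forty-centimeters-per-second wobble is an order-of-magnitude figure I'm assuming for a sub-Earth-mass planet around a red dwarf, not a number quoted above.

```python
# Doppler shift of a stellar absorption line for a given reflex velocity.
C = 2.998e8  # speed of light, m/s

def doppler_shift_angstroms(wavelength_angstroms, radial_velocity_m_s):
    return wavelength_angstroms * radial_velocity_m_s / C

# A visible-light line near 5500 angstroms and an assumed ~0.4 m/s wobble:
shift = doppler_shift_angstroms(5500.0, 0.40)
print(f"wavelength shift ~ {shift:.1e} angstroms")   # about 7e-6, i.e. order 10^-5
```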

"It’s showing that the nearest star probably has a very rich planetary system," said co-author Guillem Anglada-Escudé, of the Institute of Space Sciences in Barcelona.  "It always has a little bit of mystique, being the closest one."

What this brings home to me is how incredibly common planets in the Goldilocks zone must be.  It's estimated that around two percent of spectral class F, G, and K stars -- the ones most like the Sun -- have planets in the habitable zone.  If this estimate is accurate -- and if anything, most astrophysicists think it's on the conservative side -- that means there are something like five hundred million habitable planets in the Milky Way alone.
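Just to show where a number like that comes from, here's the back-of-the-envelope version.  Everything except the two-percent figure is a round number I'm assuming for illustration; star counts and spectral-type fractions vary quite a bit between surveys.

```python
# Rough count of potentially habitable planets in the Milky Way.
stars_in_milky_way = 2.0e11      # ~200 billion stars (assumed round number)
fraction_fgk = 0.12              # assumed fraction of F, G, and K stars
fraction_with_hz_planet = 0.02   # the two-percent estimate quoted above

estimate = stars_in_milky_way * fraction_fgk * fraction_with_hz_planet
print(f"~{estimate:.1e} potentially habitable planets")   # roughly 5e8
```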

Of course, "habitable" comes with several caveats.  Average temperature and proximity to the host star isn't the only thing that determines if a place is actually habitable.  Remember, for example, that Venus is technically in the Goldilocks zone, but because of its atmospheric composition it has a surface temperature hot enough to melt lead, and an atmosphere made mostly of carbon dioxide and sulfuric acid.  Being at the right distance to theoretically have liquid water doesn't mean it actually does.  Besides atmospheric composition, other things that could interfere with a planet having a clement climate are the eccentricity of the orbit (high eccentricity would result in wild temperature fluctuations between summer and winter), the planet being tidally locked (the same side always facing the star), and how stable the star itself is.  Some stars are prone to stellar storms that make the ones our Sun has seem like gentle breezes, and would irradiate the surface of any planets orbiting them in such a way as to damage or destroy anything unlucky enough to be exposed.

But still -- come back to the "life as we know it" part.  Yeah, a tidally-locked planet that gets fried by stellar storms would be uninhabitable for us, but perhaps there are life forms that evolved to avoid the dangers.  As I pointed out yesterday, the oxygen we depend on is actually a highly reactive toxin -- we use it to make our cellular respiration reactions highly efficient, but it's also destructive to tissues unless you have ways to mitigate the damage.  (Recall that burning is just rapid oxidation.)  My hunch -- and it is just a hunch -- is that just as we find life even in the most inhospitable places on Earth, it'll be pretty ubiquitous out in space.

After all, remember what we learned from Ian Malcolm in Jurassic Park:
"Life, uh... finds a way."


***************************************



Tuesday, February 15, 2022

The recipe for life

Back in my teaching days, I was all too aware of how hard it was to generate any kind of enthusiasm for the details of biology in a bunch of teenagers.  But there were a few guaranteed oh-wow moments -- and one that I always introduced by saying, "If this doesn't blow your mind, you're not paying attention."

What I was referring to was the Miller-Urey experiment.  This phenomenal piece of research was an attempt to see if it was possible to create organic compounds abiotically -- with clear implications for the origins of life.  Back in the early twentieth century, when people started to consider seriously the possibility that life started on Earth without the intervention of a deity, the obvious question was, "How?"  So they built apparatus to take collections of inorganic compounds surmised to be abundant on the early Earth, subjected them to various energy sources, and waited to see what happened.

What happened was that they basically created smog and dirty water.  No organic compounds.  In 1922, Soviet biochemist Alexander Oparin suggested that the problem might be the assumption that the Earth's atmosphere hadn't changed much -- and looking at (then) new information about the atmosphere of Jupiter, he proposed that perhaps the early Earth's atmosphere had no free oxygen.  In chemistry terms, it was a reducing atmosphere.  Oxygen, after all, is a highly reactive substance, good at tearing apart organic molecules.  (There's decent evidence that the pathways of aerobic cellular respiration originally evolved as a way of detoxifying oxygen, and only secondarily gained a role in increasing the efficiency of releasing the energy in food molecules.)

It wasn't until thirty years later that anyone tested Oparin's hunch.  Stanley Miller and Harold Urey, of the University of Chicago, created an apparatus made of sealed, interconnected glass globes, and filled them with their best guess at the gases present in the atmosphere of the early Earth -- carbon monoxide, methane, hydrogen sulfide, sulfur dioxide, water vapor, various nitrogen oxides, hydrogen cyanide (HCN), and so on.  No free (diatomic) oxygen.  They then introduced an energy source -- essentially, artificial lightning -- and sat back to wait.

No one expected fast results.  After all, the Earth had millions of years to generate enough organic compounds to (presumably) self-assemble into the earliest cells.  No one was more shocked than Miller and Urey when they came in the next day to find that the water in their apparatus had turned blood red.  Three days later, it was black, like crude oil.  At that point, they couldn't contain their curiosity, and opened it up to see what was there.

All twenty amino acids, plus several amino acids not typically found in living things on Earth.  Simple sugars.  Fatty acids.  Glycerol.  DNA and RNA nucleotides.  Basically, all the building blocks it takes to make a living organism.

In three days.

A scale model of the Miller-Urey apparatus, made for me by my son, who is a professional scientific glassblower

This glop, now nicknamed the "primordial soup," is thought to have filled the early oceans.  Imagine it -- you're standing on the shore of the Precambrian sea (wearing a breathing apparatus, of course).  On land is absolutely nothing alive -- a continent full of nothing but rock and sand.  In front of you is an ocean that appears to be composed of thick, dark oil.

It'd be hard to convince yourself this was actually Earth.

Since then, scientists have re-run the experiment hundreds of times, checking to see if perhaps Miller and Urey had just happened by luck on the exact right recipe, but it turns out this experiment is remarkably insensitive to initial conditions.  As long as you have three things -- (1) the right inorganic building blocks, (2) a source of energy, and (3) no free oxygen -- you can make as much of this rather unappealing soup as you want.

So, it turns out, generating biochemicals is a piece of cake.  And a piece of research from Friedrich Schiller University and the Max Planck Institute has shown that it's even easier than that -- the reactions that create amino acids can happen out in space.

"Water plays an important role in the conventional way in which peptides are created," said Serge Krasnokutski, who co-authored the paper.  "Our quantum chemical calculations have now shown that the amino acid glycine can be formed through a chemical precursor – called an amino ketene – combining with a water molecule.  Put simply: in this case, water must be added for the first reaction step, and water must be removed for the second...  [So] instead of taking the chemical detour in which amino acids are formed, we wanted to find out whether amino ketene molecules could not be formed instead and combine directly to form peptides.  And we did this under the conditions that prevail in cosmic molecular clouds, that is to say on dust particles in a vacuum, where the corresponding chemicals are present in abundance: carbon, ammonia, and carbon monoxide."

The more we look into this, the simpler it seems to be to generate the chemicals of life -- further elucidating how the first organisms formed on Earth, and (even more excitingly) suggesting that life might be common in the cosmos.  In fact, it may not even take an Earth-like planet to be a home for life; as long as a planet is in the "Goldilocks zone" (the distance from its parent star where water can exist in liquid form), getting from there to an organic-compound-rich environment may not be much of a hurdle.

That's still a long way from intelligent life, of course; chances are, the planets with extraterrestrial life mostly have much simpler organisms.  But how exciting is that?  Setting foot on a planet covered with life -- none of which has any common ancestry with terrestrial organisms.

I can think of very little that would be more thrilling than that.

***************************************
