Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Tuesday, February 22, 2022

Splitting the difference

One of the most misunderstood aspects of the evolutionary model is that natural selection is almost always a compromise.

Very few changes that could occur in an organism's genes (and thus in its physical makeup) are unequivocally good.  (Plenty of them are unequivocally bad, of course.)  Take, for example, our upright posture, which is usually explained as having been selected for by (1) allowing us to see farther over tall grass and thus spot predators, (2) leaving our hands free for tool use, (3) making it easier to carry our offspring before they can walk on their own, or (4) all of the above.  At the same time, remodeling our spines to accommodate walking upright -- basically, taking a vertebral column that evolved in an animal that supported itself on all fours, and just kind of bending it upwards -- has given us a proneness to lower back injury unmatched in the natural world.  The weakening of the rotator cuff, due to the upper body no longer having to support part of our weight, has predisposed us to shoulder dislocations.

Then there are the bad changes that have beneficial features.  One common question I was asked when teaching evolutionary biology is: if selection favors beneficial traits and weeds out maladaptive ones, why do negative traits hang around in populations?  One answer is that a lot of maladaptive gene changes are recessive -- you can carry them without showing an effect, and if you and your partner are both carriers, your child can inherit both copies (and thus the ill effect).  But it's even more interesting than that.  It was recently discovered that being a carrier for the gene for the devastating disease cystic fibrosis gives you resistance to one of the biggest killers of babies in places without medical care -- cholera.  It's well known that being heterozygous for the gene for sickle-cell anemia makes you resistant to malaria.  Weirdest of all, the (dominant) gene for the horrible neurodegenerative disorder Huntington's disease gives you an eighty percent lower likelihood of developing cancer -- offset, of course, by the fact that all it takes is one copy of the gene to doom you by age 55 or so to progressive debility, coma, and death.
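The sickle-cell and cystic fibrosis examples are what population geneticists call heterozygote advantage, and that effect alone is enough to keep a harmful allele in a population indefinitely.  Here's a minimal sketch of the standard calculation -- the selection coefficients are hypothetical round numbers, not measured values for either disease:

```python
# Minimal sketch of heterozygote advantage (overdominance).
# Relative fitnesses: AA = 1 - s (non-carrier, vulnerable to the infection),
# Aa = 1 (carrier, protected), aa = 1 - t (two copies: the disease itself).
# The disease allele settles at an equilibrium frequency q* = s / (s + t).

def equilibrium_freq(s, t):
    """Equilibrium frequency of the disease allele under overdominance."""
    return s / (s + t)

# Hypothetical selection coefficients, chosen purely for illustration:
s = 0.15   # fitness cost of being a non-carrier where the infection is common
t = 0.80   # fitness cost of inheriting two copies of the allele

q = equilibrium_freq(s, t)
print(f"Equilibrium disease-allele frequency: {q:.2f}")
print(f"Expected fraction of carriers (2pq):  {2 * (1 - q) * q:.2f}")
```

With those made-up numbers, the disease allele settles at a frequency around sixteen percent, with roughly a quarter of the population as carriers -- the point being that selection itself holds the allele in place, not just mutation or luck.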

So the idea of "selective advantage" is more complex than it seems at first.  The simplest way to put it is that if an inheritable change on balance gives you a greater chance of survival and reproduction, it will be selected for even if it gives you disadvantages in other respects, even some serious ones.

The topic comes up because of a cool piece of research out of the University of California, Santa Barbara into a curious genetic change in the charming little Colorado blue columbine (Aquilegia caerulea), familiar to anyone who's spent much time in the Rocky Mountains.

Colorado blue columbine (Aquilegia caerulea) [Image licensed under the Creative Commons Rob Duval, Heavycolumbinebloom, CC BY-SA 3.0]

Both the common name and scientific name have to do with birds; columba is Latin for dove, aquila Latin for eagle.  The reason is the graceful, backwards-curved tubular petals, which (viewed from the side) look a little like a bird's foot.  The tubes end in nectar glands, and are there to lure in pollinators -- mostly hummingbirds and butterflies -- whose mouthparts can fit all the way down the long, narrow tubes.

Well, the researchers found that not all of them have these.  In fact, there's a group of them that don't have the central petals and nectar spurs at all.  The loss is due to a single gene, APETALA3-3, which simply halts complete flower development.  So far, nothing too odd; there are a lot of cases where some defective gene or another causes the individual to be missing a structure.  What is more puzzling is that in the study region (an alpine meadow in central Colorado), a quarter of the plants have the defective flowers.

You would think that a plant without its prime method of attracting pollinators would be at a serious disadvantage.  How could this gene be selected strongly enough to result in 25% of the plants having the change?  The answer turned out to be entirely unexpected.  The plants with the defective gene don't get visited by butterflies and hummingbirds as much -- but they are also, for some reason, much less attractive to herbivores, including aphids, caterpillars, rabbits, and deer.  So it may be that the flowers don't get pollinated as readily as those of their petal-ful kin, but they are much less likely to sustain energy-depleting damage to the plant itself (in the case of deer, sometimes chomping the entire plant down to ground level). 

If fewer flowers get pollinated, but the ones that do come from plants that are undamaged and vigorous and able to throw all their energy into seed production, on balance the trait is still advantageous.
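To make that "on balance" arithmetic concrete, here's a toy expected-seed-output comparison between the two morphs.  The pollination rates, herbivory rates, and damage penalty below are invented for illustration; they're not numbers from the study:

```python
# Toy expected-seed-output comparison between the two columbine morphs.
# Every number here is invented purely to illustrate the tradeoff.

def expected_seeds(pollination_rate, seeds_if_undamaged, herbivory_rate, damage_penalty):
    """Average seed output: chance of pollination times average seed production,
    where herbivore damage cuts seed production by damage_penalty."""
    seeds_if_damaged = seeds_if_undamaged * (1 - damage_penalty)
    avg_seeds = ((1 - herbivory_rate) * seeds_if_undamaged
                 + herbivory_rate * seeds_if_damaged)
    return pollination_rate * avg_seeds

spurred = expected_seeds(pollination_rate=0.9, seeds_if_undamaged=100,
                         herbivory_rate=0.6, damage_penalty=0.8)
spurless = expected_seeds(pollination_rate=0.6, seeds_if_undamaged=100,
                          herbivory_rate=0.2, damage_penalty=0.8)

print(f"Spurred morph:  ~{spurred:.0f} seeds on average")
print(f"Spurless morph: ~{spurless:.0f} seeds on average")
```

With those made-up numbers the spurless morph comes out slightly ahead (about fifty seeds to forty-seven) even though it gets pollinated less often -- which is the shape of the argument, if not its actual magnitudes.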

Even cooler is that the two different morphs rely on different pollinators.  Species of butterfly with a shorter proboscis tend to favor the spurless variant, while the original spurred morph attracts butterflies and hummingbirds with the ability to reach all the way down into the spur.  What the researchers found is that there is much less cross-pollination between the two morphs than there is between plants of the same morph.

For speciation to occur, there need to be two things at work: (1) a genetic change that acts as a selecting mechanism, and (2) reproductive isolation between the two different morphs.  This trait checks both boxes.

So it looks like the Colorado blue columbine may be on the way to splitting into two species.

Once again, we have an example from the real world demonstrating the power and depth of the evolutionary model -- and one that's kind of hard to explain if you don't buy it.  This time, it's a pretty little flower that has vindicated Darwin, and shown that right in front of our eyes, evolution is still creating "endless forms most beautiful and most wonderful."

**************************************

Monday, February 21, 2022

The lenses of language

When we think of the word "endangered," usually what comes to mind isn't "languages," but there are a staggering number of languages for which the last native speakers will be gone in the next few decades.  Of the seven-thousand-odd languages currently spoken in the world, just ten -- a little over a tenth of a percent of the total -- are the main language of 4.9 billion people, about sixty percent of the Earth's population.

It's easy to see why biological diversity is critical to an ecosystem; species can evolve such narrow niches that if they become extinct, that niche vanishes, along with everything that depended on it.  It's a little harder to put a finger on why linguistic diversity is critical.  If some obscure language spoken in the Australian Outback disappears, who (other than linguists) should care?

I choose Australia deliberately.  Since the first major contact between indigenous Australians and Europeans, in 1788 when the "First Fleet" of convicts from England and Wales landed in what is now Sydney Harbor, over half of the 250 or so indigenous languages have vanished completely.  About 110 are still in use, primarily by the older generation, and only twenty are in common usage and still being learned by children as their first language.

Language is such an integral part of cultural identity that this is nothing short of tragic.  But the loss goes even deeper than that.  The language(s) we speak change the way we see the world.  Take, for example, the Guugu Yimithirr language, spoken in one small village in the far north of Queensland, which has 775 native speakers left.  This language has the odd characteristic -- shared, so far as I know, only with a handful of languages in Siberia -- of not having words for left, right, in front of, and behind.  The position of an object is always described in terms of the four cardinal directions.  Right now, for example, my laptop wouldn't be "in front of me;" it would be "southeast of me."

When the Guugu Yimithirr people first came into contact with English speakers, they at first were completely baffled by what left and right even meant.  When it finally sunk in what the English speakers were trying to explain, the Guugu Yimithirr thought it was hilarious.  "Everything in the world depends on the position of your body?" they said.  "And when you turn your body, the entire world changes shape?  What an arrogant people you must be."

Every language lost robs us of a unique lens through which to see the universe.

This rather elegiac topic comes up because of another hotspot for endangered languages -- South America.  Last week it was announced that Cristina Calderón, of Puerto Williams in southern Chile, died at the age of 93.  Calderón, known to locals as Abuela Cristina, was the last native speaker of Yaghan, an indigenous language in Tierra del Fuego.  Not only was Yaghan down to a single native speaker, the language itself is a linguistic isolate -- a language that shows no known relationship to any other language.

So this isn't like losing a single species; it's like losing an entire family of species.

The government of Chile, in a well-meant but too-little-too-late effort, is funding the development of an educational curriculum in Yaghan, as well as a complete (or complete as it can be) Yaghan-Spanish dictionary.  The problem is -- as anyone who has learned a second language can attest -- there's a world of difference between second-language acquisition and learning your native language.  As Calderón put it, "I'm the last speaker of Yaghan.  Others can understand it but don't speak it or know it like I do."

As far as Yaghan's fascinating characteristics go, the one that jumps out at me is its rich sound symbolism.  This isn't onomatopoeia (like the words bang and boom in English), but a tendency for a particular phonemic feature to show up in words with similar meanings.  Sound symbolism of some sort seems to be pretty universal.  The most famous example is the "kiki-bouba effect," discovered in 1929 by the psychologist Wolfgang Köhler.  Köhler made two simple drawings:

[Image: Köhler's two drawings -- a spiky, angular shape on the left and a rounded, blobby shape on the right]
He then asked people of various linguistic and cultural backgrounds one question: a newly-discovered language has names for these two shapes.  One of them is called kiki and the other is called bouba.  Which is which?

Across the board, people identified the left-hand one as kiki and the right-hand one as bouba.  Something about the /k/ and /i/ phonemes in kiki was associated with sharpness and angularity (and negative or harsh concepts), and the /b/ and /u/ phonemes in bouba with softness and roundness (and positive or pleasant concepts).  It shows up in English in words like screech and scream and creep, and bubble and bless and billow -- but it's an effect that has shown up in just about every language where it's been tested.

In Yaghan, the sound symbolism is much richer.  It's usually connected with the beginnings or ends of the words -- words ending in /m/ often connote something rounded or soft (think of lump and bump in English), while /x/ at the end often connects with something dry or brittle.  An initial /tʃ/ (pronounced like the first sound in the word chip) is frequently associated with objects with spines or thorns or sharp edges.  And so on.

How does this shape how a native Yaghan speaker sees, understands, and classifies the world?

I know that language extinction isn't really preventable, at least not in the larger sense; languages have been splitting and evolving and going extinct for as long as our ancestors have had the capacity for speech.  But I can't help but feel that the primacy a handful of languages have achieved over the thousands of other ways to communicate is robbing us of some of the depth of the human experience.  Especially when you consider that a significant component of that primacy has been the determination by colonizers to eradicate the culture of indigenous groups and replace it with their own.

So in the grand scheme of things, it may not mean all that much that the last native speaker of Yaghan is gone.  But I still feel sad about it.  It's only by looking at the world through a new lens that we find out how limited our own view is -- and how much that view can expand by observing our knowledge through a different one.

*********************************

In the long tradition of taking something that works and changing it so it doesn't work anymore, Amazon has seen fit to seriously complicate how content creators (i.e. people like me) incorporate affiliate links in their online content.  I'm trying to see if I can figure out how to get it to work, but until that happens, I am unfortunately going to suspend my Skeptophilia book-of-the-week feature.  If I can get it up and running again with the new system, I'll resume.  I'll keep you updated.


Saturday, February 19, 2022

Remembrance of things past

Like many People Of A Certain Age, I'm finding that my memory isn't what it used to be.

I walk into a room, and then say, "Why did I come in here?"  I'll think, "I don't need a grocery list, I'm just going for a few things," and come back with half of them.  We just had our dogs in for their annual checkups and shots, and there were a few things for each of them we wanted to ask the vet about.  My wife and I dutifully sat down and made a list -- and both of us forgot to put something on the list that we'd talked about only the previous day.

It's shown up, too, in more academic pursuits.  For my birthday last year my wife got me an online course through Udemy in beginning Japanese, a language I've always wanted to learn.  My dad had been stationed in Japan in the 1950s, and he learned enough of the language to get by; I grew up around the Japanese art and music my dad brought back with him, and became a Japanophile for life.  So I was thrilled to have the opportunity to study the country's unique and beautiful language.  The course starts out with a brief pronunciation guide, then launches into the hiragana -- one of three scripts used in written Japanese.  Each of the 46 characters stands for either a phoneme or a syllable, and some of them look quite a bit alike, so it's a lot to remember.  I have flash cards I made for all 46, and there are some I consistently miss, every single time I go through them.

When I flip the card over, my response is always, "Damn!  Of course!  Now I remember it!"  I recognize the character immediately, and can often even remember the mnemonic the teacher suggested to use in recalling it.  I'm getting there -- of the 46, there are about ten that I still struggle with -- but I know that twenty years ago, I'd have them all down cold by now.

Kids playing a memory game [Image is in the Public Domain]

Understandably, there's a nasty little thought in the back of my mind about senility and dementia.  My mother's sister had Alzheimer's -- to my knowledge, the only person in my extended family to suffer from that horrific and debilitating disease -- and I watched her slow slide from a smart, funny woman who could wipe the floor with me at Scrabble, did crossword puzzles in ink, and read voraciously, to a hollow, unresponsive shell.  I can think of no more terrifying fate. 

A new piece of research in Trends in Cognitive Sciences has to some extent put my mind at ease.  In "Cluttered Memory Representations Shape Cognition in Old Age," psychologists Tarek Amer (of Columbia University), Jordana Wynn (of Harvard University), and Lynn Hasher (of the University of Toronto) found that the forgetfulness a lot of us experience as we age isn't a simple loss of information; it's a loss of access to information that's still there, triggered by the clutter of memories from the past.

The authors write:
Wisdom and knowledge, cognitive functions that surely depend on being able to access and use memory, grow into old age.  Yet, the literature on memory shows that intentional, episodic memory declines with age.  How are we to account for this paradox?  To do so, we need to understand three aspects of memory differences associated with aging, two of which have received extensive investigation: age differences in memory encoding and in retrieval.  A third aspect, differences in the contents of memory representations, has received relatively little empirical attention.  Here, we argue that this aspect is central to a full understanding of age differences in memory and memory-related cognitive functions.  We propose that, relative to younger adults, healthy older adults (typically between 60 and 85 years of age) process and store too much information, the result of reductions in cognitive control or inhibitory mechanisms.  When efficient, these mechanisms enable a focus on target or goal-relevant information to the exclusion (or suppression) of irrelevant information.  Due to poor control (or reduced efficiency), the mnemonic representations of older adults can include: (i) recently activated but no-longer-relevant information; (ii) task-unrelated thoughts and/or prior knowledge elicited by the target information; and/or (iii) task-irrelevant information cued by the immediate environment.  This information is then automatically bound together with target information, creating cluttered memory representations that contain more information than do those of younger adults.

It's like trying to find something in a cluttered, disorganized attic.  Not only is it hard to locate what you're looking for, you get distracted by the other things you run across.  "Wow, it's been years since I've seen this!  I didn't even know this was up here!... wait, what am I looking for?"

I've noticed this exact problem in the kitchen.  I'm the chief cook in our family, and I love to make complex dinners with lots of ingredients.  I've found that unless I want to make a dozen trips to the fridge or cabinets to retrieve three items, I need to focus on one thing at a time.  Get a green pepper from the vegetable crisper.  Find the bottle of cooking sherry.  Go get the bottle of tabasco sauce from the table.  If I try to keep all three in my mind at once, I'm sure to return to the stove and think, "Okay, what the hell do I need, again?"

I wonder if this mental clutter is at the heart of my struggle with memorizing the hiragana characters in Japanese.  I've done at least a cursory study of about a dozen languages -- I'm truly fluent in only a couple, but my master's degree in historical linguistics required me to learn at least the rudiments of the languages whose history I was studying.  Could my difficulty in connecting the Japanese characters to the syllables they represent be because my Language Module is clogged with Old Norse and Welsh and Scottish Gaelic and Icelandic, and those all get in the way?

In any case, it's kind of a relief that I'm (probably) not suffering from early dementia.  It also gives me an excuse the next time my wife gets annoyed at me for forgetting something.  "I'm sorry, dear," I'll say.  "I'd have remembered it, but my brain is full.  But at least I remembered that the character yo looks like a yo-yo hanging from someone's finger!"

Nah, I doubt that'll work, and the fact that I remembered one of the Japanese characters instead of stopping by the store to pick up milk and eggs will only make it worse.  When I want to be sure not to forget something, I guess I'll have to keep making a list.

The only problem is that then I need to remember where I put the list.

***************************************

People made fun of Donald Rumsfeld for his statement that there are "known unknowns" -- things we know we don't know -- but also a far larger number of "unknown unknowns," which are all the things we aren't even aware that we don't know.

While he certainly could have phrased it a little more clearly -- and understand that I'm not in any way defending Rumsfeld's other actions and statements -- he was right in this case.  It's profoundly humbling to find out how much we don't know, even about subjects in which we consider ourselves experts.  One of the most important things we need to do is keep in mind not only that we might have things wrong, and that additional evidence may completely overturn what we thought we knew, but also that there are some things so far out of our ken that we may not even know they exist.

These ideas -- the perimeter of human knowledge, and the importance of being able to learn, relearn, change directions, and accept new information -- are the topic of psychologist Adam Grant's book Think Again: The Power of Knowing What You Don't Know.  In it, he explores not only how we are all riding around with blinders on, but how to take steps toward removing them, starting with not surrounding yourself with an echo chamber of like-minded people who might not even recognize that they have things wrong.  We should hold our own beliefs up to the light of scrutiny.  As Grant puts it, we should approach issues like scientists looking for the truth, not like a campaigning politician trying to convince an audience.

It's a book that challenges us to move past our stance of "clearly I'm right about this" to the more reasoned approach of "let me see if the evidence supports this."  In this era of media spin, fake news, and propaganda, it's a critical message -- and Think Again should be on everyone's to-read list.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]


Friday, February 18, 2022

Academic predators

Today's topic, which comes to me via a long-time loyal reader of Skeptophilia, has a funny side and a not-so-funny side.

The link my friend sent me was to a paper called "The Psychometric Measurement of God," by one George Hammond, M.S. Physics.  In it, he claims to have used the methods of physics to prove that God exists, which would be a pretty good feat.  So I eagerly read the paper, which turned out to be an enormous mélange of sciency-sounding terms, evidently using a template something like this: "(big word) (big word) (big word) God (big word) (big word) (big word) (big word) matrix (big word) (big word) scientific measurement (big word) (big word) (big word) God exists q.e.d."

Don't believe me? Here's a representative passage:
I had already published in 1994 a peer-reviewed paper in a prominent journal pointing out that there was a decussation in the Papez Loop in Jeffrey Gray’s fornical septo-hippocampal system indicating that it regulated not only Anxiety as he said it did, but in a diagonal mode of operational so regulated his Impulsivity dimension.  In the brain the septum is located dead center in the “X” formed by the fornix thus regulating information to and from all 8 cubic lobes of the brain via the fornical Papez Loop.  Since then the septal area is also dead center in Thurstone’s Box in the brain I eventually realized that Gray’s septo-hippocampal system controls all 13 personality dimensions of the Structural Model of Personality!...  Meanwhile, factorization of this 4 x 4 matrix yields one, single, final top 4th order eigenvector of Psychology.  What could this factor be?...  [T]he final top factor in Psychology is in fact the God of the Bible.  Since this is a scientific measurement, God can actually be measured to 2 decimal point accuracy.
Please note that I didn't select this passage because it sounds ridiculous; it all sounds like this.

Or maybe, with my mere B.S. in Physics, I'm just not smart enough to understand it.

The fact that this is a wee bit on the spurious side is accentuated by the various self-congratulatory statements scattered through it, like "this is nothing less than awesome!" and "if you think discovering the gods is an amazing scientific turn of events, brace yourself!" and "my personal scientific opinion as a graduate physicist is that the possibility [of my being correct] is better than 1 in 3."  Also, the inadvertently hilarious statement that "evolutionary biology discovered the 'airbag theory' millions of years before General Motors did" might clue you in to the possibility that this paper may not have been peer reviewed.

But so far, this is just some loony guy writing some loony stuff, which should come as no big surprise, because after all, that's what loony guys do.  And there's not much to be gained by simply poking fun at what, honestly, are low-hanging fruit.  But that brings us to the less-than-amusing part.

The site where this "paper" was published is academia.edu.  My general thought has been that most .edu sites are pretty reliable, but that may have to be revised.  "Academia" is not only not peer reviewed -- it's barely even moderated.  Literally anyone can publish almost anything.


Basically, it's not a site for valid scientific research; it's purely a money-making operation.  If you poke around on the site a little, you'll find you're quickly asked to sign up and give them your email, and badgered to subscribe (for a monthly fee, of course).  I probably don't need to say this, but do not give these people your email.  It turns out there's a page on Quora devoted to the topic of academia.edu, and the comments left by people who have actually interacted with them are nothing short of scathing.  Here's a sampler:
  • If you sign up, the people who upload the pdf files will give you exactly what it seemed like they would give you, a paper pdf that makes you sign up using another link, which is also fake!  If you ask to contact the person who wrote it, they will either ignore you or block you. Don’t sign up for Academia, because when you do they just take you to another link, which is ridiculous.  Academia is a public research company, they don’t review anything or enforce rules.
  • I found it very unsettling that the ONLY permission they ask for is to….VIEW AND DOWNLOAD YOUR CONTACTS!  That was a SERIOUS tip-off to me that something wasn’t right.
  • It’s a scam, they try every trick in the book to get you to sign up; according to them I must be one of the most famous people on the planet.
  • I hate this site.  Looks like scammy trash.  I tried to sign up and after receiving my e-mail (use an account you don’t care about), then it looks like I can only proceed if I sign up for a bulk download account, and that costs money.  Fuck 'em.
  • They are scammers trying to get your money.  They told me I was cited in more than 2k papers.  My name is not common and I don't participate in the academic world.
  • Be careful with this.  Academia.edu was flagged by gmail and seems to have full access to my Google Account, not even partial access.  Given some of the other privacy and IP considerations with sharing your content on this site I would steer clear of it in future regardless - it’s basically a LinkedIn with similar commercial ambitions to make VCs a ton of money so there are the common concerns of “you’re the product” and “your content is now their content”.  Regardless this level of access to gmail is unwarranted and an invasion of privacy and was not clearly disclosed when I signed up (quick sign up to download a document).
So, the sad truth is that just because a site has .edu in its address, it's not necessarily reliable.  I always say "check sources, then check them again," but this is becoming harder and harder with pay-to-play sites (often called "predatory journals") that will publish any damn thing people submit.  From what I found, it seems like academia.edu isn't exactly pay-to-play; there's apparently not a fee for uploading your paper, and the money they make is from people naïve enough to sign up for a subscription.  (Of course, I couldn't dig into their actual rules and policies, because then I would have had to sign up, and I'm damned if I'm letting them get anywhere near my email address, much less my money.)  Even so, what this means is that the papers you find there, like the one by the estimable Mr. Hammond (M.S. Physics), have not passed any kind of gatekeeper.  There may be legitimate papers on the site; it's possible some younger researchers, trying to establish their names in their fields, are lured in by the possibility of getting their work in print somewhere.  Those papers are probably okay.

But as Hammond's "(big word) (big word) (big word) I proved that God exists!  I'm awesome!" paper illustrates, it would be decidedly unwise to trust everything on their site.

So once again: check your sources.  Don't just do a search to find out if what you're looking into has been published somewhere; find out where it's been published, and by whom, and then see if you can find out whether the author and the publication are legitimate.

It may seem like a lot of work, but if you want to stem the rising tide of false claims circulating madly about -- and I hope we all do -- it's well worth the time.

***************************************



Thursday, February 17, 2022

Big geology

It's easy to get overwhelmed when you start looking into geology.

Both the size scale and the time scale are so immense that it's hard to wrap your brain around them.  Huge forces have been at work for billions of years -- and will continue to work for another billion.  Makes me feel awfully... insignificant.

The topic comes up because of three recent bits of research into just how powerful geological processes can be.  In the first, scientists were studying a crater field in Wyoming that dates to the Permian Period, around 280 million years ago (28 million years, give or take, before the biggest mass extinction the Earth has ever experienced).  The craters are between ten and seventy meters in diameter, and there are several dozen of them, all dating from right around the same time.  The thought was that they were created when an asteroid exploded in the upper atmosphere, raining debris of various sizes on the impact site.

The recent research, though, shows that what happened was even more dramatic.

"Many of the craters are clustered in groups and are aligned along rays," said Thomas Kenkmann of the University of Freiburg, who led the project.  "Furthermore, several craters are elliptical, allowing the reconstruction of the incoming paths of the impactors.  The reconstructed trajectories have a radial pattern.  The trajectories indicate a single source and show that the craters were formed by ejected blocks from a large primary crater."

So what appears to have happened is this.

A large meteorite hit the Earth -- triangulating from the pattern of impact craters, something like 150 to 200 kilometers away -- and the blast flung pieces of rock (both from the meteorite and from the impact site) into the air, which then arced back down and struck at speeds estimated to be up to a thousand meters per second.  The craters were formed by impacts from rocks between four and eight meters across, and the primary impact crater (which has not been found, but is thought to be buried under sediments somewhere near the Wyoming-Nebraska border) is estimated at fifty kilometers or more across.
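For a rough check on those numbers, simple no-drag ballistics is enough.  The sketch below assumes a flat Earth, no atmosphere, and a 45-degree launch angle, so treat it strictly as an order-of-magnitude estimate:

```python
import math

# Back-of-the-envelope ballistics for ejecta thrown from the primary crater.
# Assumes no air drag, a flat Earth, and a 45-degree launch angle.
g = 9.8  # m/s^2

def launch_speed_for_range(range_m):
    """Speed needed to reach a given range at 45 degrees: R = v^2 / g."""
    return math.sqrt(range_m * g)

def flight_time(v, angle_deg=45.0):
    """Time aloft: t = 2 * v * sin(theta) / g."""
    return 2 * v * math.sin(math.radians(angle_deg)) / g

for distance_km in (150, 200):
    v = launch_speed_for_range(distance_km * 1000)
    t = flight_time(v)
    print(f"{distance_km} km: launch speed ~{v:.0f} m/s, time aloft ~{t / 60:.1f} min")
```

That works out to launch speeds of roughly 1.2 to 1.4 kilometers per second and around three minutes aloft -- the right order of magnitude for the scenario the crater field implies.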

Imagine it.  A huge rock from space hits a spot two hundred kilometers from where you are, and five minutes later you're bombarded by boulders traveling at a kilometer per second. 

This is called "having a bad day."

[Image licensed under the Creative Commons State Farm, Asteroid falling to Earth, CC BY 2.0]

The second link was to research about the geology of Japan -- second only to Indonesia as one of the most dangerously active tectonic regions on Earth -- which showed the presence of a pluton (a large underground blob of rock different from the rocks that surround it) that sits right near the Nankai Subduction Zone.  This pluton is so large that it actually deforms the crust -- causing the bit above it to bulge and the bit below it to sag.  This creates cracks down which groundwater can seep.

And groundwater acts as a lubricant.  So this blob of rock is, apparently, acting as a focal point for enormous earthquakes.

The Kumano pluton (the red bulge in the middle of the image).  The Nankai Subduction Zone is immediately to the left.

Slipping in this subduction zone caused two earthquakes of above magnitude 8, in 1944 and 1946.  Understanding the structure of this complex region might help predict when and where the next one will come.

If that doesn't make you feel small enough, the third piece of research was into the Missoula Megaflood -- a tremendous flood (thus the name) that occurred 18,000 years ago.

During the last ice age, a glacial ice dam formed across what is now the northern Idaho Rockies.  As the climate warmed, the ice melted, and the water backed up into an enormous lake -- called Lake Missoula -- that covered a good bit of what is now western Montana.  Further warming eventually caused the ice dam to collapse, and all that water drained out, sweeping across what is now eastern Washington, and literally scouring the place down to bedrock.  You can still see the effects today; the area is called the "Channeled Scablands," and is formed of teardrop-shaped pockets of relatively intact topsoil surrounded by gullies floored with bare rock.  (If you've ever seen what a shallow stream does to a sandy beach as it flows into the sea, you can picture exactly what it looks like.)

The recent research has made the story even more interesting.  One thing that a lot of laypeople have never heard of is the concept of isostasy -- that the tectonic plates, the chunks of the Earth's crust, are actually floating on the denser, slowly flowing mantle rock beneath them, and the level at which they float depends on how heavy they are, just as putting heavy weights in a boat makes it float lower in the water.  Well, as the Cordilleran Ice Sheet melted, that weight was removed, and the flat piece of crust underneath it tilted upward on the eastern edge.

It's like having a full bowl of water on a table, and lifting one end of the table.  The bowl will dump over, spilling out the water, and it will flow downhill and run off the edge -- just as Lake Missoula did.

Interestingly, exactly the same thing is going on right now underneath Great Britain.  During the last ice age, Scotland was completely glaciated; southern England was not.  The melting of those glaciers has resulted in isostatic rebound, lifting the northern edge of the island by ten centimeters per century.  At the same time, the tilt is pushing southern England downward, and it's sinking, at about five centimeters per century.  (Fortunately, there's no giant lake waiting to spill across the country.)
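For a back-of-the-envelope sense of the scale involved, Archimedes' principle gets you surprisingly far.  Everything in the sketch below -- the ice thickness, the densities, the relaxation time -- is an assumed round number, not a figure from the research:

```python
import math

# Rough isostatic-rebound sketch: crust floating on the mantle (Archimedes).
# Every number below is an assumed, textbook-style value for illustration.

rho_ice = 917.0           # kg/m^3, density of glacial ice
rho_mantle = 3300.0       # kg/m^3, density of the upper mantle
ice_thickness_m = 1000.0  # assume a roughly 1-km-thick ice sheet

# Eventual total uplift once the ice load is gone:
total_rebound_m = ice_thickness_m * rho_ice / rho_mantle

# The mantle flows slowly, so the rebound decays roughly exponentially.
# Assume a relaxation time of a few thousand years and ~15,000 years
# since the ice disappeared:
tau_years = 4000.0
elapsed_years = 15000.0
rate_m_per_year = (total_rebound_m / tau_years) * math.exp(-elapsed_years / tau_years)

print(f"Eventual rebound:  ~{total_rebound_m:.0f} m")
print(f"Present-day rate:  ~{rate_m_per_year * 100 * 100:.0f} cm per century")
```

Even with those crude assumptions you land in the centimeters-per-century range quoted above for Scotland.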

We humans get a bit cocky at times, don't we?  We're powerful, masters of the planet.  Well... not really.  We're dwarfed by structures and processes we're only beginning to understand.  Probably a good thing, that.  Arrogance never did anyone any favors.  There's nothing wrong with finding out we're not invincible -- and that there are a lot of things out there way, way bigger than we are, that don't give a rat's ass for our little concerns.

***************************************



Wednesday, February 16, 2022

Goldilocks next door

Springboarding off yesterday's post, about how easy it is to form organic compounds abiotically, today we have: our nearest neighbor might be a decent candidate for the search for extraterrestrial life.

At only 4.24 light years away, Proxima Centauri is the closest star to our own Sun.  It's captured the imagination ever since it was discovered how close it is; if you'll recall, the intrepid Robinson family of Lost in Space was heading toward Alpha Centauri, the brightest star in this triple-star system, which is a little farther away (4.37 light years) but still more or less right next door, as these things go.

It was discovered in 2016 that Proxima Centauri has a planet in orbit around it -- and more exciting still, it's only a little more massive than Earth (1.17 times Earth's mass, to be precise), and is in the star's "Goldilocks zone," where water can exist in liquid form.  The discovery of this exoplanet (Proxima Centauri b) was followed in 2020 by the discovery of Proxima Centauri c, thought to be a "mini-Neptune" at seven times Earth's mass, so probably not habitable by life as we know it.
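A quick inverse-square estimate shows why the Goldilocks zone of a dim red dwarf like Proxima Centauri hugs the star so closely.  The luminosity figure below is an approximate published value, and the calculation ignores atmospheres entirely, so treat it as a sketch:

```python
import math

# Crude habitable-zone estimate from inverse-square scaling of starlight:
# a planet receives Earth-like stellar flux at roughly d = sqrt(L_star / L_sun) AU.
# Proxima Centauri's total luminosity is taken as ~0.0017 L_sun (approximate
# published value); atmospheres and spectral effects are ignored entirely.

L_star_over_L_sun = 0.0017
d_earth_equivalent_au = math.sqrt(L_star_over_L_sun)

print(f"Earth-equivalent flux distance: ~{d_earth_equivalent_au:.3f} AU")
# For comparison, Proxima Centauri b orbits at roughly 0.05 AU.
```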

And now, a paper in Nature has presented research indicating that Proxima Centauri has a third exoplanet -- somewhere between a quarter and three-quarters of the Earth's mass, and right in the middle of the Goldilocks zone as well.

"It is fascinating to know that our Sun’s nearest stellar neighbor is the host to three small planets," said Elisa Quintana, an astrophysicist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, who co-authored the paper.  "Their proximity make this a prime system for further study, to understand their nature and how they likely formed."

The newly-discovered planet was detected by observing shifts in the light spectrum emitted by the star as the planet's gravitational field interacted with it -- shifts in wavelength as little as 10^-5 ångströms, or one ten-thousandth the diameter of a hydrogen atom.  The device that accomplished this is the Echelle Spectrograph for Rocky Exoplanets and Stable Spectroscopic Observations (ESPRESSO -- because you can't have an astronomical device without a clever acronym) at the European Southern Observatory in Cerro Paranal, Chile.
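To connect that tiny wavelength shift to something physical, the non-relativistic Doppler relation (Δλ/λ = v/c) does the job.  The reference wavelength and wobble speed below are assumed round numbers for illustration, not figures from the paper:

```python
# Radial-velocity (Doppler) sketch: how big a wavelength shift does a tiny
# stellar wobble produce?  Both input numbers are assumed, illustrative values.

c = 3.0e8             # speed of light, m/s
wavelength_A = 5000.0 # reference wavelength in ångströms (visible light)
wobble_m_per_s = 0.4  # assume a stellar wobble of ~40 cm/s

shift_A = wavelength_A * wobble_m_per_s / c
print(f"Wavelength shift: ~{shift_A:.1e} ångströms")  # on the order of 10^-5 Å
```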

"It’s showing that the nearest star probably has a very rich planetary system," said co-author Guillem Anglada-Escudé, of the Institute of Space Sciences in Barcelona.  "It always has a little bit of mystique, being the closest one."

What this brings home to me is how incredibly common planets in the Goldilocks zone must be.  It's estimated that around two percent of spectral class F, G, and K stars -- the ones most like the Sun -- have planets in the habitable zone.  If this estimate is accurate -- and if anything, most astrophysicists think it's on the conservative side -- that means there are five hundred million habitable planets in the Milky Way alone.
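The arithmetic behind that five-hundred-million figure looks something like this -- only the two-percent estimate comes from the text above; the total star count and the fraction of Sun-like stars are assumptions filled in with commonly cited round numbers:

```python
# Rough count of habitable-zone planets around Sun-like stars.
# Only the two-percent figure comes from the estimate quoted above; the
# star count and the F/G/K fraction are assumed round numbers.

stars_in_milky_way = 100e9   # assume ~100 billion stars (low-end estimate)
fgk_fraction = 0.25          # assume ~25% are spectral class F, G, or K
hz_planet_fraction = 0.02    # ~2% of those host a habitable-zone planet

habitable = stars_in_milky_way * fgk_fraction * hz_planet_fraction
print(f"Habitable-zone planets around Sun-like stars: ~{habitable:.1e}")
```

Use a higher star count (estimates for the Milky Way run up to several hundred billion stars) and the total climbs well past a billion.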

Of course, "habitable" comes with several caveats.  Average temperature and proximity to the host star aren't the only things that determine whether a place is actually habitable.  Remember, for example, that Venus is technically in the Goldilocks zone, but because of its atmospheric composition it has a surface temperature hot enough to melt lead, and an atmosphere made mostly of carbon dioxide and sulfuric acid.  Being at the right distance to theoretically have liquid water doesn't mean it actually does.  Besides atmospheric composition, other things that could interfere with a planet having a clement climate are the eccentricity of the orbit (high eccentricity would result in wild temperature fluctuations between summer and winter), the planet being tidally locked (the same side always facing the star), and how stable the star itself is.  Some stars are prone to stellar storms that make the ones our Sun has seem like gentle breezes, and would irradiate the surface of any planets orbiting them in such a way as to damage or destroy anything unlucky enough to be exposed.

But still -- come back to the "life as we know it" part.  Yeah, a tidally-locked planet that gets fried by stellar storms would be uninhabitable for us, but perhaps there are life forms that evolved to avoid the dangers.  As I pointed out yesterday, the oxygen we depend on is actually a highly reactive toxin -- we use it to make our cellular respiration reactions highly efficient, but it's also destructive to tissues unless you have ways to mitigate the damage.  (Recall that burning is just rapid oxidation.)  My hunch -- and it is just a hunch -- is that just as we find life even in the most inhospitable places on Earth, it'll be pretty ubiquitous out in space.

After all, remember what we learned from Ian Malcolm in Jurassic Park:

[Image: Ian Malcolm from Jurassic Park, with his famous line -- "Life, uh, finds a way"]

***************************************



Tuesday, February 15, 2022

The recipe for life

Back in my teaching days, I was all too aware of how hard it was to generate any kind of enthusiasm for the details of biology in a bunch of teenagers.  But there were a few guaranteed oh-wow moments -- and one that I always introduced by saying, "If this doesn't blow your mind, you're not paying attention."

What I was referring to was the Miller-Urey experiment.  This phenomenal piece of research was an attempt to see if it was possible to create organic compounds abiotically -- with clear implications for the origins of life.  Back in the early twentieth century, when people started to consider seriously the possibility that life started on Earth without the intervention of a deity, the obvious question was, "How?"  So they created apparatus to take collections of inorganic compounds surmised to be abundant on the early Earth, subject them to various energy sources, and see what happened.

What happened was that they basically created smog and dirty water.  No organic compounds.  In 1922, Soviet biochemist Alexander Oparin suggested that the problem might be that they were starting with the assumption that the Earth's atmosphere hadn't changed much -- and looking at (then) new information about the atmosphere of Jupiter, he proposed that perhaps the early Earth's atmosphere had no free oxygen.  In chemistry terms, it was a reducing atmosphere.  Oxygen, after all, is a highly reactive substance, good at tearing apart organic molecules.  (There's decent evidence that the pathways of aerobic cellular respiration originally evolved as a way of detoxifying oxygen, and only secondarily gained a role in increasing the efficiency of releasing the energy in food molecules.)

It wasn't until thirty years later that anyone tested Oparin's hunch.  Stanley Miller and Harold Urey, of the University of Chicago, created an apparatus made of sealed, interconnected glass globes, and filled them with their best guess at the gases present in the atmosphere of the early Earth -- carbon monoxide, methane, hydrogen sulfide, sulfur dioxide, water vapor, various nitrogen oxides, hydrogen cyanide (HCN), and so on.  No free (diatomic) oxygen.  They then introduced an energy source -- essentially, artificial lightning -- and sat back to wait.

No one expected fast results.  After all, the Earth had millions of years to generate enough organic compounds to (presumably) self-assemble into the earliest cells.  No one was more shocked than Miller and Urey when they came in the next day to find that the water in their apparatus had turned blood red.  Three days later, it was black, like crude oil.  At that point, they couldn't contain their curiosity, and opened it up to see what was there.

All twenty amino acids, plus several amino acids not typically found in living things on Earth.  Simple sugars.  Fatty acids.  Glycerol.  DNA and RNA nucleotides.  Basically, all the building blocks it takes to make a living organism.

In three days.

A scale model of the Miller-Urey apparatus, made for me by my son, who is a professional scientific glassblower

This glop, now nicknamed the "primordial soup," is thought to have filled the early oceans.  Imagine it -- you're standing on the shore of the Precambrian sea (wearing a breathing apparatus, of course).  On land is absolutely nothing alive -- a continent full of nothing but rock and sand.  In front of you is an ocean that appears to be composed of thick, dark oil.

It'd be hard to convince yourself this was actually Earth.

Since then, scientists have re-run the experiment hundreds of times, checking to see if perhaps Miller and Urey had just happened by luck on the exact right recipe, but it turns out this experiment is remarkably insensitive to initial conditions.  As long as you have three things -- (1) the right inorganic building blocks, (2) a source of energy, and (3) no free oxygen -- you can make as much of this rather unappealing soup as you want.

So, it turns out, generating biochemicals is a piece of cake.  And a piece of research at Friedrich Schiller University and the Max Planck Institute has shown that it's even easier than that -- the reactions that create amino acids can happen out in space.

"Water plays an important role in the conventional way in which peptides are created," said Serge Krasnokutski, who co-authored the paper.  "Our quantum chemical calculations have now shown that the amino acid glycine can be formed through a chemical precursor – called an amino ketene – combining with a water molecule.  Put simply: in this case, water must be added for the first reaction step, and water must be removed for the second...  [So] instead of taking the chemical detour in which amino acids are formed, we wanted to find out whether amino ketene molecules could not be formed instead and combine directly to form peptides.  And we did this under the conditions that prevail in cosmic molecular clouds, that is to say on dust particles in a vacuum, where the corresponding chemicals are present in abundance: carbon, ammonia, and carbon monoxide."

The more we look into this, the simpler it seems to be to generate the chemicals of life -- further elucidating how the first organisms formed on Earth, and (even more excitingly) suggesting that life might be common in the cosmos.  In fact, it may not even take an Earth-like planet to be a home for life; as long as a planet is in the "Goldilocks zone" (the distance from its parent star where water can exist in liquid form), getting from there to an organic-compound-rich environment may not be much of a hurdle.

That's still a long way from intelligent life, of course; chances are, the planets with extraterrestrial life mostly have much simpler organisms.  But how exciting is that?  Setting foot on a planet covered with life -- none of which has any common ancestry with terrestrial organisms.

I can think of very little that would be more thrilling than that.

***************************************
