Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Saturday, February 19, 2022

Remembrance of things past

Like many People Of A Certain Age, I'm finding that my memory isn't what it used to be.

I walk into a room, and then say, "Why did I come in here?"  I'll think, "I don't need a grocery list, I'm just going for a few things," and come back with half of them.  We just had our dogs in for their annual checkups and shots, and there were a few things for each of them we wanted to ask the vet about.  My wife and I dutifully sat down and made a list -- and both of us forgot to put something on the list that we'd talked about only the previous day.

It's shown up, too, in more academic pursuits.  For my birthday last year my wife got me an online course through Udemy in beginning Japanese, a language I've always wanted to learn.  My dad had been stationed in Japan in the 1950s, and he learned enough of the language to get by; I grew up around the Japanese art and music my dad brought back with him, and became a Japanophile for life.  So I was thrilled to have the opportunity to study the country's unique and beautiful language.  The course starts out with a brief pronunciation guide, then launches into the hiragana -- one of three scripts used in written Japanese.  Each of the 46 characters stands for either a phoneme or a syllable, and some of them look quite a bit alike, so it's a lot to remember.  I have flash cards I made for all 46, and there are some I consistently miss, every single time I go through them.

When I flip the card over, my response is always, "Damn!  Of course!  Now I remember it!"  I recognize the character immediately, and can often even remember the mnemonic the teacher suggested to use in recalling it.  I'm getting there -- of the 46, there are about ten that I still struggle with -- but I know that twenty years ago, I'd have them all down cold by now.

Kids playing a memory game [Image is in the Public Domain]

Understandably, there's a nasty little thought in the back of my mind about senility and dementia.  My mother's sister had Alzheimer's -- to my knowledge, the only person in my extended family to suffer from that horrific and debilitating disease -- and I watched her slow slide from a smart, funny woman who could wipe the floor with me at Scrabble, did crossword puzzles in ink, and read voraciously, to a hollow, unresponsive shell.  I can think of no more terrifying fate. 

A new piece of research in Trends in Cognitive Sciences has to some extent put my mind at ease.  In "Cluttered Memory Representations Shape Cognition in Old Age," psychologists Tarek Amer (of Columbia University), Jordana Wynn (of Harvard University), and Lynn Hasher (of the University of Toronto) found that the forgetfulness a lot of us experience as we age isn't a simple loss of information; it's a loss of access to information that's still there, triggered by the clutter of memories from the past.

The authors write:
Wisdom and knowledge, cognitive functions that surely depend on being able to access and use memory, grow into old age.  Yet, the literature on memory shows that intentional, episodic memory declines with age.  How are we to account for this paradox?  To do so, we need to understand three aspects of memory differences associated with aging, two of which have received extensive investigation: age differences in memory encoding and in retrieval.  A third aspect, differences in the contents of memory representations, has received relatively little empirical attention.  Here, we argue that this aspect is central to a full understanding of age differences in memory and memory-related cognitive functions.  We propose that, relative to younger adults, healthy older adults (typically between 60 and 85 years of age) process and store too much information, the result of reductions in cognitive control or inhibitory mechanisms.  When efficient, these mechanisms enable a focus on target or goal-relevant information to the exclusion (or suppression) of irrelevant information.  Due to poor control (or reduced efficiency), the mnemonic representations of older adults can include: (i) recently activated but no-longer-relevant information; (ii) task-unrelated thoughts and/or prior knowledge elicited by the target information; and/or (iii) task-irrelevant information cued by the immediate environment.  This information is then automatically bound together with target information, creating cluttered memory representations that contain more information than do those of younger adults.

It's like trying to find something in a cluttered, disorganized attic.  Not only is it hard to locate what you're looking for, but you also get distracted by the other things you run across.  "Wow, it's been years since I've seen this!  I didn't even know this was up here!... wait, what am I looking for?"

I've noticed this exact problem in the kitchen.  I'm the chief cook in our family, and I love to make complex dinners with lots of ingredients.  I've found that unless I want to make a dozen trips to the fridge or cabinets to retrieve three items, I need to focus on one thing at a time.  Get a green pepper from the vegetable crisper.  Find the bottle of cooking sherry.  Go get the bottle of tabasco sauce from the table.  If I try to keep all three in my mind at once, I'm sure to return to the stove and think, "Okay, what the hell do I need, again?"

I wonder if this mental clutter is at the heart of my struggle with memorizing the hiragana characters in Japanese.  I've done at least a cursory study of about a dozen languages -- I'm truly fluent in only a couple, but my master's degree in historical linguistics required me to learn at least the rudiments of the languages whose history I was studying.  Could my difficulty in connecting the Japanese characters to the syllables they represent be because my Language Module is clogged with Old Norse and Welsh and Scottish Gaelic and Icelandic, and those all get in the way?

In any case, it's kind of a relief that I'm (probably) not suffering from early dementia.  It also gives me an excuse the next time my wife gets annoyed at me for forgetting something.  "I'm sorry, dear," I'll say.  "I'd have remembered it, but my brain is full.  But at least I remembered that the character yo looks like a yo-yo hanging from someone's finger!"

Nah, I doubt that'll work, and the fact that I remembered one of the Japanese characters instead of stopping by the store to pick up milk and eggs will only make it worse.  When I want to be sure not to forget something, I guess I'll have to keep making a list.

The only problem is then, I need to remember where I put the list.

***************************************

People made fun of Donald Rumsfeld for his statement that there are "known unknowns" -- things we know we don't know -- but a far larger number of "unknown unknowns," which are all the things we aren't even aware that we don't know.

While he certainly could have phrased it a little more clearly -- and understand that I'm not in any way defending Donald Rumsfeld's other actions and statements -- he was right in this case.  It's profoundly humbling to find out how much we don't know, even about subjects in which we consider ourselves experts.  One of the most important things we can do is keep in mind not only that we might have things wrong, and that additional evidence may completely overturn what we thought we knew, but also that there are some things so far out of our ken that we may not even know they exist.

These ideas -- the perimeter of human knowledge, and the importance of being able to learn, relearn, change directions, and accept new information -- are the topic of psychologist Adam Grant's book Think Again: The Power of Knowing What You Don't Know.  In it, he explores not only how we are all riding around with blinders on, but how to take steps toward removing them, starting with not surrounding yourself with an echo chamber of like-minded people who might not even recognize that they have things wrong.  We should hold our own beliefs up to the light of scrutiny.  As Grant puts it, we should approach issues like scientists looking for the truth, not like a campaigning politician trying to convince an audience.

It's a book that challenges us to move past our stance of "clearly I'm right about this" to the more reasoned approach of "let me see if the evidence supports this."  In this era of media spin, fake news, and propaganda, it's a critical message -- and Think Again should be on everyone's to-read list.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]


Friday, February 18, 2022

Academic predators

Today's topic, which comes to me via a long-time loyal reader of Skeptophilia, has a funny side and a not-so-funny side.

The link my friend sent me was to a paper called "The Psychometric Measurement of God," by one George Hammond, M.S. Physics.  In it, he claims to have used the methods of physics to prove that God exists, which would be a pretty good feat.  So I eagerly read the paper, which turned out to be an enormous mélange of sciency-sounding terms, evidently using a template something like this: "(big word) (big word) (big word) God (big word) (big word) (big word) (big word) matrix (big word) (big word) scientific measurement (big word) (big word) (big word) God exists q.e.d."

Don't believe me? Here's a representative passage:
I had already published in 1994 a peer-reviewed paper in a prominent journal pointing out that there was a decussation in the Papez Loop in Jeffrey Gray’s fornical septo-hippocampal system indicating that it regulated not only Anxiety as he said it did, but in a diagonal mode of operational so regulated his Impulsivity dimension.  In the brain the septum is located dead center in the “X” formed by the fornix thus regulating information to and from all 8 cubic lobes of the brain via the fornical Papez Loop.  Since then the septal area is also dead center in Thurstone’s Box in the brain I eventually realized that Gray’s septo-hippocampal system controls all 13 personality dimensions of the Structural Model of Personality!...  Meanwhile, factorization of this 4 x 4 matrix yields one, single, final top 4th order eigenvector of Psychology.  What could this factor be?...  [T]he final top factor in Psychology is in fact the God of the Bible.  Since this is a scientific measurement, God can actually be measured to 2 decimal point accuracy.
Please note that I didn't select this passage because it sounds ridiculous; it all sounds like this.

Or maybe, with my mere B.S. in Physics, I'm just not smart enough to understand it.

The fact that this is a wee bit on the spurious side is accentuated by the various self-congratulatory statements scattered through it, like "this is nothing less than awesome!" and "if you think discovering the gods is an amazing scientific turn of events, brace yourself!" and "my personal scientific opinion as a graduate physicist is that the possibility [of my being correct] is better than 1 in 3."  Also, the inadvertently hilarious statement that "evolutionary biology discovered the 'airbag theory' millions of years before General Motors did" might clue you in to the possibility that this paper may not have been peer reviewed.

But so far, this is just some loony guy writing some loony stuff, which should come as no big surprise, because after all, that's what loony guys do.  And there's not much to be gained by simply poking fun at what, honestly, are low-hanging fruit.  But that brings us to the less-than-amusing part.

The site where this "paper" was published is academia.edu.  My general thought has been that most .edu sites are pretty reliable, but that may have to be revised.  "Academia" is not only not peer reviewed -- it's barely even moderated.  Literally anyone can publish almost anything.


Basically, it's not a site for valid scientific research; it's purely a money-making operation.  If you poke around on the site a little, you'll find you're quickly asked to sign up and give them your email, and badgered to subscribe (for a monthly fee, of course).  I probably don't need to say this, but do not give these people your email.  It turns out there's a page on Quora devoted to the topic of academia.edu, and the comments left by people who have actually interacted with them are nothing short of scathing.  Here's a sampler:
  • If you sign up, the people who upload the pdf files will give you exactly what it seemed like they would give you, a paper pdf that makes you sign up using another link, which is also fake!  If you ask to contact the person who wrote it, they will either ignore you or block you. Don’t sign up for Academia, because when you do they just take you to another link, which is ridiculous.  Academia is a public research company, they don’t review anything or enforce rules.
  • I found it very unsettling that the ONLY permission they ask for is to….VIEW AND DOWNLOAD YOUR CONTACTS!  That was a SERIOUS tip-off to me that something wasn’t right.
  • It’s a scam, they try every trick in the book to get you to sign up; according to them I must be one of the most famous people on the planet.
  • I hate this site.  Looks like scammy trash.  I tried to sign up and after receiving my e-mail (use an account you don’t care about), then it looks like I can only proceed if I sign up for a bulk download account, and that costs money.  Fuck 'em.
  • They are scammers trying to get your money.  They told me I was cited in more than 2k papers.  My name is not common and I don't participate in the academic world.
  • Be careful with this.  Academia.edu was flagged by gmail and seems to have full access to my Google Account, not even partial access.  Given some of the other privacy and IP considerations with sharing your content on this site I would steer clear of it in future regardless - it’s basically a LinkedIn with similar commercial ambitions to make VCs a ton of money so there are the common concerns of “you’re the product” and “your content is now their content”.  Regardless this level of access to gmail is unwarranted and an invasion of privacy and was not clearly disclosed when I signed up (quick sign up to download a document).
So, the sad truth is that just because a site has .edu in its address, it's not necessarily reliable.  I always say "check sources, then check them again," but this is becoming harder and harder with pay-to-play sites (often called "predatory journals") that will publish any damn thing people submit.  From what I found, it seems like academia.edu isn't exactly pay-to-play; there's apparently no fee for uploading your paper, and the money they make comes from people naïve enough to sign up for a subscription.  (Of course, I couldn't dig into their actual rules and policies, because then I would have had to sign up, and I'm damned if I'm letting them get anywhere near my email address, much less my money.)  Even so, what this means is that the papers you find there, like the one by the estimable Mr. Hammond (M.S. Physics), have not passed any kind of gatekeeper.  There may be legitimate papers on the site; it's possible some younger researchers, trying to establish their names in their fields, are lured in by the possibility of getting their work in print somewhere.  Those papers are probably okay.

But as Hammond's "(big word) (big word) (big word) I proved that God exists!  I'm awesome!" paper illustrates, it would be decidedly unwise to trust everything on their site.

So once again: check your sources.  Don't just do a search to find out if what you're looking into has been published somewhere; find out where it's been published, and by whom, and then see if you can find out whether the author and the publication are legitimate.

It may seem like a lot of work, but if you want to stem the rising tide of false claims circulating madly about -- and I hope we all do -- it's well worth the time.

***************************************



Thursday, February 17, 2022

Big geology

It's easy to get overwhelmed when you start looking into geology.

Both the size scale and the time scale are so immense that it's hard to wrap your brain around them.  Huge forces that have been at work for billions of years -- and will continue to work for another billion.  It makes me feel awfully... insignificant.

The topic comes up because of three recent bits of research into just how powerful geological processes can be.  In the first, scientists were studying a crater field in Wyoming that dates to the Permian Period, around 280 million years ago (28 million years, give or take, before the biggest mass extinction the Earth has ever experienced).  The craters are between ten and seventy meters in diameter, and there are several dozen of them, all dating from right around the same time.  The thought was that they were created when an asteroid exploded in the upper atmosphere, raining debris of various sizes on the impact site.

The recent research, though, shows that what happened was even more dramatic.

"Many of the craters are clustered in groups and are aligned along rays," said Thomas Kenkmann of the University of Freiburg, who led the project.  "Furthermore, several craters are elliptical, allowing the reconstruction of the incoming paths of the impactors.  The reconstructed trajectories have a radial pattern.  The trajectories indicate a single source and show that the craters were formed by ejected blocks from a large primary crater."

So what appears to have happened is this.

A large meteorite hit the Earth -- triangulating from the pattern of impact craters, somewhere between 150 and 200 kilometers away -- and the blast flung pieces of rock (both from the meteorite and from the impact site) into the air, which then arced back down and struck at speeds estimated to be up to a thousand meters per second.  The craters were formed by impacts from rocks between four and eight meters across, and the primary impact crater (which has not been found, but is thought to be buried under sediments somewhere near the Wyoming-Nebraska border) is estimated to be fifty kilometers or more across.
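Those numbers hang together, by the way.  Here's a minimal back-of-the-envelope sketch (my own assumptions -- a simple vacuum ballistic arc at 45 degrees, ignoring air resistance -- not a calculation from the paper) checking what speed and flight time a 200-kilometer hop implies:

```python
import math

g = 9.8                 # m/s^2, surface gravity
launch_angle = math.radians(45)  # assumed launch angle for maximum range
target_range = 200_000  # m, assumed distance from the primary crater

# Ballistic range: R = v^2 * sin(2*theta) / g  ->  solve for launch speed v
v = math.sqrt(target_range * g / math.sin(2 * launch_angle))

# Time of flight: t = 2 * v * sin(theta) / g
t = 2 * v * math.sin(launch_angle) / g

print(f"launch speed ~ {v:.0f} m/s")        # ~1,400 m/s
print(f"flight time  ~ {t / 60:.1f} minutes")  # a bit over 3 minutes
```

So ejecta arriving at around a kilometer per second, a few minutes after the main impact, is just what simple ballistics predicts.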

Imagine it.  A huge rock from space hits a spot two hundred kilometers from where you are, and five minutes later you're bombarded by boulders traveling at a kilometer per second. 

This is called "having a bad day."

[Image licensed under the Creative Commons State Farm, Asteroid falling to Earth, CC BY 2.0]

The second link was to research about the geology of Japan -- second only to Indonesia as one of the most dangerously active tectonic regions on Earth -- which showed the presence of a pluton (a large underground blob of rock different from the rocks that surround it) that sits right near the Nankai Subduction Zone.  This pluton is so large that it actually deforms the crust -- causing the bit above it to bulge and the bit below it to sag.  This creates cracks down which groundwater can seep.

And groundwater acts as a lubricant.  So this blob of rock is, apparently, acting as a focal point for enormous earthquakes.

The Kumano pluton (the red bulge in the middle of the image).  The Nankai Subduction Zone is immediately to the left.

Slip along this subduction zone caused two earthquakes above magnitude 8, in 1944 and 1946.  Understanding the structure of this complex region might help predict when and where the next one will come.

If that doesn't make you feel small enough, the third piece of research was into the Missoula Megaflood -- a tremendous flood (thus the name) that occurred 18,000 years ago.

During the last ice age, a glacial ice dam formed across what is now the northern Idaho Rockies.  As the climate warmed, the ice melted, and the water backed up into an enormous lake -- called Lake Missoula -- that covered a good bit of what is now western Montana.  Further warming eventually caused the ice dam to collapse, and all that water drained out, sweeping across what is now eastern Washington, and literally scouring the place down to bedrock.  You can still see the effects today; the area is called the "Channeled Scablands," and is formed of teardrop-shaped pockets of relatively intact topsoil surrounded by gullies floored with bare rock.  (If you've ever seen what a shallow stream does to a sandy beach as it flows into the sea, you can picture exactly what it looks like.)

The recent research has made the story even more interesting.  One concept a lot of laypeople have never heard of is isostasy -- the idea that the tectonic plates, the chunks of the Earth's crust, are actually floating on the denser, slowly flowing mantle beneath them, and the level at which they float depends on how heavy they are, just as putting heavy weights in a boat makes it float lower in the water.  Well, as the Cordilleran Ice Sheet melted, that weight was removed, and the flat piece of crust underneath it tilted upward on the eastern edge.

It's like having a full bowl of water on a table, and lifting one end of the table.  The bowl will dump over, spilling out the water, and it will flow downhill and run off the edge -- just as Lake Missoula did.
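The balance itself is just Archimedes' principle.  A quick illustrative sketch (the densities and the ice thickness are my own round-number assumptions, not figures from the research) of how much an ice sheet pushes the crust down at full isostatic equilibrium:

```python
rho_mantle = 3300.0  # kg/m^3, assumed upper-mantle density
rho_ice = 917.0      # kg/m^3, density of glacial ice

ice_thickness = 2000.0  # m, an assumed ice-sheet thickness

# At isostatic balance, the weight of displaced mantle equals the ice load,
# so the crust sinks by (rho_ice / rho_mantle) times the ice thickness.
depression = ice_thickness * rho_ice / rho_mantle
print(f"crust depressed by ~ {depression:.0f} m under the ice")
```

Run the melting in reverse and you get the rebound: remove a two-kilometer ice sheet and the crust slowly rises back by somewhere over five hundred meters.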

Interestingly, exactly the same thing is going on right now underneath Great Britain.  During the last ice age, Scotland was completely glaciated; southern England was not.  The melting of those glaciers has resulted in isostatic rebound, lifting the northern edge of the island by ten centimeters per century.  At the same time, the tilt is pushing southern England downward, and it's sinking, at about five centimeters per century.  (Fortunately, there's no giant lake waiting to spill across the country.)

We humans get a bit cocky at times, don't we?  We're powerful, masters of the planet.  Well... not really.  We're dwarfed by structures and processes we're only beginning to understand.  Probably a good thing, that.  Arrogance never did anyone any favors.  There's nothing wrong with finding out we're not invincible -- and that there are a lot of things out there way, way bigger than we are, that don't give a rat's ass for our little concerns.

***************************************



Wednesday, February 16, 2022

Goldilocks next door

Springboarding off yesterday's post, about how easy it is to form organic compounds abiotically, today we have: our nearest neighbor might be a decent candidate for the search for extraterrestrial life.

At only 4.24 light years away, Proxima Centauri is the closest star to our own Sun.  It has captured the imagination ever since its proximity was discovered; if you'll recall, the intrepid Robinson family of Lost in Space was heading toward Alpha Centauri, the brightest star in this triple-star system, which is a little farther away (4.37 light years) but still more or less right next door, as these things go.

It was discovered in 2016 that Proxima Centauri has a planet in orbit around it -- and more exciting still, it's only a little larger than Earth (1.17 times Earth's mass, to be precise), and is in the star's "Goldilocks zone," where water can exist in liquid form.  The discovery of this exoplanet (Proxima Centauri b) was followed in 2020 by the discovery of Proxima Centauri c, thought to be a "mini-Neptune" at seven times Earth's mass, so probably not habitable by life as we know it.

And now, a paper in Nature has presented research indicating that Proxima Centauri has a third exoplanet -- somewhere between a quarter and three-quarters of the Earth's mass, and right in the middle of the Goldilocks zone as well.

"It is fascinating to know that our Sun’s nearest stellar neighbor is the host to three small planets," said Elisa Quintana, an astrophysicist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, who co-authored the paper.  "Their proximity make this a prime system for further study, to understand their nature and how they likely formed."

The newly-discovered planet was detected by observing shifts in the light spectrum emitted by the star as the planet's gravitational field interacted with it -- shifts in wavelength as little as 10^-5 ångströms, or one ten-thousandth the diameter of a hydrogen atom.  The device that accomplished this is the Echelle Spectrograph for Rocky Exoplanets and Stable Spectroscopic Observations (ESPRESSO -- because you can't have an astronomical device without a clever acronym) at the European Southern Observatory in Cerro Paranal, Chile.
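To get a feel for what a shift that tiny means, here's a rough illustration (the example wavelength is my own assumption, chosen as a typical visible-light line; only the shift scale comes from the text) using the Doppler relation Δλ/λ = v/c to convert it into the star's wobble speed:

```python
c = 2.998e8          # speed of light, m/s
wavelength = 5500.0  # ångströms; an assumed visible-light spectral line
shift = 1e-5         # ångströms, the wavelength-shift scale quoted above

# Doppler formula: delta_lambda / lambda = v / c  ->  solve for v
v = c * shift / wavelength
print(f"implied stellar wobble ~ {v:.2f} m/s")
```

That works out to well under one meter per second -- slower than a casual stroll -- which is the scale of stellar motion ESPRESSO has to tease out of the spectrum.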

"It’s showing that the nearest star probably has a very rich planetary system," said co-author Guillem Anglada-Escudé, of the Institute of Space Sciences in Barcelona.  "It always has a little bit of mystique, being the closest one."

What this brings home to me is how incredibly common planets in the Goldilocks zone must be.  It's estimated that around two percent of spectral class F, G, and K stars -- the ones most like the Sun -- have planets in the habitable zone.  If this estimate is accurate -- and if anything, most astrophysicists think it's on the conservative side -- that means there are five hundred million habitable planets in the Milky Way alone.

Of course, "habitable" comes with several caveats.  Average temperature and proximity to the host star aren't the only things that determine whether a place is actually habitable.  Remember, for example, that Venus is technically in the Goldilocks zone, but because of its atmospheric composition it has a surface temperature hot enough to melt lead, and an atmosphere made mostly of carbon dioxide, laced with clouds of sulfuric acid.  Being at the right distance to theoretically have liquid water doesn't mean it actually does.  Besides atmospheric composition, other things that could interfere with a planet having a clement climate are the eccentricity of its orbit (high eccentricity would result in wild temperature fluctuations between summer and winter), the planet being tidally locked (the same side always facing the star), and how stable the star itself is.  Some stars are prone to stellar storms that make the ones our Sun has seem like gentle breezes, and would irradiate the surface of any orbiting planets badly enough to damage or destroy anything unlucky enough to be exposed.

But still -- come back to the "life as we know it" part.  Yeah, a tidally-locked planet that gets fried by stellar storms would be uninhabitable for us, but perhaps there are life forms that evolved to avoid the dangers.  As I pointed out yesterday, the oxygen we depend on is actually a highly reactive toxin -- we use it to make our cellular respiration reactions highly efficient, but it's also destructive to tissues unless you have ways to mitigate the damage.  (Recall that burning is just rapid oxidation.)  My hunch -- and it is just a hunch -- is that just as we find life even in the most inhospitable places on Earth, it'll be pretty ubiquitous out in space.

After all, remember what we learned from Ian Malcolm in Jurassic Park:



***************************************



Tuesday, February 15, 2022

The recipe for life

Back in my teaching days, I was all too aware of how hard it was to generate any kind of enthusiasm for the details of biology in a bunch of teenagers.  But there were a few guaranteed oh-wow moments -- and one that I always introduced by saying, "If this doesn't blow your mind, you're not paying attention."

What I was referring to was the Miller-Urey experiment.  This phenomenal piece of research was an attempt to see if it was possible to create organic compounds abiotically -- with clear implications for the origins of life.  Back in the early twentieth century, when people started to consider seriously the possibility that life started on Earth without the intervention of a deity, the obvious question was, "How?"  So researchers built apparatus to take collections of inorganic compounds surmised to be abundant on the early Earth and subject them to various energy sources, then waited to see what happened.

What happened was that they basically created smog and dirty water.  No organic compounds.  In 1922, Soviet biochemist Alexander Oparin proposed that the problem might be the starting assumption that the Earth's atmosphere hadn't changed much -- and looking at (then) new information about the atmosphere of Jupiter, he suggested that perhaps the early Earth's atmosphere had no free oxygen.  In chemistry terms, it was a reducing atmosphere.  Oxygen, after all, is a highly reactive substance, good at tearing apart organic molecules.  (There's decent evidence that the pathways of aerobic cellular respiration originally evolved as a way of detoxifying oxygen, and only secondarily gained a role in increasing the efficiency of releasing the energy in food molecules.)

It wasn't until thirty years later that anyone tested Oparin's hunch.  Stanley Miller and Harold Urey, of the University of Chicago, created an apparatus made of sealed, interconnected glass globes, and filled it with their best guess at the gases present in the atmosphere of the early Earth -- carbon monoxide, methane, hydrogen sulfide, sulfur dioxide, water vapor, various nitrogen oxides, hydrogen cyanide (HCN), and so on.  No free (diatomic) oxygen.  They then introduced an energy source -- essentially, artificial lightning -- and sat back to wait.

No one expected fast results.  After all, the Earth had millions of years to generate enough organic compounds to (presumably) self-assemble into the earliest cells.  No one was more shocked than Miller and Urey when they came in the next day to find that the water in their apparatus had turned blood red.  Three days later, it was black, like crude oil.  At that point, they couldn't contain their curiosity, and opened it up to see what was there.

All twenty amino acids used in terrestrial proteins, plus several not typically found in living things on Earth.  Simple sugars.  Fatty acids.  Glycerol.  DNA and RNA nucleotides.  Basically, all the building blocks it takes to make a living organism.

In three days.

A scale model of the Miller-Urey apparatus, made for me by my son, who is a professional scientific glassblower

This glop, now nicknamed the "primordial soup," is thought to have filled the early oceans.  Imagine it -- you're standing on the shore of the Precambrian sea (wearing a breathing apparatus, of course).  On land is absolutely nothing alive -- a continent full of nothing but rock and sand.  In front of you is an ocean that appears to be composed of thick, dark oil.

It'd be hard to convince yourself this was actually Earth.

Since then, scientists have re-run the experiment hundreds of times, checking to see if perhaps Miller and Urey had just happened by luck on the exact right recipe, but it turns out this experiment is remarkably insensitive to initial conditions.  As long as you have three things -- (1) the right inorganic building blocks, (2) a source of energy, and (3) no free oxygen -- you can make as much of this rather unappealing soup as you want.

So, it turns out, generating biochemicals is a piece of cake.  And researchers at Friedrich Schiller University and the Max Planck Institute have shown that it's even easier than that -- the reactions that create amino acids can happen out in space.

"Water plays an important role in the conventional way in which peptides are created," said Serge Krasnokutski, who co-authored the paper.  "Our quantum chemical calculations have now shown that the amino acid glycine can be formed through a chemical precursor – called an amino ketene – combining with a water molecule.  Put simply: in this case, water must be added for the first reaction step, and water must be removed for the second...  [So] instead of taking the chemical detour in which amino acids are formed, we wanted to find out whether amino ketene molecules could not be formed instead and combine directly to form peptides.  And we did this under the conditions that prevail in cosmic molecular clouds, that is to say on dust particles in a vacuum, where the corresponding chemicals are present in abundance: carbon, ammonia, and carbon monoxide."

The more we look into this, the simpler it seems to be to generate the chemicals of life -- further elucidating how the first organisms formed on Earth, and (even more excitingly) suggesting that life might be common in the cosmos.  In fact, it may not even take an Earth-like planet to be a home for life; as long as a planet is in the "Goldilocks zone" (the distance from its parent star where water can exist in liquid form), getting from there to an organic-compound-rich environment may not be much of a hurdle.

That's still a long way from intelligent life, of course; chances are, the planets with extraterrestrial life mostly have much simpler organisms.  But how exciting is that?  Setting foot on a planet covered with life -- none of which has any common ancestry with terrestrial organisms.

I can think of very little that would be more thrilling than that.

***************************************


Monday, February 14, 2022

Rehabilitating our cousins

The Neanderthals have gotten an undeservedly bad reputation.

"Neanderthal" has become an insult for someone perceived as dumb, crude, vulgar, lacking in any sort of refinement.  This perception wasn't helped any by Jean Auel's Clan of the Cave Bear and its sequels, where the "Clan" (the Neanderthals) are primitive, ugly, brutal people (personified by the violent and cruel Broud), contrasted with the beautiful, heroic, sophisticated Cro-Magnons (such as virile, handsome, blue-eyed Jondalar, who is portrayed basically as a prehistoric Liam Hemsworth, only sexier).  The truth is way more complex than that; they certainly weren't unintelligent, and in fact, the most recent Neanderthals had an average brain size larger than a modern human's.  (I'm aware that brain size doesn't necessarily correlate with higher intelligence, but the depiction of them as tiny-brained primitives is almost certainly false.)

They had culture; the Mousterian tool complex, with its finely worked stone points, was Neanderthal in origin.  They were builders, weavers, and jewelry-makers, and knew the use of medicinal plants.  There's evidence that they made music -- the Divje Babe flute, fashioned from a bear femur around 43,000 years ago, was probably a Neanderthal creation.  The structure of the Neanderthal hyoid bone, and the presence of the modern variant of the FOXP2 gene, strongly suggest that they had the capacity for language, although cognitive scientist Philip Lieberman suggests that their mouth morphology would have made it difficult or impossible to articulate nasal sounds and the phonemes /a/, /i/, /u/, /ɔ/, /g/, and /k/, so any language they had probably wasn't as rich phonetically as ours.  (Although that, too, is an overgeneralization; as I pointed out in a post only a month ago, the phonemic inventory of modern languages varies all over the place, with some having only a dozen or so distinct sounds.)

Another thing that people tend to get wrong is that the Neanderthals were displaced by modern Homo sapiens, and eventually driven to extinction, because we were smarter, faster, and more sophisticated.  Which is not only false, it carries that hint of self-congratulation that should be an immediate tipoff that there's more to it.  It's undeniable that our Neanderthal cousins did diminish and eventually disappear something on the order of forty thousand years ago, but what caused it is uncertain at best.  Other hypotheses regarding why they declined are climatic shifts, disease, and loss of food sources... and, most interestingly, that they interbred with, and were eventually subsumed by, modern humans.  Genetic analysis shows that a great many of us -- including most people of European ancestry -- contain genetic markers indicating Neanderthal ancestry.  Current estimates are that western Europeans have the highest percentage of Neanderthal DNA (at around four percent), while some groups, most notably sub-Saharan Africans, have almost none.

My own DNA apparently has 284 distinct Neanderthal markers, putting me in the sixtieth percentile.  So at least I'm above average in something.

[Image licensed under the Creative Commons Neanderthal-Museum, Mettmann, Homo sapiens neanderthalensis-Mr. N, CC BY-SA 4.0]

What brings this up is some new research indicating that the overlap between Neanderthals and anatomically modern humans may have lasted longer than we thought, and it completely upends the picture of hordes of highly-advanced humans (led, of course, by Liam Hemsworth) sweeping over and destroying the primitive knuckle-dragging Neanderthal cave-dwellers.  Archaeologists working in the cave complex of Grotte Mandrin, in the Rhône Valley of France, came upon a child's tooth and some stone tools (both of a distinctly modern sort) dating from 54,000 years ago -- a good twelve thousand years earlier than previous estimates.

"It wasn't an overnight takeover by modern humans," said Chris Stringer, of the Natural History Museum of London, who co-authored the paper.  "Sometimes Neanderthals had the advantage, sometimes modern humans had the advantage, so it was more finely balanced...  We have this ebb and flow.  The modern humans appear briefly, then there's a gap where maybe the climate just finished them off and then the Neanderthals come back again...  [W]e don't know the full story yet.  But with more data and with more DNA, more discoveries, we will get closer to the truth about what really happened at the end of the Neanderthal era."

Human history (and prehistory) are a lot more complex than you'd think on first glance, and it bears keeping in mind that usually when we build up a picture of something that happened in the past, we're working on (very) incomplete data filled in with guesses and surmises.  If a time machine is ever invented, and we can go back and look for ourselves, I think we'd be astonished at how much we missed -- or got flat wrong.

It reminds me of the famous quote by H. L. Mencken -- "For every problem, there is an answer that is clear, simple... and wrong."

***************************************


Saturday, February 12, 2022

Artifishal

In the wonderful Star Trek: The Next Generation episode "Tapestry," Captain Picard's life is in danger because an accident damaged his artificial heart.  He'd received the biomechanical prosthesis decades earlier, because his original heart was irreparably damaged in a fight he got in when he was a young, cocky student at Starfleet Academy.  The inimitable Q offers Picard a choice -- to go back in time and change the circumstances that led to the fight -- meaning he'd have his own original heart, and the accident wouldn't lead to his death.  But, as such stories usually go, Picard finds out that rectifying one mistake doesn't necessarily lead to his life having a better trajectory -- and that perhaps a shorter, richer life, facing risks head-on, is better than one that lasts longer because of always playing it safe.


The replacement of the human heart by a machine, or by a biological and mechanical composite, is still in its earliest stages, and even a heart transplant from a compatible donor is iffy (although admittedly better than the alternative).  A study in 2013 found that the survival rate for heart recipients past twenty years post-surgery was about 26%, although that number has been rising steadily as the tissue matching protocols and the management of complications improve.  The hitch for heart recipients, of course, is that they have to wait for a matched donor to die; and not only that, to die in such a way that the heart itself isn't too damaged to transplant.

But what if someone who needed a heart could have one grown from the person's own cells?

That's where a fascinating bit of research out of Harvard University is pointing.  In a paper published in Science this week, a team led by Keel Yong Lee showed proof of concept by creating an artificial fish made of human heart stem cells.

The "biohybrid" was made by creating a finely-grooved, two-sided scaffolding on which were laid cardiac cells.  The cells aligned themselves with the grooves, growing into a pair of parallel sheets.  Grafted onto this was an autonomous pacing node -- a little like the heart's pacemaker -- which stimulated rhythmic contractions on opposite sides, allowing the "fish" to swim.  Best of all, as the cells matured, the fish got better and better at swimming, eventually reaching speeds and maneuverability comparable to a zebrafish, the species the biohybrid was modeled on.

"Our ultimate goal is to build an artificial heart to replace a malformed heart in a child," said Kit Parker, who was senior author of the paper, in a press release.  "Most of the work in building heart tissue or hearts, including some work we have done, is focused on replicating the anatomical features or replicating the simple beating of the heart in the engineered tissues.  But here, we are drawing design inspiration from the biophysics of the heart, which is harder to do.  Now, rather than using heart imaging as a blueprint, we are identifying the key biophysical principles that make the heart work, using them as design criteria, and replicating them in a system, a living, swimming fish, where it is much easier to see if we are successful."

We are nearing the point where faulty organs in our bodies can simply be replaced by biomechanical devices, not so far away from Jean-Luc Picard's heart.  This would obviate the nerve-wracking trauma of waiting an indefinite amount of time for a donor, and also the potential for tissue compatibility issues, as the organ would be built out of your own cells.

We're still a ways out, though.  The Lee et al. research demonstrates that it's possible to build functional, coordinated contractile tissue -- the first step in generating a working heart -- and, as we've seen so many times before, it's often an amazingly short time between showing that something is theoretically possible and its becoming a reality.

So once again, Star Trek has shown itself to be prescient.  I hope this keeps happening -- communicators (the ones in the original series even looked like flip-phones), voice-activated software, translation programs, and videoconferencing all appeared on Star Trek long before they became household items.  Now, I wish the scientists would get to work on transporters, replicators, and the holodeck.  Because those would be all kinds of cool, especially the holodeck, although I'd stand a significant chance of finding a program I liked better than reality and disappearing permanently.

*********************************

This week's Skeptophilia book-of-the-week combines cutting-edge astrophysics and cosmology with razor-sharp social commentary, challenging our knowledge of science and the edifice of scientific research itself: Chanda Prescod-Weinstein's The Disordered Cosmos: A Journey into Dark Matter, Spacetime, and Dreams Deferred.

Prescod-Weinstein is a groundbreaker; she's a theoretical cosmologist, and the first Black woman to achieve a tenure-track position in the field (at the University of New Hampshire).  Her book -- indeed, her whole career -- is born from a deep love of the mysteries of the night sky, but along the way she has had to get past roadblocks that were set in front of her based only on her gender and race.  The Disordered Cosmos is both a tribute to the science she loves and a challenge to the establishment to do better -- to face head-on the centuries-long horrible waste of talent and energy of anyone not a straight White male.

It's a powerful book, and should be on the to-read list for anyone interested in astronomy or the human side of science, or (hopefully) both.  And watch for Prescod-Weinstein's name in the science news.  Her powerful voice is one we'll be hearing a lot more from.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]