Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Thursday, January 9, 2020

Lying down with dogs

We love our dogs, but there are times when we look straight into their big brown eyes and say, "You are the reason we can't have nice things."

Of course, given that neither of our dogs is known for excessive brain power, they usually respond by wagging cheerfully.


Sometimes I think our dogs are not so much pets as a pair of home demolition experts.  Both of them track mud everywhere, a problem made worse by Guinness's love of swimming in our pond.


There's also the issue that when he chases his tennis ball -- his all-time favorite occupation, one that he is capable of doing for hours on end -- he performs his catches with all the grace and subtlety of a baseball player sliding into home.  The result is that he has torn our back lawn into a wasteland of rutted dirt, which in early spring when the snow melts turns into a giant mud puddle.  We've been renovating our walk-out basement, and while considering what flooring to put in, I suggested that we simply spread an enormous plastic tarp on the floor and call it good.

Carol felt that this didn't set the right aesthetic for our home, and I suppose she's right, but it would certainly be easier to keep clean.

And it'd be nice if tracking dirt everywhere was all they did.  Guinness (code name: El Destructo) has a great love of chewing stuff, and despite having approximately 1,485 chew toys, he is constantly finding stuff to tear up that isn't technically his.  So far, we've lost shoes, slippers, books, paintbrushes, pieces of unopened mail, a set of iPod headphones, and so many cardboard boxes that I've lost count.  He's also an accomplished counter surfer, and just a couple of weeks ago he snagged a half-pound of expensive French brie, something we still haven't quite forgiven him for.

I guess I didn't realize that when we picked him out at the shelter, we were on the Bad Doggie Aisle.  That'll teach me not to read the signs more carefully.

Anyhow, when we ask our dogs, "So, what good are you two, anyway?", they don't generally have any answer unless you count cheerful wagging.  But maybe they will now -- because two papers, one in Pediatric Allergy and Immunology and the other in PLOS ONE, have shown that dogs actually do have a positive contribution to make (above and beyond companionship), especially to the health of children.

In the first, "Early Exposure to Cats, Dogs and Farm Animals and the Risk of Childhood Asthma and Allergy," by a team led by Vincent Ojwang of Tampere University (Finland), we find that children living with dogs and/or cats when they're very young have a statistically significant lower chance of allergies, asthma, and eczema than children who don't.  The mechanism is poorly understood -- it may have something to do with early exposure to dirt and pet dander desensitizing children to harmless antigens -- but the effect was clear.  The sample size was nearly four thousand, so it's not an inconsequential result.  (Interestingly, the correlation with farm animals was uncertain, perhaps because farm animals aren't in the home and exposure to them is not only more limited, it's more likely to occur in the open air where concentrations of dust and dander are lower.)

The second, "Exposure to Household Pet Cats and Dogs in Childhood and Risk of Subsequent Diagnosis of Schizophrenia or Bipolar Disorder," by a team led by Robert Yolken of Johns Hopkins, found that (even when you control for other factors), a child who lived with a pet dog for a significant amount of time before age thirteen was 24% less likely to be diagnosed later with schizophrenia.  (There was no similar correlation with cat ownership; the reason is unclear.)

As with the allergy/asthma study, the mechanism behind this correlation is uncertain.  "Serious psychiatric disorders have been associated with alterations in the immune system linked to environmental exposures in early life, and since household pets are often among the first things with which children have close contact, it was logical for us to explore the possibilities of a connection between the two," Yolken said in an interview with Science Daily.  "Previous studies have identified early life exposures to pet cats and dogs as environmental factors that may alter the immune system through various means, including allergic responses, contact with zoonotic bacteria and viruses, changes in a home's microbiome, and pet-induced stress reduction effects on human brain chemistry...  [Some researchers] suspect that this immune modulation may alter the risk of developing psychiatric disorders to which a person is genetically or otherwise predisposed."

So I suppose I must grudgingly admit that our dogs actually might serve some purpose other than getting hair all over the sofa, barking at the UPS guy, and chasing away terrifying intruders like chipmunks.  Maybe we should credit their dirt-spreading capacity with the fact that our sons are both completely healthy and allergy-free.  At this point, though, since Carol and I are both clearly adults, they can lay off changing our home's microbiome.  I'll accept the risk of developing an allergy if I don't have to put up with Lena lying down next to me after having rolled in a rancid squirrel carcass.

******************************

This week's Skeptophilia book of the week is simultaneously one of the most dismal books I've ever read, and one of the funniest: Tom Phillips's wonderful Humans: A Brief History of How We Fucked It All Up.

I picked up a copy at the wonderful Strand Book Store when I was in Manhattan last week, and finished it in three days flat (and I'm not a fast reader).  To illustrate why, here's a quick passage that'll give you a flavor of it:
Humans see patterns in the world, we can communicate this to other humans and we have the capacity to imagine futures that don't yet exist: how if we just changed this thing, then that thing would happen, and the world would be a slightly better place. 
The only trouble is... well, we're not terribly good at any of those things.  Any honest assessment of humanity's previous performance on those fronts reads like a particularly brutal annual review from a boss who hates you.  We imagine patterns where they don't exist.  Our communication skills are, uh, sometimes lacking.  And we have an extraordinarily poor track record of failing to realize that changing this thing will also lead to the other thing, and that even worse thing, and oh God no now this thing is happening how do we stop it.
Phillips's clear-eyed look at our own unfortunate history is kept from sinking under its own weight by a sparkling wit, bringing our foibles into humorous focus while simultaneously sounding the alarm: "Okay, guys, it's time to pay attention."  Stupidity, they say, consists of doing the same thing over and over and expecting different results; Phillips's book points out how crucial that realization is -- and how we need to get up off our asses and, for god's sake, do something.

And you -- and everyone else -- should start by reading this book.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]





Wednesday, January 8, 2020

In the dark

You've all heard of dark matter, the strange stuff that comprises 85% of the total matter in the universe and about a quarter of its overall mass-energy.  Although its presence has been demonstrated in a variety of ways -- starting with the galactic rotation measurements that revealed it to astronomer Vera Rubin in 1978 (as I mentioned in yesterday's post) -- we're no nearer to understanding its nature than we were then.

Less well-known, and even more mysterious, is dark energy.  It's a little unfortunate that the monikers of these two strange phenomena sound so similar, because dark energy is entirely different from dark matter (both earned the sobriquet "dark" mainly because they've resisted all methods of direct detection, so we still haven't a damn clue what they are).  Dark energy is a peculiar (hypothesized) form of energy that permeates all of space, and it's responsible for the observation that the rate of expansion of the universe is accelerating.  Whatever it is, dark energy acts on matter as if something were pushing it, working opposite to the pull of gravity that would otherwise slow the expansion -- and, if matter were dense enough, eventually reverse it, ending the universe in a "Big Crunch."

Oh, and whatever it is, it looks like it's common.  Measurements based on the expansion rate of the universe put dark energy at roughly 68% of the total mass-energy of the universe.  Add dark matter's quarter-or-so, and that leaves ordinary matter and energy -- the kind we are made of and interact with on a daily basis -- at a mere 5% of the stuff in the universe.

Kind of humbling, isn't it?  If the data are correct, 95% of the mass-energy of the universe is made up of stuff we can't detect and don't understand.
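If you want to check that arithmetic yourself, it takes a few lines of Python.  A minimal sketch, using the rough figures quoted above (the exact percentages vary a little from survey to survey):

```python
# Rough mass-energy budget of the universe (the exact figures
# vary a little from survey to survey).
dark_energy = 0.68   # fraction of total mass-energy
dark_matter = 0.27   # "about a quarter"

ordinary = 1.0 - dark_energy - dark_matter
print(f"Ordinary matter and energy: {ordinary:.0%}")                   # ~5%
print(f"Stuff we can't detect:      {dark_energy + dark_matter:.0%}")  # ~95%
```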

[Image licensed under the Creative Commons Design Alex Mittelmann, Coldcreation, Lambda-Cold Dark Matter, Accelerated Expansion of the Universe, Big Bang-Inflation, CC BY-SA 3.0]

Well, maybe.  According to a press release two days ago from Yonsei University (Seoul, South Korea), scientists at the Center for Galaxy Evolution and Research are suggesting that the foundational assumption that led to the "discovery" of dark energy may simply be wrong.

I'm no astrophysicist, so I won't try to summarize the press release, but simply quote the salient paragraphs:
The most direct and strongest evidence for the accelerating universe with dark energy is provided by the distance measurements using type Ia supernovae (SN Ia) for the galaxies at high redshift.  This result is based on the assumption that the corrected luminosity of SN Ia through the empirical standardization would not evolve with redshift.

New observations and analysis made by a team of astronomers at Yonsei University (Seoul, South Korea), together with their collaborators at Lyon University and KASI, show, however, that this key assumption is most likely in error.  The team has performed very high-quality (signal-to-noise ratio ~175) spectroscopic observations to cover most of the reported nearby early-type host galaxies of SN Ia, from which they obtained the most direct and reliable measurements of population ages for these host galaxies.  They find a significant correlation between SN luminosity and stellar population age at a 99.5% confidence level.  As such, this is the most direct and stringent test ever made for the luminosity evolution of SN Ia.  Since SN progenitors in host galaxies are getting younger with redshift (look-back time), this result inevitably indicates a serious systematic bias with redshift in SN cosmology.  Taken at face values, the luminosity evolution of SN is significant enough to question the very existence of dark energy.  When the luminosity evolution of SN is properly taken into account, the team found that the evidence for the existence of dark energy simply goes away.
I don't know about you, but I read this with my mouth hanging open.  The idea that 68% of the mass-energy density of the universe could disappear if you alter the assumptions came as a bit of a shock.

It probably shouldn't have, of course, because this sort of thing has happened before.  There was phlogiston (the mysterious substance supposedly inherent in combustible matter) and the luminiferous aether (the mysterious substance through which light supposedly propagates in the vacuum of space), both of which turned out to be not so much mysterious as nonexistent.  Both vanished when the baseline assumptions changed -- in the first case when a correct theory of combustion was developed, and in the second when Einstein showed that light needs no medium to travel through.

And honestly, even if I'm shocked by the way the dark energy scenario is playing out, I've been half expecting something like this to happen.  A physicist friend of mine was chatting with me one day about dark matter and dark energy (as one does), and she said that just like the aether stuck around until Einstein came and blew away the need for it by changing the perspective, the same would happen with the strange and undetectable dark matter and dark energy.

"We're just waiting for this century's Einstein," she said.

But it seems like it might not even require something as groundbreaking as a Theory of Relativity, here, at least in the case of dark energy.  All it might take is reevaluating the data on supernova luminosity to remove the need for the hypothesis.

That would also explain why we haven't detected it.

But this, like any scientific claim, is bound to be challenged, especially considering that it's nixing 68% of the universe in one fell swoop.  So keep your eyes on the physics journals -- I'm sure you haven't heard the last of this.

And you can count on the new research casting some light on the darkness -- whatever the ultimate outcome.


Tuesday, January 7, 2020

Stretching the boundaries

Be honest: can you tell me anything about the following people?
  • Annie Jump Cannon
  • Jocelyn Bell Burnell
  • Henrietta Swan Leavitt
  • Williamina Fleming
  • Maria Mitchell
  • Ruby Payne-Scott
  • Nancy Roman
  • Vera Rubin
Okay, what about the following?
  • Nicolaus Copernicus
  • Johannes Kepler
  • Neil deGrasse Tyson
  • Stephen Hawking
  • William Herschel
  • Christiaan Huygens
  • Carl Sagan
  • Edwin Hubble
My guess is that the typical reader recognized six or seven people on the second list, and could probably have named a major contribution for at least five of them.  I'd also wager that the average recognition for the first list is one or two -- and that most people couldn't tell you what the accomplishments were for the ones they did recognize.

Okay, I admit, it's pretty obvious what I'm driving at, here.  I'm not known for my subtlety.  And lest you think I'm deliberately comparing some chosen-to-be-minor female astronomers with a list of male Big Names, here are the major contributions for the women on the first list.

Annie Jump Cannon (1863-1941) is responsible for the current stellar classification system, in which stars are categorized by their spectral output and temperature -- an achievement that was critical for our understanding of stellar evolution.  So when you're watching Star Trek: The Next Generation and Commander Data says, "It is a typical M-class star" -- yeah, that was Annie Jump Cannon's invention.  Oh, and did I mention that she wasn't just female in a time when women were virtually prohibited from becoming scientists, but she was almost completely deaf?  Remember that when you think about the obstacles you have to overcome to reach your goals and dreams.

Jocelyn Bell Burnell (b. 1943) is an astrophysicist from Northern Ireland who was responsible for the discovery and explanation of pulsars in 1967.  Her claim that they were rapidly-rotating neutron stars was at first dismissed -- the mysterious flashing signal was even half-jokingly dubbed "LGM" (for "Little Green Men") -- and she wasn't included in the 1974 Nobel Prize in Physics, which went to her thesis supervisor Antony Hewish (shared with Martin Ryle) for the pulsar research that confirmed her hypothesis.  (Her other awards, though, are too numerous to list here, and she showed her typical graciousness in accepting her exclusion from the Nobel, but it pissed off a slew of influential people and opened a lot of eyes about the struggles of women in science.)

Henrietta Swan Leavitt (1868-1921) was an American astronomer who discovered a seemingly trivial fact: that the bright/dark period of a class of variable stars called Cepheids is directly related to their intrinsic brightness.  She very quickly realized that this meant Cepheids could be used as "standard candles" -- light sources of known actual brightness -- to allow astronomers to figure out how far away stars are.  This understanding was half of the solution to the question of the age of the universe; combined with redshift measurements, it proved that the universe is expanding, and ultimately led to the Big Bang theory.
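To see why a standard candle is so powerful: compare how bright the star actually is (from Leavitt's period-luminosity law) to how bright it appears, and the distance falls right out.  Here's a minimal Python sketch of the standard distance-modulus calculation; the period-luminosity coefficients are illustrative modern values, not Leavitt's original fit:

```python
import math

def cepheid_distance_parsecs(period_days, apparent_mag):
    """Estimate the distance to a Cepheid from its pulsation period and its
    apparent magnitude.  The period-luminosity coefficients here are
    illustrative modern values, not Leavitt's original calibration."""
    # Leavitt law: longer period -> intrinsically brighter star
    absolute_mag = -2.43 * (math.log10(period_days) - 1.0) - 4.05
    # Distance modulus: m - M = 5*log10(d) - 5, with d in parsecs
    return 10 ** ((apparent_mag - absolute_mag + 5.0) / 5.0)

# A Cepheid pulsing with a 10-day period that appears at magnitude 10:
print(f"{cepheid_distance_parsecs(10.0, 10.0):,.0f} parsecs")  # ~6,500
```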

Williamina Fleming (1857-1911) was a Scottish astronomer who discovered (literally) thousands of astronomical objects, including the now-famous Horsehead Nebula.  She was one of the founding members of the "Harvard Computers," a group of women who took on the task of doing mathematical calculations using data from the Harvard Observatory -- a group formed after the observatory's director, Edward Pickering, declared that his Scottish housekeeper (Fleming herself) could do better work than his male assistants.

Maria Mitchell (1818-1889) was an American astronomer whose accomplishments were so many and varied that I could go on for pages just about her.  She was the first female professor of astronomy at an American college (Vassar), the first female editor of a column in Scientific American, was director of Vassar's observatory for twenty years, came up with the first good explanation for sunspots, pioneered investigations into stellar composition, and discovered (among other things) a comet before it was visible to the naked eye.  She was an incredibly inspiring teacher -- twenty-five of her students went on to be listed in Who's Who.  "I cannot expect to make astronomers," she once said to her class, "but I do expect that you will invigorate your minds by the effort at healthy modes of thinking.  When we are chafed and fretted by small cares, a look at the stars will show us the littleness of our own interests."

Ruby Payne-Scott (1912-1981) was an Australian scientist who became the first female radio astronomer.  She was responsible for linking the appearance of sunspots with radio bursts from the Sun, and was also instrumental in developing radar for detecting enemy planes during World War II.  She was not only an astronomer but a gifted physicist and electrical engineer, and made use of all three skill sets in her research -- but opportunities for women in science were so limited that she eventually left research altogether and became a secondary school teacher.  She never ceased fighting for women's voices in science, though, and in 2008 the Commonwealth Scientific and Industrial Research Organisation established the Payne-Scott Award in her honor to support women in science, especially those returning to the research world after taking time off for family.

Nancy Roman (1925-2018) was an American astronomer who was one of the first female executives at NASA, and who has been nicknamed the "Mother of Hubble" for her instrumental role in developing the Hubble Space Telescope.  She did pioneering work in the calculation of stellar velocities -- all this despite having been actively discouraged from pursuing a science career, most notably by a high school counselor when she said she'd like to take algebra instead of Latin.  The counselor sneered, "What kind of lady would take mathematics instead of Latin?"  Well, this lady would, and she went on to earn four honorary doctorates (in addition to the one she'd worked for), an Exceptional Scientific Achievement Medal from NASA, a fellowship in the American Association for the Advancement of Science, and many other awards.

Vera Rubin (1928-2016) was an American astronomer whose observation of anomalies in galactic rotation rates led to what might be the weirdest discovery in physics in the last hundred years -- "dark matter."  Her work, according to the New York Times, "usher[ed] in a Copernican-style change in astronomy," and the Carnegie Institution said after her death that the United States had "lost a national treasure."

Honestly, it's Rubin who got me thinking about all of this gender inequity, because I found out that last month the Large Synoptic Survey Telescope was renamed the Vera C. Rubin Observatory, and when I posted on social media how awesome this was, I had several people respond, "Okay, cool, but who is she?"  We like to pride ourselves on how far we've come in terms of equity, but man, we have a long way to go.  Famous straight white male scientists become household names; equally prestigious scientists who are women, LGBTQ, or people of color often become poorly-recognized footnotes.

Don't you think it's time for this to change?

The amazing Vera Rubin in 2009 [Image is in the Public Domain]

I know this is a battle we won't win overnight, but the dominance of straight white males in science has stifled an incredible amount of talent, hope, and skill, and we all ought to be working toward greater access and opportunity regardless of our own gender, skin color, or sexual orientation.  My little exercise in considering some female astronomers probably won't count for that much, but I'm hoping that it might open a few eyes, invert a few stereotypes, and stretch a few boundaries -- and whatever motion we can make in that direction is nothing but positive.


Monday, January 6, 2020

The weather report

Anyone who paid attention in ninth grade Earth Science class knows that climate and weather are not the same thing.

This, of course, means that we should be scrutinizing the high school transcripts of Donald Trump and the majority of his administration, because without fail you can count on a sneering comment about there being no such thing as anthropogenic climate change every time it snows in Buffalo.

The difference isn't even that hard to understand.  Climate is what you expect to get, weather is what you actually get.  Put more scientifically, climate is the overall averages and trends in a geographical region, and weather is the conditions that occur in a place at a particular time.  So a hot day no more proves the reality of climate change than a cold day disproves it; it's the changes of the average conditions over time that demonstrate to anyone with an IQ larger than their shoe size that something is going drastically wrong with the global climate, and that our penchant for burning fossil fuels is largely the cause.
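In fact, you can watch that distinction emerge from pure statistics.  Here's a minimal Python sketch -- the numbers are entirely made up for illustration -- of a slow warming trend buried under day-to-day weather noise a hundred times bigger.  No single day tells you much of anything; the decade-long averages tell you everything:

```python
import random

random.seed(42)

# Toy model: a 0.03 degree/year warming trend (climate) buried in
# day-to-day noise a hundred times larger (weather).  Numbers are
# invented purely for illustration.
def daily_temp(years_elapsed):
    trend = 0.03 * years_elapsed    # climate: the slow drift of the average
    noise = random.gauss(0.0, 3.0)  # weather: what you actually get today
    return 15.0 + trend + noise

for decade_start in (0, 10, 20, 30):
    days = [daily_temp(decade_start + d / 365.0) for d in range(3650)]
    print(f"Years {decade_start:2d}-{decade_start + 10}: "
          f"average {sum(days) / len(days):5.2f}, "
          f"coldest single day {min(days):5.1f}")
```

The averages creep steadily upward, but every decade still contains days cold enough for a disingenuous tweet.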

Well, it turns out the "weather is not climate" mantra itself needs an amendment.  Because a paper that came out last week in Nature has shown pretty conclusively that you can detect the fingerprint of climate change in the weather -- if you look at a large enough sampling on a particular day.

In "Climate Change Now Detectable from Any Single Day of Weather at Global Scale," climatologists Sebastian Sippel, Nicolai Meinshausen, Erich M. Fischer, EnikÅ‘ Székely, and Reto Knutti, of the Institute for Atmospheric and Climate Science of ETH Zürich decided to look at the assumptions implicit in Donald Trump's incessant tweeting every time there's a hard frost that climate change doesn't exist, and see if it really is possible to see the effects of climate change on a small scale.

And terrifyingly, it turns out that it is.

The authors write:
For generations, climate scientists have educated the public that ‘weather is not climate’, and climate change has been framed as the change in the distribution of weather that slowly emerges from large variability over decades.  However, weather when considered globally is now in uncharted territory.  Here we show that on the basis of a single day of globally observed temperature and moisture, we detect the fingerprint of externally driven climate change, and conclude that Earth as a whole is warming.  Our detection approach invokes statistical learning and climate model simulations to encapsulate the relationship between spatial patterns of daily temperature and humidity, and key climate change metrics such as annual global mean temperature or Earth’s energy imbalance.  Observations are projected onto this relationship to detect climate change.  The fingerprint of climate change is detected from any single day in the observed global record since early 2012, and since 1999 on the basis of a year of data.  Detection is robust even when ignoring the long-term global warming trend.  This complements traditional climate change detection, but also opens broader perspectives for the communication of regional weather events, modifying the climate change narrative: while changes in weather locally are emerging over decades, global climate change is now detected instantaneously.
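The "statistical learning" part is less exotic than it sounds.  Here's a toy Python sketch of the general fingerprinting idea -- emphatically not the authors' code or data, just an illustration: learn a linear map from one day's spatial temperature pattern to a global warming metric, then apply it to a single noisy day.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy fingerprint detection (NOT the Sippel et al. pipeline -- just the idea):
# learn a linear map from a day's spatial temperature pattern to a global
# warming metric, then estimate that metric from one new noisy day.
n_days, n_cells = 2000, 500
warming = np.linspace(0.0, 1.5, n_days)    # slow forced signal over the record
fingerprint = rng.normal(size=n_cells)     # fixed spatial pattern of the signal
weather = rng.normal(scale=5.0, size=(n_days, n_cells))  # big daily noise
X = np.outer(warming, fingerprint) + weather  # daily global temperature "maps"

# Ridge regression: w = (X^T X + lambda*I)^-1 X^T y
lam = 10.0
w = np.linalg.solve(X.T @ X + lam * np.eye(n_cells), X.T @ warming)

# One single day from the warm end of the record, noise and all:
one_day = 1.4 * fingerprint + rng.normal(scale=5.0, size=n_cells)
print(f"Warming estimated from a single day: {one_day @ w:.2f} (true: 1.40)")
```

Averaged over hundreds of grid cells, the noise largely cancels -- which is why a single day of global weather carries a detectable climate signal even though a single day at your house does not.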
So Trump's method of "look out of the window and check what the weather's like today" turns out to prove exactly the opposite of what he'd like everyone to believe.

I am simultaneously appalled and fascinated by the fact that there are still people who doubt anthropogenic climate change.  To start with, there is an overwhelming consensus amongst climatologists (i.e., the people who know what the hell they're talking about) that man-made global warming is a reality.  Note, by the way, that the scientists have always erred on the cautious side; back when I started my teaching career in the 1980s, the honest stance was that anthropogenic climate change was suspected, but very few scientists were willing to state it with certainty.

Now?  There's hardly a dissenting voice, with the exception of the "scientists" at the Heartland Institute, who coincidentally get their paychecks from the petroleum industry.

Hmm, I wonder why they're still arguing against it?  Funny thing, that.

But even more persuasive than the scientists -- after all, we're not known as a species for trusting the experts when the experts are saying something inconvenient -- there's the evidence of our own eyes.  In my own home of upstate New York, stop by our local coffee shop any morning you like and ask one of the old-timers if winters now are as bad as what they remember as a child.  One and all, they'll tell you about snowstorms and blizzards and so on, the last serious one of which happened back in 1993.  Yeah, we've had snowfalls since then -- this is the Northeast, after all -- but if you look back through the meteorological records from the early to mid 20th century, there is no question that we've trended toward milder winters.

Then there are the summertime droughts and heat waves, the most extreme of which is happening right now in Australia, where large parts of the country are burning to a crisp in the worst and most widespread series of wildfires in its recorded history.  Whole towns are being evacuated, and in some places the only safety people have found is piling their family members, pets, and belongings onto boats and waiting out the fires offshore.  The latest estimates are that 12.3 million acres have been charred in the last few months, and that half a billion wild animals have died.  Given the threatened status of a great many of Australia's endemic species, we might be witnessing, compressed into a few months, the simultaneous extinction of dozens of endangered plants and animals.

[Image is in the Public Domain]

But Trump and his administration, and their media mouthpieces at Fox News, have continued to feed people the lie that everything's okay, that we can continue polluting and burning gasoline and coal without any repercussions whatsoever.  Deregulate everything has become the battle cry.  Industry, they say, will regulate itself, no need to worry.

Because that worked out so well in the 1950s and 1960s, when the air in big cities was barely breathable, and there was so much industrial waste in the Cuyahoga River in Cleveland, Ohio that it caught fire not once but thirteen times.

The scientists, and concerned laypeople like myself, have been screaming "Will you people please wake up and do something!" for years now, to seemingly little effect.  "Everything's fine" is a comforting lie, especially since rejecting it means putting a crimp in our generally lavish lifestyles.

The problem is, the natural world has a nasty way of having the last word.  We often forget that there is no reason whatsoever that we couldn't completely wipe ourselves out, either through accident or neglect or outright willful fuckery, or some combination thereof.  For my kids' sake I hope we as a species get pulled up short in the very near future and come together to work toward a solution, because my sense is that time is short.  There comes a point when an avalanche has started and no power on Earth can stop it.  I just hope we're not there yet.

But such a point definitely exists, whether it's behind us or ahead of us.  And that by itself should scare the absolute shit out of every citizen of this Earth.

Maybe you still find yourself shrugging and saying, "Meh."  If so, you should shut off Fox News (permanently) and read a scientific paper or two.  Start with the Sippel et al. study I linked above.

If that doesn't convince you, I don't know what would.


Saturday, December 28, 2019

Prelude to a cataclysm

Dear Readers:

I'm going to be taking a short break next week.  However, I hope you'll continue to send me ideas for new posts -- I'll be back in the saddle again soon, and I always value your suggestions.

The next Skeptophilia post will be Monday, January 6.  A very Happy New Year to all of you, and I hope this century's Roaring Twenties bring you everything you desire!

cheers,

Gordon

*******************************

There'd be nothing like a good supernova to liven things up around here.

Far and away the most spectacular event in the universe, a supernova of a massive star releases more energy in a few seconds than our Sun will release in its entire lifetime.  The colossal explosion is set off by the exhaustion of the fuel in the star's core, a phenomenon that deserves a little more explanation.

Stars are kept in equilibrium by two forces: the outward pressure of the heat produced by fusion in the core, and the inward pull of gravity.  When the star runs out of fuel, the heat diminishes, and gravity wins -- causing a sudden collapse and a phenomenally quick heating of the star's atmosphere.
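For those who like their physics compact, that tug-of-war has a standard mathematical statement -- the condition of hydrostatic equilibrium, which must hold at every radius r inside a stable star:

\[ \frac{dP}{dr} = -\frac{G\,m(r)\,\rho(r)}{r^{2}} \]

Here P is the pressure, ρ(r) the density, m(r) the mass enclosed within radius r, and G the gravitational constant.  Fusion keeps the pressure gradient on the left large enough to balance the gravity term on the right; when the fuel runs out, it can't, and the core collapses in seconds.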

The result is a supernova, which temporarily outshines everything else in the near vicinity.  Actually, "outshine" is the wrong word; nearby star systems would be flash-fried, and even at a relatively safe distance the high-energy electromagnetic radiation could severely damage a planet's atmosphere.  (Just as a clarification, I'm talking about planets in other star systems; if there were planets in the supernova's system, they'd be instantaneously vaporized.)

The collapsed core of the star then becomes either a neutron star or a black hole, depending on the star's initial mass.  The exploded remnants continue to glow brightly for several months, before finally cooling, fading, and disappearing from the night sky.

As it happens, we've got a good candidate for a supernova not too far away (well, 640 light years away, which isn't exactly next door, but is still close by astronomical standards).  It's called Betelgeuse, and it's the familiar star on Orion's right shoulder.  A red supergiant, the star is about eleven times the mass of the Sun (putting it in the "neutron star" range after it blows itself to smithereens).  However, volume-wise, it's enormous; if you put Betelgeuse where the Sun is, its edge would be somewhere between the orbits of Jupiter and Saturn.

Yes, that's what it sounds like.  If Betelgeuse replaced the Sun, we here on the Earth would be inside the star.

The constellation of Orion; that's Betelgeuse in the upper left [Image licensed under the Creative Commons Rogelio Bernal Andreo, Orion Head to Toe, CC BY-SA 3.0]

Betelgeuse has long been known as one of the better supernova candidates that are relatively close by.  Asked when it's going to explode, though, astronomers have always played it cagey; could be tomorrow, could be a hundred thousand years from now.  But its recent behavior has made a lot of scientists wonder if the actual date of the explosion might not be closer to the "tomorrow" end of the spectrum than we'd thought.

The star has been exhibiting some odd behavior lately.  It has long been known as a variable star, varying in magnitude between about 0.0 and +0.5 (the bigger the number, the fainter the star).  This means it oscillates between being the fifth brightest star in the night sky and the tenth, with the period of its variation averaging a little over a year.  But in the last few months, it has defied expectations, dimming to a magnitude of +1.3 and dropping to 23rd place on the list of brightest stars.

Could this herald the beginning of the collapse that initiates the supernova?  Could be, but the truth is, we don't know.  Supernovae are uncommon events, and nearby ones nearly unheard of -- the last one observed in our own galaxy was "Kepler's Star," the 1604 supernova in the constellation of Ophiuchus.  So what the leadup will look like, we aren't really sure.

What's certain is that this is unprecedented, at least since we've kept detailed records.  It merited a press release from the Villanova University Department of Astronomy three weeks ago, so even the astronomers -- ordinarily the most cautious of scientists -- are admitting that something's up.

Now, we still don't know what's going to happen.  Like I said, we've never been able to observe the events leading up to a supernova before.  But you can bet that the astrophysicists are paying close attention.

And with good reason.  If Betelgeuse went supernova -- no, correction, when Betelgeuse goes supernova -- it's going to be spectacular.  It's estimated that it will brighten to a magnitude of -10, which (for reference) is about a tenth the brightness of the full Moon, but well over a hundred times the brightness of the planet Venus.  It will be easily visible during the day and will provide enough light to read by at night.  And this won't be a blink-and-you-miss-it occurrence; the supernova will fade only gradually, over a period of eight to nine months, and during that time it will be (other than the Sun and the full Moon) the brightest thing in the sky.
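Those comparisons are easy to check, because the magnitude scale -- backwards as it is -- is just a logarithm: a difference of five magnitudes is a factor of a hundred in brightness.  A quick Python sketch, using the figures above plus the standard magnitudes of the full Moon (about -12.7) and Venus (about -4.6):

```python
def brightness_ratio(mag_a, mag_b):
    """How many times brighter object A is than object B.
    Five magnitudes = a factor of 100; smaller numbers are brighter."""
    return 100.0 ** ((mag_b - mag_a) / 5.0)

supernova, full_moon, venus = -10.0, -12.7, -4.6

print(f"Supernova vs. Venus:     {brightness_ratio(supernova, venus):.0f}x brighter")   # ~145x
print(f"Full Moon vs. supernova: {brightness_ratio(full_moon, supernova):.0f}x brighter")  # ~12x

# And Betelgeuse's recent dimming, from magnitude 0.0 down to +1.3:
print(f"Dimming factor: {brightness_ratio(0.0, 1.3):.1f}x")  # ~3.3x
```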

And I can say unequivocally that I hope it happens really soon.

It'd be nice to have something happen out there in the universe to take our minds off of how screwed up things are down here, you know?  It'd be a good reminder that there are bigger and more powerful things than we are, and our petty little squabbles are really pretty minuscule by comparison.

So as far as I'm concerned: bring on the supernova.

********************************

As technology has improved, so has our ability to bring that technology to bear on scientific questions, sometimes in unexpected ways.

In the fascinating new book Archaeology from Space: How the Future Shapes Our Past, archaeologist Sarah Parcak gives an eye-opening look at how satellite photography has revolutionized her field.  Using detailed photographs from space, including thousands of recently declassified military surveillance photos, Parcak and her colleagues have located hundreds of exciting new sites that were previously completely unknown -- roads, burial sites, fortresses, palaces, tombs, even pyramids.

These advances are giving us a lens into our own distant past, and allowing investigation of inaccessible or dangerous sites from a safe distance -- and at a phenomenal level of detail.  This book is a must-read for any student of history -- or for anyone who'd just like to find out how far we've come from the days of Heinrich Schliemann and the excavation of Troy.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]





Friday, December 27, 2019

Canine mathematics

I remember a while back reading an interesting paper that concluded that dogs have a concept of fairness and morality.

There have been a number of studies confirming this, most strikingly an investigation involving border collies.  Pairs of dogs were trained to do a task, then rewarded with doggie biscuits.  The thing was, Dog #1 was rewarded for correctly doing the task with one biscuit, and Dog #2 with two biscuits for doing the same task.

Within a few rounds, Dog #1 refused to cooperate.  "I'm not working for one biscuit when he gets two," seemed to be the logic.  So -- amazing as it seems -- at least some dogs understand fair play, and will forego getting a treat at all if another dog is getting more.

It also implies an understanding of quantity.  Now, "two is more than one" isn't exactly differential calculus, but it does suggest that dogs have at least a rudimentary numeracy.  The evolutionary advantage of a sense of quantity is obvious; if you can do a quick estimate of the number of predators chasing you, or the size of the herd of antelope you're chasing, you have a better sense of your own safety (and such decisions as when to flee, when to attack, when to hide, and so on).

Guinness, either pondering Fermat's Last Theorem or else trying to figure out how to open the kitchen door so he can swipe the cheese on the counter

But how complex dogs' numerical ability is has proven to be rather difficult to study.  Which is why I found a paper last week in Biology Letters so fascinating.

Entitled, "Canine Sense of Quantity: Evidence for Numerical Ratio-Dependent Activation in Parietotemporal Cortex," by Lauren S. Aulet, Veronica C. Chiu, Ashley Prichard, Mark Spivak, Stella F. Lourenco, and Gregory S. Berns, of Emory University, this study showed that when dogs are confronted with stimuli differing only in quantity, they process that information in the same place in their brains that we use when doing numerical approximation.

The authors write:
The approximate number system (ANS), which supports the rapid estimation of quantity, emerges early in human development and is widespread across species.  Neural evidence from both human and non-human primates suggests the parietal cortex as a primary locus of numerical estimation, but it is unclear whether the numerical competencies observed across non-primate species are subserved by similar neural mechanisms.  Moreover, because studies with non-human animals typically involve extensive training, little is known about the spontaneous numerical capacities of non-human animals. To address these questions, we examined the neural underpinnings of number perception using awake canine functional magnetic resonance imaging.  Dogs passively viewed dot arrays that varied in ratio and, critically, received no task-relevant training or exposure prior to testing.  We found evidence of ratio-dependent activation, which is a key feature of the ANS, in canine parietotemporal cortex in the majority of dogs tested.  This finding is suggestive of a neural mechanism for quantity perception that has been conserved across mammalian evolution.
The coolest thing about this study is that they controlled for stimulus area, which was the first thing I thought of when I read about the experimental protocol.  What I mean by this is that if you keep the size of the objects the same, a greater number of them has a greater overall area, so it might be that the dogs were estimating the area taken up by the dots and not the number.  But the researchers cleverly designed the arrays so that although the number of dots varied from screen to screen, the total area they covered was the same.
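If you're curious what controlling for area actually involves, here's a minimal Python sketch of the idea -- my own toy version, not the actual stimuli from the paper: vary the number of dots while holding the total dot area constant, so numerosity is the only cue left.

```python
import math
import random

def dot_array(n_dots, total_area=10000.0, field=500.0):
    """Generate n_dots circles at random positions in a square field,
    sized so their areas always sum to total_area.  (A toy version of an
    area-controlled numerosity stimulus, not the Aulet et al. stimuli.)"""
    radius = math.sqrt((total_area / n_dots) / math.pi)
    return [(random.uniform(radius, field - radius),   # x
             random.uniform(radius, field - radius),   # y
             radius)
            for _ in range(n_dots)]

# Two arrays, 4 dots vs. 16 dots, with identical cumulative dot area:
for n in (4, 16):
    dots = dot_array(n)
    total = sum(math.pi * r ** 2 for _, _, r in dots)
    print(f"{n:2d} dots, total area = {total:.0f}")  # both ~10000
```

With the total area pinned, a dog (or a brain region) that responds differently to the two arrays has to be responding to number, not to the amount of white on the screen.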

And, amazing as it sounds, dogs not only register the approximate quantity of dots spontaneously -- with no training and no reward -- they apparently do it with the same part of their brains we use for analogous tasks.

"We went right to the source, observing the dogs' brains, to get a direct understanding of what their neurons were doing when the dogs viewed varying quantities of dots," said study lead author Lauren Aulet, in a press release in Science Daily.  "That allowed us to bypass the weaknesses of previous behavioral studies of dogs and some other species...  Part of the reason that we are able to do calculus and algebra is because we have this fundamental ability for numerosity that we share with other animals.  I'm interested in learning how we evolved that higher math ability and how these skills develop over time in individuals, starting with basic numerosity in infancy."

I wonder, though, how this would work with our dogs.  As I've mentioned before, Lena (our coonhound) has the IQ of a lima bean, and even has a hard time mastering concepts like the fact that the dog in the pond she barks at incessantly is actually her own reflection and not an Evil Underwater Dog Who Has Invaded Her Territory.  Guinness is smarter (not that the bar was set that high), but I don't know how aware of quantity he is.  He's more of an opportunist who will take advantage of any situation that presents itself, be it a single CheezDoodle someone dropped on the floor or (as happened two days ago) a half-pound of expensive French brie that was left unguarded for five minutes on the coffee table.

I doubt he worried about quantity in either case, frankly.

But the Aulet et al. study is fascinating, and it clues us in that the origins of numeracy in our brains go back a long, long way.  The most recent common ancestor of humans and dogs lived on the order of eighty million years ago -- predating the extinction of the dinosaurs by fourteen million years -- so that numerical brain area must be at least that old, and is probably shared by most mammalian species.  It's a little humbling to think that a lot of the abilities we humans pride ourselves on are shared, at least on a basic level, with our near relatives.

But now y'all'll have to excuse me, because Lena wants to go outside.  I guess it's time for her to check and see if the Water Dog has returned.  She's a sneaky one, that Water Dog.


Thursday, December 26, 2019

Dancing down from the past

It will come as no great surprise to anyone who knows me that I've struggled to overcome my shyness and inhibitions.

One of the ways this manifested was a reluctance to dance.  Dancing requires a willingness not only to get yourself out there on the dance floor, but to lose your self-consciousness and move to the music.  If you're constantly watching yourself, wondering what others are thinking, you'll never loosen up enough to be able to dance -- and as a result, you will move awkwardly.

Self-fulfilling prophecy, that.

I'd always advocated throwing caution to the wind and enjoying yourself, while for some reason being unable to apply that same standard to myself.  But something shifted at the retreat I attended a month ago, about which I have written already.  The first night of the retreat, the leader said that one of the things we were going to be doing a lot of was dancing.  It's a really primal activity, he said, and is amazing for getting you out of your own head.

Well, my first reaction was panic.  The voice in my mind said, loud and clear, "YOU CAN'T DO THIS."  But as I related in my post, I did, and it was an amazing experience.  He was exactly right.  Dancing is freeing and exhilarating in a way very little else is.

Being a biologist, I got to wondering why.  Dancing involves moving your body, sure, but so do a lot of other things; and I can tell you in no uncertain terms that weed-whacking along the fence is equally physical, but is the opposite of exhilarating.  Music plays into it, of course, but I can also listen to music without that euphoric feeling occurring (although, as I've also written about before here at Skeptophilia, I do have a very visceral and emotional reaction to certain music, another phenomenon that seems to have a neurological basis).

But put the two together -- music and movement -- and you have an extremely powerful combination.

Greg Sample and Jennita Russo, of Deyo Dancers [Image licensed under the Creative Commons Barry Goyette from San Luis Obispo, USA, Two dancers, CC BY 2.0]

Why exactly this synergy happens is a matter of conjecture, but what is certain is that it goes back a long way in our evolutionary history.  A paper that came out last week in Proceedings of the National Academy of Sciences, by Yuko Hattori and Masaki Tomonaga of Kyoto University, shows that when chimpanzees are exposed to music, or even rhythmic sounds, they respond with something that looks very much like rudimentary dance.

"I was shocked," Hattori said to Eve Frederick, writing for Science.  "I was not aware that without any training or reward, a chimpanzee would spontaneously engage with the sound."

The authors write:
Music and dance are universal across human culture and have an ancient history.  One characteristic of music is its strong influence on movement.  For example, an auditory beat induces rhythmic movement with positive emotions in humans from early developmental stages.  In this study, we investigated if sound induced spontaneous rhythmic movement in chimpanzees.  Three experiments showed that: 1) an auditory beat induced rhythmic swaying and other rhythmic movements, with larger responses from male chimpanzees than female chimpanzees; 2) random beat as well as regular beat induced rhythmic swaying and beat tempo affected movement periodicity in a chimpanzee in a bipedal posture; and 3) a chimpanzee showed close proximity to the sound source while hearing auditory stimuli.  The finding that male chimpanzees showed a larger response to sound than female chimpanzees was consistent with previous literature about “rain dances” in the wild, where male chimpanzees engage in rhythmic displays when hearing the sound of rain starting...  These results suggest some biological foundation for dancing existed in the common ancestor of humans and chimpanzees ∼6 million years ago.  As such, this study supports the evolutionary origins of musicality.
Of course, this still doesn't answer what its evolutionary significance is; if I had to guess, it probably has to do with social cohesion and pair bonding, much as it does in humans.  But it's absolutely fascinating that the roots of dance go back at least to our last common ancestor with chimps, which would be between six and seven million years ago.

All of which makes me a little sad for what I missed in all those years I was too inhibited to dance.  I'll end with a quote from writer and humorist Dave Barry, which seems apt: "No one cares if you can't dance well.  Get up and dance."
