Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Tuesday, January 7, 2020

Stretching the boundaries

Be honest, can you tell me anything about the following people?
  • Annie Jump Cannon
  • Jocelyn Bell Burnell
  • Henrietta Swan Leavitt
  • Williamina Fleming
  • Maria Mitchell
  • Ruby Payne-Scott
  • Nancy Roman
  • Vera Rubin
Okay, what about the following?
  • Nicolaus Copernicus
  • Johannes Kepler
  • Neil deGrasse Tyson
  • Stephen Hawking
  • William Herschel
  • Christiaan Huygens
  • Carl Sagan
  • Edwin Hubble
My guess is that the typical reader recognized six or seven people on the second list, and could probably have named a major contribution for at least five of them.  I'd also wager that the average recognition for the first list is one or two -- and that most people couldn't tell you what the accomplishments were for the ones they did recognize.

Okay, I admit, it's pretty obvious what I'm driving at, here.  I'm not known for my subtlety.  And lest you think I'm deliberately comparing some chosen-to-be-minor female astronomers with a list of male Big Names, here are the major contributions for the women on the first list.

Annie Jump Cannon (1863-1941) is responsible for the current stellar classification system, in which stars are categorized by their spectral output and temperature -- an achievement that was critical for our understanding of stellar evolution.  So when you're watching Star Trek: The Next Generation and Commander Data says, "It is a typical M-class star" -- yeah, that was Annie Jump Cannon's invention.  Oh, and did I mention that she wasn't just female in a time when women were virtually prohibited from becoming scientists, but she was almost completely deaf?  Remember that when you think about the obstacles you have to overcome to reach your goals and dreams.

Jocelyn Bell Burnell (b. 1943) is an astrophysicist from Northern Ireland who was responsible for the discovery and explanation of pulsars in 1967.  Her claim that they were rapidly-rotating neutron stars was at first dismissed -- some scientists even derided the data itself, calling her discovery of the flashing star "LGM" (Little Green Men) -- and she wasn't included in the 1974 Nobel Prize awarded to scientists involved in the research that confirmed her hypothesis.  (Her other awards, though, are too numerous to list here, and she showed her typical graciousness in accepting her exclusion from the Nobel, but it pissed off a slew of influential people and opened a lot of eyes about the struggles of women in science.)

Henrietta Swan Leavitt (1868-1921) was an American astronomer who discovered a seemingly trivial fact -- that the period of the bright/dark cycle of a class of variable star, the Cepheid variables, is tightly related to the star's intrinsic brightness: the longer the period, the more luminous the star.  She very quickly realized that this meant Cepheids could be used as "standard candles" -- light sources of known actual brightness -- to allow astronomers to figure out how far away stars are.  This understanding was half of the solution to the question of the age of the universe; combined with redshift measurements, it showed that the universe is expanding, and ultimately led to the Big Bang theory.
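(If you want to see how powerful the "standard candle" trick is, here's a minimal back-of-the-envelope sketch -- my own illustration with made-up numbers, not anything from Leavitt's actual work.  Once the pulsation period tells you a Cepheid's absolute magnitude, the standard distance-modulus formula hands you the distance.)

```python
def cepheid_distance_parsecs(apparent_mag, absolute_mag):
    """Distance from the distance-modulus relation:
    m - M = 5 * log10(d / 10 pc), so d = 10 ** ((m - M + 5) / 5) parsecs."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# Made-up example: a Cepheid whose period implies an intrinsic (absolute)
# magnitude of -4, observed from Earth at apparent magnitude +11.
print(cepheid_distance_parsecs(11.0, -4.0))   # ~10,000 parsecs (~33,000 light years)
```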

Williamina Fleming (1857-1911) was a Scottish astronomer who discovered (literally) thousands of astronomical objects, including the now-famous Horsehead Nebula.  She was one of the founding members of the "Harvard Computers," a group of women who took on the task of doing mathematical calculations using data from the Harvard Observatory -- a group formed after the observatory's director, Edward Pickering, declared that his housekeeper (Fleming herself) could do better work than his male assistants.

Maria Mitchell (1818-1889) was an American astronomer whose accomplishments were so many and varied that I could go on for pages just about her.  She was the first female professor of astronomy at an American college (Vassar), the first female editor of a column in Scientific American, was director of Vassar's observatory for twenty years, came up with the first good explanation for sunspots, pioneered investigations into stellar composition, and discovered (among other things) a comet before it was visible to the naked eye.  She was an incredibly inspiring teacher -- twenty-five of her students went on to be listed in Who's Who.  "I cannot expect to make astronomers," she once said to her class, "but I do expect that you will invigorate your minds by the effort at healthy modes of thinking.  When we are chafed and fretted by small cares, a look at the stars will show us the littleness of our own interests."

Ruby Payne-Scott (1912-1981) was an Australian scientist who became the first female radio astronomer.  She was responsible for linking the appearance of sunspots with radio bursts from the Sun, and was also instrumental in developing radar for detecting enemy planes during World War II.  She was not only an astronomer but a gifted physicist and electrical engineer, and made use of all three disciplines in her research -- but opportunities for women in science were so limited that she eventually left research astronomy and became a secondary school teacher.  She never ceased fighting for women's voices in science, though, and in 2008 the Commonwealth Scientific and Industrial Research Organisation established the Payne-Scott Award in her honor to support women in science, especially those returning to the research world after taking time for maternity leave.

Nancy Roman (1925-2018) was an American astronomer who was one of the first female executives at NASA, and who has been nicknamed the "Mother of Hubble" for her instrumental role in developing the Hubble Space Telescope.  She did pioneering work in the calculation of stellar velocities -- all this despite having been actively discouraged from pursuing a science career, most notably by a high school counselor when she said she'd like to take algebra instead of Latin.  The counselor sneered, "What kind of lady would take mathematics instead of Latin?"  Well, this lady would, and went on to receive four honorary doctorates (in addition to the one she earned), NASA's Exceptional Scientific Achievement Medal, a fellowship in the American Association for the Advancement of Science, and many other awards.

Vera Rubin (1928-2016) was an American astronomer whose observation of anomalies in galactic rotation rates led to what might be the weirdest discovery in physics in the last hundred years -- "dark matter."  Her work, according to the New York Times, "usher[ed] in a Copernican-style change in astronomy," and the Carnegie Institution said after her death that the United States had "lost a national treasure."

Honestly, it's Rubin who got me thinking about all of this gender inequity, because I found out that last month the Large Synoptic Survey Telescope was renamed the Vera C. Rubin Observatory, and when I posted on social media how awesome this was, I had several people respond, "Okay, cool, but who is she?"  We like to pride ourselves on how far we've come in terms of equity, but man, we have a long way to go.  Famous straight white male scientists become household names; equally prestigious scientists who are women, LGBTQ, or people of color often become poorly-recognized footnotes.

Don't you think it's time for this to change?

The amazing Vera Rubin in 2009 [Image is in the Public Domain]

I know this is a battle we won't win overnight, but the dominance of straight white males in science has stifled an incredible amount of talent, hope, and skill, and we should all be working toward greater access and opportunity regardless of our own gender, skin color, or sexual orientation.  My little exercise in considering some female astronomers probably won't count for that much, but I'm hoping that it might open a few eyes, invert a few stereotypes, and stretch a few boundaries -- and whatever motion we can make in that direction is nothing but positive.

******************************

This week's Skeptophilia book of the week is simultaneously one of the most dismal books I've ever read, and one of the funniest; Tom Phillips's wonderful Humans: A Brief History of How We Fucked It All Up.

I picked up a copy of it at the wonderful book store The Strand when I was in Manhattan last week, and finished it in three days flat (and I'm not a fast reader).  To illustrate why, here's a quick passage that'll give you a flavor of it:
Humans see patterns in the world, we can communicate this to other humans and we have the capacity to imagine futures that don't yet exist: how if we just changed this thing, then that thing would happen, and the world would be a slightly better place. 
The only trouble is... well, we're not terribly good at any of those things.  Any honest assessment of humanity's previous performance on those fronts reads like a particularly brutal annual review from a boss who hates you.  We imagine patterns where they don't exist.  Our communication skills are, uh, sometimes lacking.  And we have an extraordinarily poor track record of failing to realize that changing this thing will also lead to the other thing, and that even worse thing, and oh God no now this thing is happening how do we stop it.
Phillips's clear-eyed look at our own unfortunate history is kept from sinking under its own weight by a sparkling wit, calling our foibles into humorous focus but simultaneously sounding the call that "Okay, guys, it's time to pay attention."  Stupidity, they say, consists of doing the same thing over and over and expecting different results; Phillips's wonderful book points out how crucial that realization is -- and how we need to get up off our asses and, for god's sake, do something.

And you -- and everyone else -- should start by reading this book.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]





Monday, January 6, 2020

The weather report

Anyone who paid attention in ninth grade Earth Science class knows that climate and weather are not the same thing.

This, of course, means that we should be scrutinizing the high school transcripts of Donald Trump and the majority of his administration, because without fail you can count on a sneering comment about there being no such thing as anthropogenic climate change every time it snows in Buffalo.

The difference isn't even that hard to understand.  Climate is what you expect to get; weather is what you actually get.  Put more scientifically, climate refers to the overall averages and trends in a geographical region, while weather is the set of conditions occurring in a particular place at a particular time.  So a hot day no more proves the reality of climate change than a cold day disproves it; it's the change in the average conditions over time that demonstrates to anyone with an IQ larger than their shoe size that something is going drastically wrong with the global climate, and that our penchant for burning fossil fuels is largely the cause.
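To put a toy number on that distinction, here's a quick simulation -- purely illustrative, with an invented warming trend and invented day-to-day scatter, not real data.  Even while the long-term average marches steadily upward, plenty of individual freezing days still show up, which is exactly why "it snowed in Buffalo" tells you nothing.

```python
import random

random.seed(42)

years = 50
warming_per_year = 0.03        # assumed trend, degrees C per year
daily_scatter = 8.0            # assumed day-to-day weather variability, degrees C

cold_days = 0
annual_means = []
for year in range(years):
    mean_temp = 10.0 + warming_per_year * year   # "climate": the slow trend
    temps = [random.gauss(mean_temp, daily_scatter) for _ in range(365)]
    cold_days += sum(1 for t in temps if t < 0)  # "weather": individual freezing days
    annual_means.append(sum(temps) / len(temps))

print(f"Freezing days over {years} years: {cold_days}")
print(f"Mean of first decade: {sum(annual_means[:10]) / 10:.2f} C")
print(f"Mean of last decade:  {sum(annual_means[-10:]) / 10:.2f} C")
```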

Well, we might have to amend that last paragraph.  Because a paper that came out last week in Nature has shown pretty conclusively that you can detect the fingerprint of climate change in the weather -- if you look at a large enough sampling on a particular day.

In "Climate Change Now Detectable from Any Single Day of Weather at Global Scale," climatologists Sebastian Sippel, Nicolai Meinshausen, Erich M. Fischer, Enikő Székely, and Reto Knutti, of the Institute for Atmospheric and Climate Science of ETH Zürich decided to look at the assumptions implicit in Donald Trump's incessant tweeting every time there's a hard frost that climate change doesn't exist, and see if it really is possible to see the effects of climate change on a small scale.

And terrifyingly, it turns out that it is.

The authors write:
For generations, climate scientists have educated the public that ‘weather is not climate’, and climate change has been framed as the change in the distribution of weather that slowly emerges from large variability over decades.  However, weather when considered globally is now in uncharted territory.  Here we show that on the basis of a single day of globally observed temperature and moisture, we detect the fingerprint of externally driven climate change, and conclude that Earth as a whole is warming.  Our detection approach invokes statistical learning and climate model simulations to encapsulate the relationship between spatial patterns of daily temperature and humidity, and key climate change metrics such as annual global mean temperature or Earth’s energy imbalance.  Observations are projected onto this relationship to detect climate change.  The fingerprint of climate change is detected from any single day in the observed global record since early 2012, and since 1999 on the basis of a year of data.  Detection is robust even when ignoring the long-term global warming trend.  This complements traditional climate change detection, but also opens broader perspectives for the communication of regional weather events, modifying the climate change narrative: while changes in weather locally are emerging over decades, global climate change is now detected instantaneously.
So Trump's method of "look out of the window and check what the weather's like today" turns out to prove exactly the opposite of what he'd like everyone to believe.
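For the curious, here's a minimal sketch of the general idea behind that kind of "fingerprinting" -- emphatically not the authors' actual pipeline, just a toy version with synthetic data: train a regression model to map one day's spatial pattern of temperature anomalies onto the underlying global-mean warming signal, then project a new day of "weather" onto that learned relationship.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the real data: each row is one day's spatial pattern of
# temperature anomalies at 200 grid points; the target is the slow global-mean
# warming signal buried under large daily weather noise.
n_days, n_gridpoints = 3000, 200
warming_signal = np.linspace(0.0, 1.2, n_days)          # assumed long-term trend
spatial_fingerprint = rng.normal(size=n_gridpoints)     # pattern that scales with warming
weather_noise = rng.normal(scale=3.0, size=(n_days, n_gridpoints))
X = np.outer(warming_signal, spatial_fingerprint) + weather_noise
y = warming_signal

# Closed-form ridge regression: learn the mapping from a single day's
# spatial pattern to the underlying global-mean signal.
lam = 10.0
w = np.linalg.solve(X.T @ X + lam * np.eye(n_gridpoints), X.T @ y)

# "Project" one new day of weather onto the learned relationship.
# This day was generated with an assumed warming level of 1.0 degree.
new_day = 1.0 * spatial_fingerprint + rng.normal(scale=3.0, size=n_gridpoints)
print(f"Estimated warming from a single day of weather: {new_day @ w:.2f}")
```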

I am simultaneously appalled and fascinated by the fact that there are still people who doubt anthropogenic climate change.  To start with, there is a near-universal consensus amongst the climatologists (i.e., the people who know what the hell they're talking about) that man-made global warming is a reality.  Note, by the way, that the scientists have always erred on the cautious side; back when I started my teaching career in the 1980s, the truthful stance was that anthropogenic climate change was suspected, but very few scientists were willing to state it with certainty.

Now?  There's hardly a dissenting voice, with the exception of the "scientists" at the Heartland Institute, who coincidentally get their paychecks from the petroleum industry.

Hmm, I wonder why they're still arguing against it?  Funny thing, that.

But even more persuasive than the scientists -- after all, we're not known as a species for trusting the experts when the experts are saying something inconvenient -- is the evidence of our own eyes.  Here in my home region of upstate New York, stop by the local coffee shop any morning you like and ask one of the old-timers whether winters now are as bad as the ones they remember from childhood.  One and all, they'll tell you about snowstorms and blizzards, the last serious one of which happened back in 1993.  Yeah, we've had snowfalls since then -- this is the Northeast, after all -- but if you look back through the meteorological records from the early to mid twentieth century, there is no question that we've trended toward milder winters.

Then there are the summertime droughts and heat waves, the most extreme of which is happening right now in Australia.  Large parts of Australia are currently burning to a crisp in the worst and most widespread series of wildfires in the country's recorded history.  Whole towns are being evacuated, and in some places the only safety people have found is piling their family members, pets, and belongings onto boats and waiting out the fires offshore.  The latest estimates are that 12.3 million acres have been charred in the last few months, and that half a billion wild animals have died.  Given the threatened status of a great many of Australia's endemic species, we might be witnessing, over the span of a few months, the simultaneous extinction of dozens of endangered plants and animals.

[Image is in the Public Domain]

But Trump and his administration, and their media mouthpieces at Fox News, have continued to feed people the lie that everything's okay, that we can continue polluting and burning gasoline and coal without any repercussions whatsoever.  Deregulate everything has become the battle cry.  Industry, they say, will regulate itself, no need to worry.

Because that worked out so well in the 1950s and 1960s, when the air in big cities was barely breathable, and there was so much industrial waste in the Cuyahoga River in Cleveland, Ohio that it caught fire not once but thirteen times.

The scientists, and concerned laypeople like myself, have been screaming "Will you people please wake up and do something!" for years now, to seemingly little effect.  "Everything's fine" is a comforting lie, especially since rejecting it means putting a crimp in our generally lavish lifestyles.

The problem is, the natural world has a nasty way of having the last word.  We often forget that there is no reason whatsoever that we couldn't completely wipe ourselves out, either through accident or neglect or outright willful fuckery, or some combination thereof.  For my kids' sake I hope we as a species get pulled up short in the very near future and come together to work toward a solution, because my sense is that time is short.  There comes a point when an avalanche has started and no power on Earth can stop it.  I just hope we're not there yet.

But such a point definitely exists, whether it's behind us or ahead of us.  And that by itself should scare the absolute shit out of every citizen of this Earth.

Maybe you still find yourself shrugging and saying, "Meh."  If so, you should shut off Fox News (permanently) and read a scientific paper or two.  Start with the Sippel et al. study I linked above.

If that doesn't convince you, I don't know what would.

******************************

This week's Skeptophilia book of the week is simultaneously one of the most dismal books I've ever read, and one of the funniest; Tom Phillips's wonderful Humans: A Brief History of How We Fucked It All Up.

I picked up a copy of it at the wonderful book store The Strand when I was in Manhattan last week, and finished it in three days flat (and I'm not a fast reader).  To illustrate why, here's a quick passage that'll give you a flavor of it:
Humans see patterns in the world, we can communicate this to other humans and we have the capacity to imagine futures that don't yet exist: how if we just changed this thing, then that thing would happen, and the world would be a slightly better place. 
The only trouble is... well, we're not terribly good at any of those things.  Any honest assessment of humanity's previous performance on those fronts reads like a particularly brutal annual review from a boss who hates you.  We imagine patterns where they don't exist.  Our communication skills are, uh, sometimes lacking.  And we have an extraordinarily poor track record of failing to realize that changing this thing will also lead to the other thing, and that even worse thing, and oh God no now this thing is happening how do we stop it.
Phillips's clear-eyed look at our own unfortunate history is kept from sinking under its own weight by a sparkling wit, calling our foibles into humorous focus but simultaneously sounding the call that "Okay, guys, it's time to pay attention."  Stupidity, they say, consists of doing the same thing over and over and expecting different results; Phillips's wonderful book points out how crucial that realization is -- and how we need to get up off our asses and, for god's sake, do something.

And you -- and everyone else -- should start by reading this book.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]





Saturday, December 28, 2019

Prelude to a cataclysm

Dear Readers:

I'm going to be taking a short break next week.  However, I hope you'll continue to send me ideas for new posts -- I'll be back in the saddle again soon, and I always value your suggestions.

The next Skeptophilia post will be Monday, January 6.  A very Happy New Year to all of you, and I hope this century's Roaring Twenties bring you everything you desire!

cheers,

Gordon

*******************************

There'd be nothing like a good supernova to liven things up around here.

Far and away the most spectacular event in the universe, a supernova of a massive star releases more energy in a few seconds than our Sun will release in its entire lifetime.  The colossal explosion is set off by the exhaustion of the fuel in the star's core, a phenomenon that deserves a little more explanation.

Stars are kept in equilibrium by two forces, the outward pressure of the heat produced by fusion in the core, and the inward pull of gravity.  When the star runs out of fuel, the heat diminishes, and gravity wins -- causing a sudden collapse and a phenomenally quick heating of the star's atmosphere.
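Just how sudden is "sudden"?  Here's a rough free-fall estimate -- the core density I've plugged in below is an order-of-magnitude guess for illustration, not a number from any particular stellar model.

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
core_density = 1e9     # assumed mean density of a collapsing stellar core, kg/m^3

# Classic gravitational free-fall timescale: t_ff = sqrt(3 * pi / (32 * G * rho))
t_ff = math.sqrt(3 * math.pi / (32 * G * core_density))
print(f"Free-fall collapse time: about {t_ff:.1f} seconds")
```

With those assumptions, the collapse takes only a couple of seconds -- which is why the whole thing goes off like a bomb rather than a slow fizzle.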

The result is a supernova, which temporarily outshines everything else in the near vicinity.  Actually, "outshine" is the wrong word; nearby star systems would be flash-fried, and even at a relatively safe distance the high-energy electromagnetic radiation could severely damage a planet's atmosphere.  (Just as a clarification, I'm talking about planets in other star systems; if there were planets in the supernova's system, they'd be instantaneously vaporized.)

The collapsed core of the star then becomes either a neutron star or a black hole, depending on the star's initial mass.  The exploded remnants continue to glow brightly for several months, before finally cooling, fading, and disappearing from the night sky.

As it happens, we've got a good candidate for a supernova not too far away (well, 640 light years away, which isn't exactly next door, but is still close by astronomical standards).  It's called Betelgeuse, and it's the familiar star on Orion's right shoulder.  A red supergiant, the star is about eleven times the mass of the Sun (putting it in the "neutron star" range after it blows itself to smithereens).  However, volume-wise, it's enormous; if you put Betelgeuse where the Sun is, its edge would be somewhere between the orbits of Jupiter and Saturn.

Yes, that's what it sounds like.  If Betelgeuse replaced the Sun, we here on the Earth would be inside the star.

The constellation of Orion; that's Betelgeuse in the upper left [Image licensed under the Creative Commons Rogelio Bernal Andreo, Orion Head to Toe, CC BY-SA 3.0]

Betelgeuse has long been known as one of the better supernova candidates that are relatively close by.  Asked when it's going to explode, though, astronomers have always played it cagey; could be tomorrow, could be a hundred thousand years from now.  But its recent behavior has made a lot of scientists wonder if the actual date of the explosion might not be closer to the "tomorrow" end of the spectrum than we'd thought.

The star has been exhibiting some odd behavior lately.  It's long been known as a variable star, varying in magnitude between about 0.0 and +0.5 (the bigger the number, the fainter the star).  This means it oscillates between being the fifth brightest star in the night sky and the tenth, with the period of its variation averaging at a little over a year.  But in the last few months, it's defied expectations, dimming to a magnitude of +1.3 and dropping to 23rd place on the list of brightest stars.

Could this herald the beginning of the collapse that initiates the supernova?  Could be, but the truth is, we don't know.  Supernovae are uncommon events, and nearby ones nearly unheard of -- the last one observed in our own galaxy was "Kepler's Star," the 1604 supernova in the constellation of Ophiuchus.  So what the leadup will look like, we aren't really sure.

What's certain is that this is unprecedented, at least since we've kept detailed records.  It merited a press release from the Villanova University Department of Astronomy three weeks ago, so even the astronomers -- ordinarily the most cautious of scientists -- are admitting that something's up.

Now, we still don't know what's going to happen.  Like I said, we've never been able to observe the events leading up to a supernova before.  But you can bet that the astrophysicists are paying close attention.

And with good reason.  If Betelgeuse went supernova -- no, correction, when Betelgeuse goes supernova -- it's going to be spectacular.  It's estimated it will brighten to a magnitude of about -10, which (for reference) is not quite as bright as the full Moon, but over a hundred times the brightness of the planet Venus.  It will be easily visible during the day and bright enough to cast shadows at night.  And this won't be a blink-and-you-miss-it occurrence; the supernova will fade only gradually, over a period of eight to nine months, and during that time it will be (other than the Sun and the Moon) the brightest thing in the sky.
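If you want to check that arithmetic yourself, the magnitude scale is logarithmic: a difference of five magnitudes corresponds to a factor of exactly 100 in brightness.  Here's the conversion, using typical textbook values for the full Moon and Venus:

```python
def brightness_ratio(mag_a, mag_b):
    """How many times brighter object A is than object B.
    A five-magnitude difference corresponds to a factor of 100."""
    return 100 ** ((mag_b - mag_a) / 5)

supernova = -10.0   # rough estimate for Betelgeuse's peak apparent magnitude
full_moon = -12.7   # typical apparent magnitude of the full Moon
venus = -4.6        # Venus near its brightest

print(f"Supernova vs. Venus:     {brightness_ratio(supernova, venus):.0f}x brighter")
print(f"Full Moon vs. supernova: {brightness_ratio(full_moon, supernova):.0f}x brighter")
```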

And I can say unequivocally that I hope it happens really soon.

It'd be nice to have something happen out there in the universe to take our minds off of how screwed up things are down here, you know?  It'd be a good reminder that there are bigger and more powerful things than we are, and our petty little squabbles are really pretty minuscule by comparison.

So as far as I'm concerned: bring on the supernova.

********************************

As technology has improved, so has our ability to bring that technology to bear on scientific questions, sometimes in unexpected ways.

In the fascinating new book Archaeology from Space: How the Future Shapes Our Past, archaeologist Sarah Parcak gives a fascinating look at how satellite photography has revolutionized her field.  Using detailed photographs from space, including thousands of recently declassified military surveillance photos, Parcak and her colleagues have located hundreds of exciting new sites that before were completely unknown -- roads, burial sites, fortresses, palaces, tombs, even pyramids.

These advances are giving us a lens into our own distant past, and allowing investigation of inaccessible or dangerous sites from a safe distance -- and at a phenomenal level of detail.  This book is a must-read for any students of history -- or if you'd just like to find out how far we've come from the days of Heinrich Schliemann and the excavation of Troy.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]





Friday, December 27, 2019

Canine mathematics

I remember a while back reading an interesting paper that concluded that dogs have a concept of fairness and morality.

There have been a number of studies confirming this, most strikingly an investigation involving border collies.  Pairs of dogs were trained to do a task, then rewarded with doggie biscuits.  The thing was, Dog #1 was rewarded for correctly doing the task with one biscuit, and Dog #2 with two biscuits for doing the same task.

Within a few rounds, Dog #1 refused to cooperate.  "I'm not working for one biscuit when he gets two," seemed to be the logic.  So -- amazing as it seems -- at least some dogs understand fair play, and will forego getting a treat at all if another dog is getting more.

It also implies an understanding of quantity.  Now, "two is more than one" isn't exactly differential calculus, but it does suggest that dogs have at least a rudimentary numeracy.  The evolutionary advantage of a sense of quantity is obvious; if you can do a quick estimate of the number of predators chasing you, or the size of the herd of antelope you're chasing, you have a better sense of your own safety (and such decisions as when to flee, when to attack, when to hide, and so on).

Guinness, either pondering Fermat's Last Theorem or else trying to figure out how to open the kitchen door so he can swipe the cheese on the counter

But how complex dogs' numerical ability is has proven to be rather difficult to study.  Which is why I found a paper last week in Biology Letters so fascinating.

Entitled, "Canine Sense of Quantity: Evidence for Numerical Ratio-Dependent Activation in Parietotemporal Cortex," by Lauren S. Aulet, Veronica C. Chiu, Ashley Prichard, Mark Spivak, Stella F. Lourenco, and Gregory S. Berns, of Emory University, this study showed that when dogs are confronted with stimuli differing only in quantity, they process that information in the same place in their brains that we use when doing numerical approximation.

The authors write:
The approximate number system (ANS), which supports the rapid estimation of quantity, emerges early in human development and is widespread across species.  Neural evidence from both human and non-human primates suggests the parietal cortex as a primary locus of numerical estimation, but it is unclear whether the numerical competencies observed across non-primate species are subserved by similar neural mechanisms.  Moreover, because studies with non-human animals typically involve extensive training, little is known about the spontaneous numerical capacities of non-human animals. To address these questions, we examined the neural underpinnings of number perception using awake canine functional magnetic resonance imaging.  Dogs passively viewed dot arrays that varied in ratio and, critically, received no task-relevant training or exposure prior to testing.  We found evidence of ratio-dependent activation, which is a key feature of the ANS, in canine parietotemporal cortex in the majority of dogs tested.  This finding is suggestive of a neural mechanism for quantity perception that has been conserved across mammalian evolution.
The coolest thing about this study is that they controlled for stimulus area, which was the first thing I thought of when I read about the experimental protocol.  What I mean by this is that if you keep the size of the objects the same, a greater number of them has a greater overall area, so it might be that the dogs were estimating the area taken up by the dots and not the number.  But the researchers cleverly designed the arrays so that although the number of dots varied from screen to screen, the total area they covered was the same.
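For the curious, here's a minimal sketch of that design trick -- not the researchers' actual stimulus-generation code, just my own illustration: vary the number of dots while holding the combined dot area constant, so total area can't be the cue.

```python
import math
import random

def make_dot_array(n_dots, total_area=10000.0, canvas=500):
    """Place n_dots random (x, y, radius) dots whose combined area is total_area.
    Because total area is fixed, more dots just means smaller dots."""
    area_each = total_area / n_dots
    radius = math.sqrt(area_each / math.pi)
    return [(random.uniform(radius, canvas - radius),
             random.uniform(radius, canvas - radius),
             radius) for _ in range(n_dots)]

for n in (4, 8, 16):
    dots = make_dot_array(n)
    combined = sum(math.pi * r ** 2 for (_, _, r) in dots)
    print(f"{n:2d} dots, radius {dots[0][2]:.1f}, total area {combined:.0f}")
```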

And, amazing as it sounds, dogs not only registered differences in the quantity of dots they were viewing, they were apparently doing so with the same part of their brains we use for analogous tasks.

"We went right to the source, observing the dogs' brains, to get a direct understanding of what their neurons were doing when the dogs viewed varying quantities of dots," said study lead author Lauren Aulet, in a press release in Science Daily.  "That allowed us to bypass the weaknesses of previous behavioral studies of dogs and some other species...  Part of the reason that we are able to do calculus and algebra is because we have this fundamental ability for numerosity that we share with other animals.  I'm interested in learning how we evolved that higher math ability and how these skills develop over time in individuals, starting with basic numerosity in infancy."

I wonder, though, how this would work with our dogs.  As I've mentioned before, Lena (our coonhound) has the IQ of a lima bean, and even has a hard time mastering concepts like the fact that the dog in the pond she barks at incessantly is actually her own reflection and not an Evil Underwater Dog Who Has Invaded Her Territory.  Guinness is smarter (not that the bar was set that high), but I don't know how aware of quantity he is.  He's more of an opportunist who will take advantage of any situation that presents itself, be it a single CheezDoodle someone dropped on the floor or (as happened two days ago) a half-pound of expensive French brie that was left unguarded for five minutes on the coffee table.

I doubt he worried about quantity in either case, frankly.

But the Aulet et al. study is fascinating, and clues us in that the origins of numeracy in our brains go back a long, long way.  The most recent common ancestor of humans and dogs lived on the order of eighty million years ago -- predating the extinction of the dinosaurs by fourteen million years -- so that numerical brain area must be at least that old, and is probably shared by most mammalian species.  It's a little humbling to think that a lot of the abilities we humans pride ourselves on are shared, at least on a basic level, with our near relatives.

But now y'all'll have to excuse me, because Lena wants to go outside.  I guess it's time for her to check and see if the Water Dog has returned.  She's a sneaky one, that Water Dog.

********************************

As technology has improved, so has our ability to bring that technology to bear on scientific questions, sometimes in unexpected ways.

In the fascinating new book Archaeology from Space: How the Future Shapes Our Past, archaeologist Sarah Parcak gives a fascinating look at how satellite photography has revolutionized her field.  Using detailed photographs from space, including thousands of recently declassified military surveillance photos, Parcak and her colleagues have located hundreds of exciting new sites that before were completely unknown -- roads, burial sites, fortresses, palaces, tombs, even pyramids.

These advances are giving us a lens into our own distant past, and allowing investigation of inaccessible or dangerous sites from a safe distance -- and at a phenomenal level of detail.  This book is a must-read for any students of history -- or if you'd just like to find out how far we've come from the days of Heinrich Schliemann and the excavation of Troy.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]





Thursday, December 26, 2019

Dancing down from the past

It will come as no great surprise to anyone who knows me that I've struggled to overcome my shyness and inhibitions.

One of the ways this manifested was a reluctance to dance.  Dancing requires a willingness not only to get yourself out there on the dance floor, but to lose your self-consciousness and move to the music.  If you're constantly watching yourself, wondering what others are thinking, you'll never loosen up enough to be able to dance -- and as a result, you will move awkwardly.

Self-fulfilling prophecy, that.

I'd always advocated for throwing caution to the wind and enjoying yourself, simultaneously being unable for some reason to apply that same standard to myself.  But something shifted at the retreat I attended a month ago, about which I have written already.  The first night of the retreat, the leader said that one of the things we were going to be doing a lot of was dancing.  It's a really primal activity, he said, and is amazing for getting you out of your own head.

Well, my first reaction was panic.  The voice in my mind said, loud and clear, "YOU CAN'T DO THIS."  But as I related in my post, I did, and it was an amazing experience.  He was exactly right.  Dancing is freeing and exhilarating in a way very little else is.

Being a biologist, I got to wondering why.  It involves moving your body, sure, but so do a lot of other things; and I can tell you in no uncertain terms that weed-whacking along the fence is equally physical, but is the opposite of exhilarating.  Music plays into it, of course, but I can also listen to music without that euphoric feeling occurring (although as I've also written about before here at Skeptophilia, I do have a very visceral and emotional reaction to certain music, another phenomenon that seems to have a neurological basis).

But put the two together -- music and movement -- and you have an extremely powerful combination.

Greg Sample and Jennita Russo, of Deyo Dancers [Image licensed under the Creative Commons Barry Goyette from San Luis Obispo, USA, Two dancers, CC BY 2.0]

Why exactly this synergy happens is a matter of conjecture, but what is certain is that it goes back a long way in our evolutionary history.  A paper that came out last week in Proceedings of the National Academy of Sciences, by Yuko Hattori and Masaki Tomonaga of Kyoto University, shows that when chimpanzees are exposed to music, or even rhythmic sounds, they respond with something that looks very much like rudimentary dance.

"I was shocked," Hattori said to Eve Frederick, writing for Science.  "I was not aware that without any training or reward, a chimpanzee would spontaneously engage with the sound."

The authors write:
Music and dance are universal across human culture and have an ancient history.  One characteristic of music is its strong influence on movement.  For example, an auditory beat induces rhythmic movement with positive emotions in humans from early developmental stages.  In this study, we investigated if sound induced spontaneous rhythmic movement in chimpanzees.  Three experiments showed that: 1) an auditory beat induced rhythmic swaying and other rhythmic movements, with larger responses from male chimpanzees than female chimpanzees; 2) random beat as well as regular beat induced rhythmic swaying and beat tempo affected movement periodicity in a chimpanzee in a bipedal posture; and 3) a chimpanzee showed close proximity to the sound source while hearing auditory stimuli.  The finding that male chimpanzees showed a larger response to sound than female chimpanzees was consistent with previous literature about “rain dances” in the wild, where male chimpanzees engage in rhythmic displays when hearing the sound of rain starting...  These results suggest some biological foundation for dancing existed in the common ancestor of humans and chimpanzees ∼6 million years ago.  As such, this study supports the evolutionary origins of musicality.
Of course, this still doesn't answer what its evolutionary significance is; if I had to guess, it probably has to do with social cohesion and pair bonding, much as it does in humans.  But it's absolutely fascinating that the roots of dance go back at least to our last common ancestor with chimps, which would be between six and seven million years ago.

All of which makes me a little sad for what I missed in all those years I was too inhibited to dance.  I'll end with a quote from writer and humorist Dave Barry, which seems apt: "No one cares if you can't dance well.  Get up and dance."

  ********************************

As technology has improved, so has our ability to bring that technology to bear on scientific questions, sometimes in unexpected ways.

In the fascinating new book Archaeology from Space: How the Future Shapes Our Past, archaeologist Sarah Parcak gives a fascinating look at how satellite photography has revolutionized her field.  Using detailed photographs from space, including thousands of recently declassified military surveillance photos, Parcak and her colleagues have located hundreds of exciting new sites that before were completely unknown -- roads, burial sites, fortresses, palaces, tombs, even pyramids.

These advances are giving us a lens into our own distant past, and allowing investigation of inaccessible or dangerous sites from a safe distance -- and at a phenomenal level of detail.  This book is a must-read for any students of history -- or if you'd just like to find out how far we've come from the days of Heinrich Schliemann and the excavation of Troy.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]





Wednesday, December 25, 2019

Remnants of forgotten civilizations

As silly as it can get sometimes, I am a dedicated Doctor Who fanatic.  I'm late to the game -- I only watched my first-ever episode of the long-running series four years ago -- but after that, I went at it with the enthusiasm you only see in the born-again.

The best of the series tackles some pretty deep stuff.  The ugly side of tribalism ("Midnight"), the acknowledgement that some tragedies are unavoidable ("The Fires of Pompeii"), the Butterfly Effect ("Turn Left"), the fact that you can't both "play God" and avoid responsibility ("The Waters of Mars"), and the terrible necessity of personal self-sacrifice ("Silence in the Library").  Plus, the series invented what would be my choice for the single most terrifying, wet-your-pants-inducing alien species ever dreamed up, the Weeping Angels (several episodes, most notably "Blink").

So it shouldn't have been a surprise when Doctor Who got a mention in this month's Scientific American, but it still kinda was.  It came up in a wonderful article by Caleb Scharf called "The Galactic Archipelago," which was about the likelihood of intelligent life elsewhere in the universe (probably very high) and the odd question of why, if that's true, we haven't been visited (Fermi's paradox).  Here at Skeptophilia we've looked at one rather depressing answer to Fermi -- the "Great Filter," the idea that intelligent life is uncommon in the universe either because there are barriers to the formation of life on other worlds, or because once formed, it's likely to get wiped out completely at some point.

It's even more puzzling when you consider the fact that it would be unnecessary for the aliens themselves to visit.  Extraterrestrial life paying a house call to Earth is unlikely considering the vastness of space and the difficulties of fast travel, whatever the amazingly-coiffed Giorgio Tsoukalos (of Ancient Aliens fame) would have you believe.  But Scharf points out that it's much more likely that intelligent aliens would have instead sent out self-replicating robot drones, which not only had some level of intelligence themselves (in terms of avoiding dangers and seeking out raw materials to build new drones), but could take their time hopping from planet to planet and star system to star system.  And because they reproduce, all it would take is one or two civilizations to develop these drones, and given a few million years, you'd expect they'd spread pretty much everywhere in the galaxy.
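Here's a back-of-the-envelope version of that argument -- every parameter below (probe speed, hop distance, time to build a copy) is an assumption I've picked purely for illustration -- showing why even leisurely self-replicating probes could sweep across a hundred-thousand-light-year galaxy in a geologically short time.

```python
# Crude estimate of how long a wave of self-replicating probes needs to sweep
# the galaxy, advancing one star-to-star hop at a time.
galaxy_diameter_ly = 100_000      # rough size of the Milky Way, light years
hop_distance_ly = 10              # assumed typical distance to the next star system
probe_speed = 0.05                # assumed cruise speed, fraction of light speed
build_time_years = 500            # assumed time to mine materials and copy itself

travel_per_hop = hop_distance_ly / probe_speed          # years spent in transit per hop
years_per_hop = travel_per_hop + build_time_years
hops_to_cross = galaxy_diameter_ly / hop_distance_ly

total_years = hops_to_cross * years_per_hop
print(f"Time to cross the galaxy: about {total_years / 1e6:.1f} million years")
```

With those numbers the answer comes out to a handful of million years -- an eyeblink compared to the age of the galaxy, which is the whole point of Fermi's question.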

But, of course, it doesn't seem like that has happened either.

Scharf tells us that there's another possibility than the dismal Great Filter concept, and that's something that's been nicknamed the "Silurian Hypothesis."  Here's where Doctor Who comes in, because as any good Whovian will tell you, the Silurians are a race of intelligent reptilians who were the dominant species on Earth for millions of years, but who long before humans appeared went (mostly) extinct except for a few scattered remnant populations in deep caverns.


Last year, climatologist Gavin Schmidt and astrophysicist Adam Frank, of NASA and the University of Rochester (respectively), considered whether it was possible that an intelligent technological species like the Silurians had existed millions of years ago, and if so, what traces of it we might expect to find in the modern world.  And what Schmidt and Frank found was that if there had been a highly complex, city-building, technology-using species running the Earth, (say) fifty million years ago, what we'd find today as evidence of its existence is very likely to be...

... nothing.

Scharf writes:
[Astrophysicist Michael] Hart's original fact [was] that there is no evidence here on Earth today of extraterrestrial explorers...  Perhaps long, long ago aliens came and went.  A number of scientists have, over the years, discussed the possibility of looking for artifacts that might have been left behind after such visitations of our solar system.  The necessary scope of a complete search is hard to predict, but the situation on Earth alone turns out to be a bit more manageable.  In 2018 another of my colleagues, Gavin Schmidt of NASA's Goddard Institute for Space Studies, together with Adam Frank, produced a critical assessment of whether we could even tell if there had been an earlier industrial civilization on our planet. 
As fantastic as it may seem, Schmidt and Frank argue -- as do most planetary scientists -- that it is actually very easy for time to erase essentially all signs of technological life on Earth.  The only real evidence after a million or more years would boil down to isotopic or chemical stratigraphic anomalies -- odd features such as synthetic molecules, plastics, or radioactive fallout.  Fossil remains and other paleontological markers are so rare and so contingent on special conditions of formation that they might not tell us anything in this case. 
Indeed, modern human urbanization covers only on order of about one percent of the planetary surface, providing a very small target area for any paleontologists in the distant future.  Schmidt and Frank also conclude that nobody has yet performed the necessary experiments to look exhaustively for such nonnatural signatures on Earth.  The bottom line is, if an industrial civilization on the scale of our own had existed a few million years ago, we might not know about it.  That absolutely does not mean one existed; it indicates only that the possibility cannot be rigorously eliminated.
(If you'd like to read Schmidt and Frank's paper, it appeared in the International Journal of Astrobiology and is available here.)

It's a little humbling, isn't it?  All of the massive edifices we've created, the far-more-than Seven Wonders of the World, will very likely be gone without a trace in only a few million years.  A little more cheering is that the same will be true of all the damage we're currently doing to the global ecosystem.  It's not so surprising if you know a little geology; the current arrangement of the continents is only the latest of many, and won't be the last the Earth will see.  Between erosion and natural disasters, not to mention the rather violent clashes that occur when the continents shift position, it stands to reason that our puny little efforts to change things won't last very long.

Entropy always wins in the end.

The whole thing puts me in mind of one of the first poems I ever read that made a significant impact on me -- Percy Bysshe Shelley's devastating "Ozymandias," which I came across when I was a freshman in high school.  It seems a fitting way to conclude this post.
I met a traveller from an antique land,
Who said—Two vast and trunkless legs of stone
Stand in the desert. . . . Near them, on the sand,
Half sunk a shattered visage lies, whose frown,
And wrinkled lip, and sneer of cold command,
Tell that its sculptor well those passions read
Which yet survive, stamped on these lifeless things,
The hand that mocked them, and the heart that fed;
And on the pedestal, these words appear:
"My name is Ozymandias, King of Kings;
Look on my Works, ye Mighty, and despair!"
Nothing beside remains. Round the decay
Of that colossal Wreck, boundless and bare
The lone and level sands stretch far away.
********************************

As technology has improved, so has our ability to bring that technology to bear on scientific questions, sometimes in unexpected ways.

In the fascinating new book Archaeology from Space: How the Future Shapes Our Past, archaeologist Sarah Parcak gives a fascinating look at how satellite photography has revolutionized her field.  Using detailed photographs from space, including thousands of recently declassified military surveillance photos, Parcak and her colleagues have located hundreds of exciting new sites that before were completely unknown -- roads, burial sites, fortresses, palaces, tombs, even pyramids.

These advances are giving us a lens into our own distant past, and allowing investigation of inaccessible or dangerous sites from a safe distance -- and at a phenomenal level of detail.  This book is a must-read for any students of history -- or if you'd just like to find out how far we've come from the days of Heinrich Schliemann and the excavation of Troy.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]





Tuesday, December 24, 2019

The danger of comfort

There is nothing as dangerous as our attitude that if something isn't bothering us right here and right now, it can effectively be ignored.

It's what is behind the phenomenon that doctors rail against, that if you're feeling good at the moment, there's no reason to have an annual physical.  I say this with a degree of wry amusement because I'm a doctor avoider myself, but at least I acknowledge how foolish that approach is.  There are large numbers of illnesses that if caught early and treated are not really that serious, but if left untreated long enough can kill you.  I was just chatting a couple of days ago with a friend about a mutual acquaintance who had ignored increasingly severe headaches for weeks, and ultimately died of a ruptured cerebral aneurysm that probably would have been operable -- at the age of 41.

Scale that attitude up, and you have our current approach to the global environment.

Every time you look at the news you see more alarm bells about the current state of the natural world.  Just in the last two weeks, we've had the following:
  • A study at the University of Sussex showing that the world's biodiversity is falling far faster than previous models had estimated
  • A paper in Nature with new data about mass loss from the Greenland ice sheet, projecting the displacement of forty million people worldwide from coastal flooding and incursion of seawater in the next eighty years
  • A rather horrifying study from the University of California-San Diego, which used more accurate methods to estimate the amount of microplastic in the ocean -- bits of effectively non-biodegradable debris suspended in seawater, with unknown long-term effects on ecosystems -- and found an average concentration of 8.3 million pieces of microplastic per cubic meter of water, roughly six orders of magnitude higher than previous estimates
But here I sit in my comfortable office in rural upstate New York.  It's a clear December morning, the sky is a pristine pale blue, the tilled cornfield across the road dusted with snow.  There are birds at the feeders, a hawk is kiting high overhead, my dogs are snoozing in a patch of sunlight after an early morning's romp.  I have a cup of hot coffee, a fire in the wood stove.

All's well with the world.  Right?

Certainly looks like it is.

[Image licensed under the Creative Commons Pranjal kukreja, Adventure-clouds-environment-672358, CC BY-SA 4.0]

We're geared to respond to how our personal conditions are in the moment, so stories like the ones I mentioned above have a hard time gaining any traction in our consciousness.  I consider myself more environmentally-conscious than a lot of people (and for cryin' in the sink, I just spent the morning researching serious problems with the global ecology) and I still have a hard time feeling viscerally alarmed by it, the way I would if there was a forest fire headed this way or a chemical spill was killing all the fish in my pond or smog was making it impossible to breathe without a filter mask.

There's really no difference, though, between the three problems in the news and the three hypothetical ones I just mentioned -- or if there is, it's a matter of scale.  The three papers I referenced above are orders of magnitude more serious than any of the three local ones I listed.  If a wildfire went out of control and burned my house down, it would be a tragedy for me.  But the three papers I described are disasters in the making that affect not just one person, nor even a single community, but the entire world.

And for most people, they elicit nothing more than a shrug of the shoulders.

It doesn't help, of course, that the current government of the United States is actively involved in perpetuating this attitude, and (worse) spreading scientific misinformation.  For some of the perpetrators it's done with malice aforethought, because of the influence of money from the fossil fuel lobby and others like it, but for some -- like Donald Trump -- it's a combination of the "who cares, I'm doing fine" attitude with outright willful stupidity.  Take, for example, this direct quote from Trump's speech to a Turning Point USA (a conservative student group) rally just two days ago:
I never understood wind.  I know windmills very much, I have studied it better than anybody.  I know it is very expensive.  They are made in China and Germany mostly, very few made here, almost none, but they are manufactured, tremendous — if you are into this — tremendous fumes and gases are spewing into the atmosphere.  You know we have a world, right?  So the world is tiny compared to the universe.  So tremendous, tremendous amount of fumes and everything.  You talk about the carbon footprint, fumes are spewing into the air, right spewing, whether it is China or Germany, is going into the air...  A windmill will kill many bald eagles.  After a certain number, they make you turn the windmill off, that is true.  By the way, they make you turn it off.  And yet, if you killed one, they put you in jail.  That is okay.  But why is it okay for windmills to destroy the bird population?
Watching a video of this speech, I found it hard to escape two conclusions: (1) Donald Trump is the single stupidest person ever elected to public office, and (2) the fact that a significant number of people still support this man's policies, and apparently still think he's the best president ever, makes me despair for the future of the human race.

When I taught Environmental Science, I was up front about my goal -- to widen students' perspective from what's right in front of them, to their homes, to their communities, to their nation, and finally to the entire world.  So much of what we're doing wrong lately -- or failing to do -- is purely because we only care, and act, on what is right before our faces.

So I'm glad that I've got a beautiful morning to enjoy, clean air, a warm and safe place for myself and my family and pets.  But I can't let that lull me into the Panglossian attitude that "all is for the best in the best of all possible worlds."  In the current conditions -- with ecological perils everywhere, and a government that combines complicity and ignorance -- complacency is the deadliest danger of all.

********************************

As technology has improved, so has our ability to bring that technology to bear on scientific questions, sometimes in unexpected ways.

In the fascinating new book Archaeology from Space: How the Future Shapes Our Past, archaeologist Sarah Parcak gives a fascinating look at how satellite photography has revolutionized her field.  Using detailed photographs from space, including thousands of recently declassified military surveillance photos, Parcak and her colleagues have located hundreds of exciting new sites that before were completely unknown -- roads, burial sites, fortresses, palaces, tombs, even pyramids.

These advances are giving us a lens into our own distant past, and allowing investigation of inaccessible or dangerous sites from a safe distance -- and at a phenomenal level of detail.  This book is a must-read for any students of history -- or if you'd just like to find out how far we've come from the days of Heinrich Schliemann and the excavation of Troy.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]