Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.
Showing posts with label risk.

Tuesday, January 14, 2025

Life out of round

All my life, I've been pulled by two opposing forces.

One of them is the chaos-brain I described in yesterday's post, which I seem to have been born with.  The other is a ferocious attempt to counteract that tendency by controlling the absolute hell out of my surroundings.  I know a lot of this came from the way I was raised; throughout my childhood, nothing I ever did was good enough, and any compliments came along with an appended list (notarized and in triplicate) of all the things I should have done differently and/or could have done better.  

The result is that I do a great deal of overcompensation.  I became fanatically neat, because organizing my physical space was a way of coping with the fact that my brain is like a car with bald tires and no brakes.  My classroom was so organized and clean you could just about eat off the floor (and keep in mind that it was a biology lab).  As a teacher, I strove to make use of every moment we had, and faulted myself whenever things didn't go well or there was an eventuality I hadn't planned for.

I didn't expect perfection from my students, but I did from myself.  And, in some parts of my life, it served me well enough.

The problem is, that approach doesn't work when you apply it to the arts.

I'm not even talking about the "learning curve" issue, here.  Even when I've attained some level of proficiency, I still expect nothing less than perfection, excoriating myself for every scene in a story that didn't come out the way I wanted, every slightly lopsided piece of pottery, every missed note when I play music.

In theory, I'm one hundred percent in agreement with the quote from Ludwig van Beethoven -- "To play a wrong note is insignificant; to play without passion is inexcusable."  Or, more accurately, I believe that for everyone else.  It's much harder to treat myself so forgivingly.

The result has been an overwhelming case of impostor syndrome, coupled with fear of criticism -- which will, in my warped way of looking at things, only confirm what I've thought about myself all along.  I'm at least working on getting my writing out there in the public eye, despite the inherent risks of poor sales and/or bad reviews, but it's been harder in other aspects of my creative life.  I'm still at the stage where it took some serious arm-twisting to induce me to join a contradance band as a flutist, and it's damn near impossible to get me to play the piano in front of anyone else (including my wife).  But I'm harshest about my own skill when it comes to my artistic work, which is pottery.  I keep very little of what I make, and most of what I do keep are the pieces that are simple and purely functional -- bowls and mugs and the like.  The vast majority of the sculptures and other, more unusual, pieces I make end up with that dreadful label of "not good enough" and get smashed against the concrete wall at the back of our house.

All along, I had the attitude -- again, directly consonant with my upbringing -- that this is how you improve, that constant self-criticism should act as some kind of impetus to getting better, to ridding your work of those dreaded mistakes, to attaining that fabled ability to create something with which others could not find fault.

It's only been recently that I've realized that this approach is completely antithetical to creativity.

I got to thinking about this after watching an online pottery workshop with the wonderful New Hampshire potter Nick Sevigney, whose pieces are weird and whimsical and unexpected.  A lot of his pottery has a steampunk feeling, a sense of having been put together from a random assemblage of parts.  It was a revelation to watch him piece together cut slabs of clay, not caring if the result was a little uneven or had a rough edge.  In fact, he embraces those seeming imperfections, turning them "from a bug into a feature."

So I decided to see if I could do a few pieces that riffed off of his approach.

I'm most comfortable on the potter's wheel, so I started out by throwing three medium-sized white stoneware bowls.  I've gotten pretty good at producing that smooth curve and rounded profile, with a perfectly circular rim, which is what most of us shoot for when creating a bowl.

Usually, that's where I'd stop.  If it passed my critical assessment -- not lopsided, decent weight, evenly thick walls, nice smooth surface -- I'd keep it.  Otherwise, into the scrap bucket it'd go.  But here... that was only the first step.

One of Nick's techniques is to take a piece, cut chunks out of it, add texture to each chunk, then reattach it.  You'd think that because you're putting the piece back where it had been, it'd fit perfectly; but the problem is that adding texture (usually using stamps or rollers) stretches and flattens the clay, so inevitably it ends up larger than the hole it came from.  Nick just forces it to fit, warping the piece's profile -- and instead of worrying about that, he often adds some circular marks that make it look like the piece was inexpertly riveted or screwed back on.

He leans into the unevenness hard.  And the result is something magical, like a relic you might find in a demolished nineteenth-century mad scientist's laboratory, something stitched together and broken and reassembled upside down and backwards.

So I took my three smooth, undamaged stoneware bowls and gave it a try.

One of the results

The hardest part -- unsurprising, perhaps, given my personality -- was making the first cut.  Even knowing that if I didn't like the result I had more clay and could always make another plain, boring, but "perfect" bowl, I sat there for some time, knife in hand, as if the Pottery Gods would smite me if I touched that sleek, classic profile.  Slicing and pressing and marring and deforming it felt like deliberately choosing to ruin something "nice."

But maybe "nice" isn't what we should be shooting for, as creatives.

Maybe the goal should be somewhere out there beyond "nice."  The point, I realized, is not to retread the safe, secure footsteps I've always taken, but to take a deep breath and launch off into the shadowlands.

So I cut a big chunk from the side of the bowl, got out my texturing stamps and rollers, and set to work.

I was half expecting to give up after a few attempts and throw the whole thing into the scrap bucket, but I didn't.  I found I actually kind of liked the result, as different as it is from what I usually make.  And what surprised me even more was that once I got into it, it was...

... fun.

I've never been much good at "having fun."  In general, I give new meaning to the phrase "tightly wound."  Letting loose and simply being silly is way outside my wheelhouse.  (I know I shortchanged my boys as a dad when they were little simply by my seeming inability to play.)  But I've come to realize that the spirit of playfulness is absolutely critical to creativity.  I don't mean that every creative endeavor should be funny or whimsical; but that sense of pushing the boundaries, of letting the horse have its head and seeing where you end up, is at the heart of what it means to be creative.

I was recently chatting with another author about times when inspiration in writing will surprise you, coming at you seemingly out of nowhere.  When it happens, the feeling is honestly like the ideas are originating outside of my own brain.  There are two examples of this that come to mind immediately, cases where characters to whom I'd never intended to give a big role basically said, "Nuh-uh, you're not sidelining me.  I'm important, and here's why."  (If you're curious, the two are Jennie Trahan in my novella "Convection," and most strikingly, Marig Kastella in The Chains of Orion, who kind of took over the last third of the book, and became one of my favorite characters I've ever written.)  When that happens, it means I've loosened my death-grip on the story, and given my creativity space to breathe.

And it always is a hallmark of things going really right with the writing process.

So I guess the point of all this is to encourage you to stretch your boundaries in your own creative work.  I won't say "lose your fears" -- that's hopeless advice -- but try something new despite them.  (Either something new within your chosen creative medium, or something entirely new.)  Be willing to throw your creative life out of round, to press it into new and unexpected configurations, to turn in a new direction and see where you end up.  There's good stuff to be found there outside of the narrow, constricted, breathless little boundaries of what we've always been told is "the right way to do things."  Take a risk.  Then take another one.  The goal of creativity is not to play it safe.

As French author and Nobel laureate André Gide put it, "One does not discover new lands without consenting to lose sight of the shore."

****************************************

NEW!  We've updated our website, and now -- in addition to checking out my books and the amazing art by my wife, Carol Bloomgarden -- you can also buy some really cool Skeptophilia-themed gear!  Just go to the website and click on the link at the bottom, where you can support your favorite blog by ordering t-shirts, hoodies, mugs, bumper stickers, and tote bags, all designed by Carol!

Take a look!  Plato would approve.


****************************************

Saturday, December 14, 2024

The cliff's edge

The universe is a dangerous place.

Much of what we've created -- the whole superstructure of civilized life, really -- is built to give us a sense of security.  And it works, or works well enough.  During much of human history, we were one bad harvest, one natural disaster, one epidemic away from starvation, disease, and death.  Our ancestors were constantly aware that they had no real security -- probably one of the main drivers of the development of religion.

The world is a capricious, dangerous place, but maybe the gods will help me if only I pray hard enough.

When the Enlightenment rolled around in the eighteenth century, science seemed to step in to provide a similar function.  Maybe the world could be tamed if we only understood it better.  Once again, it succeeded -- at least partially.  Industrial agriculture and modern medicine certainly saved millions of lives, and have allowed us to live longer, healthier lives than ever before -- further reassuring us that it was possible to make the universe a secure, harm-free place for creatures like us.

And we still have that sense, don't we?  When there's a natural disaster, many people respond, "Why did this happen?"  There's an almost indignant reaction of "the world should be safe, dammit."

[Image licensed under the Creative Commons svantassel, Danger Keep Away Sign, CC BY-SA 3.0]

This is why in 2012 a judge in Italy sentenced six geologists to six years in prison and hefty fines for failing to predict the deadly 2009 L'Aquila earthquake.  There was the sense that if the best experts on the geology of Italy didn't see it coming... well, they should have, shouldn't they?

The fact that, in the present state of our scientific knowledge, it's not possible to predict earthquakes didn't seem to sway the judge's mind.  "The world is chaotic, dangerous, and incompletely understood" was simply too hard to swallow.  If something happened, and people died, there had to be someone to blame.  (Fortunately, wiser heads eventually prevailed; the convictions were overturned on appeal, and the geologists were released.)

In fact, I started thinking about this because of a study out of the University of California - Riverside that is investigating a technique for predicting earthquake severity based on the direction of propagation of the shock wave front.  This can make a huge difference -- for example, an earthquake on the San Andreas Fault that begins with failure near the Salton Sea and propagates northward will direct more energy toward Los Angeles than one that begins closer in but spreads in the opposite direction.

The scientists are using telltale scratch marks -- scoring left as the rocks slide across each other -- to determine the direction of motion of the quake's shock wave.  "The scratches indicate the direction and origin of a past earthquake, potentially giving us clues about where a future quake might start and where it will go," said Nic Barth, the paper's lead author.  "This is key for California, where anticipating the direction of a quake on faults like San Andreas or San Jacinto could mean a more accurate forecast of its impact...  We can now take the techniques and expertise we have developed on the Alpine Fault [in New Zealand] to examine faults in the rest of the world.  Because there is a high probability of a large earthquake occurring in Southern California in the near-term, looking for these curved marks on the San Andreas fault is an obvious goal."

The thing is, this is still short of the ultimate goal of predicting fault failure accurately, and with enough time to warn people to evacuate.  Knowing the timing of earthquakes is something that is still out of reach.

Then there's the study out of the Max Planck Institute for Solar System Research that found that the Sun and other stars like it are prone to violent flare-ups -- on the average, once every century.  These "superflares" can release an octillion joules of energy in only a few hours.

The once-every-hundred-years estimate was based on a survey of over fifty-six thousand Sun-like stars, and the upshot is that so far, we've lucked out.  The last serious solar storm was the Carrington Event of 1859, and that was the weakest of the known Miyake Events, coronal mass ejections so big that they left traces in tree rings.  (One about fourteen thousand years ago was so powerful that if it occurred today, it would completely fry everything from communications satellites to electrical grids to home computers.)

The problem, once again, is that we still can't predict them; like earthquakes, we can know likelihood but not exactitude.  In the case of a coronal mass ejection, we'd probably have a few hours' notice -- enough time to unplug stuff in our houses, but not enough to protect the satellites and grids and networks.  (If that's even possible.  "An octillion joules" is what is known in scientific circles as "a metric shit tonne of energy.")

"The new data are a stark reminder that even the most extreme solar events are part of the Sun's natural repertoire," said study co-author Natalie Krivova.  "During the Carrington event of 1859, one of the most violent solar storms of the past two hundred years, the telegraph network collapsed in large parts of northern Europe and North America.  According to estimates, the associated flare released only a hundredth of the energy of a superflare.  Today, in addition to the infrastructure on the Earth's surface, especially satellites would be at risk."

All of this, by the way, is not meant to scare you.  In my opinion, the point is to emphasize the fragility of life and of our world, and to encourage you to work toward mitigating what we can.  No matter what we do, we'll still be subject to the vagaries of geology, meteorology, and astrophysics, but right now we are needlessly adding to our risk by ignoring climate change and pollution, and encouraging the ignorant and ill-founded claims of the anti-vaxxers.  (Just yesterday I saw that RFK Jr., who has been nominated as Secretary of the Department of Health and Human Services, is pursuing the de-authorization of the polio vaccine -- an extremely low-risk preventative that has saved millions of lives.)

Life's risky enough without adding to it by listening to reckless short-term profit hogs and dubiously sane conspiracy theorists.

My point here is that the chaotic nature of the universe shouldn't freeze us into despairing immobility; it should galvanize us to protect what we have.  The unpredictable dangers are a fact of life, and for most of our evolutionary history we were unable to do much about any of them.  Now, for the first time, we have figured out how to protect ourselves from many of the risks that our ancestors faced every day.  How foolish do we as a species have to be to add to those risks needlessly, heedlessly, rushing toward the edge of the cliff when we have the capacity simply to stop?

****************************************

Friday, September 2, 2022

When the volcano blows

The human-inhabited part of the world dodged a serious bullet in January of 2022, when the colossal Hunga Tonga - Hunga Ha'apai volcanic eruption took place.

Unless you're a geology buff, you might not even remember that it happened, which is kind of astonishing when you consider it.  The undersea eruption created an upward surge of water ninety meters tall and twelve kilometers wide, and the wave it generated displaced a volume of 6.6 cubic kilometers.  The tsunami started out nine times as high as the one that devastated Japan in 2011.

After that, a steam explosion -- caused when cold seawater rushed into the collapsed magma chamber after the eruption -- generated an atmospheric pressure wave, producing a second (and faster-moving) set of tsunamis.

The whole thing is hard to talk about without lapsing into superlatives.

The Hunga Tonga - Hunga Ha'apai eruption [Image is in the Public Domain courtesy of NASA]

The fact that this enormous eruption only caused five deaths and ninety million dollars in damage -- compared with the 2011 earthquake and tsunami in Japan, which killed twenty thousand and caused over two hundred billion dollars in damage -- is due to its remote location in the Tonga Archipelago.  Had it occurred closer to heavily-inhabited coastal locations, it could have been catastrophic.

This analysis of the Tonga eruption came out right around the same time as a study out of the University of Cambridge looking at how woefully unprepared we are for a large eruption in a populated area.

"Data gathered from ice cores on the frequency of eruptions over deep time suggests there is a one-in-six chance of a magnitude seven explosion in the next one hundred years. That's a roll of the dice," said study co-author Lara Mani.  "Such gigantic eruptions have caused abrupt climate change and collapse of civilizations in the distant past...  Hundreds of millions of dollars are pumped into asteroid threats every year, yet there is a severe lack of global financing and coordination for volcano preparedness.  This urgently needs to change.  We are completely underestimating the risk to our societies that volcanoes pose."

You might be wondering which are currently considered by volcanologists to be the most potentially dangerous volcanoes in the world.  Generally, these top the list:
  • Mount Vesuvius/the Campi Flegrei system in Italy, which destroyed Pompeii in 79 C.E. and threatens the modern city of Naples
  • Mount Rainier, southeast of the city of Seattle, Washington
  • Novarupta Volcano in Alaska, which could produce climate-changing ash eruptions
  • Mount Pinatubo in the Philippines, which has a history of violent eruptions -- and over twenty million people live less than a hundred kilometers from its summit
  • Mount Saint Helens -- famous for its 1980 eruption, this volcano has been rebuilding since then and still poses a significant threat
  • Mount Agung and Mount Merapi in Indonesia, part of the same volcanic arc that includes Krakatoa
  • Mount Fuji in Japan -- scarily close to Tokyo, one of the most densely populated cities in the world
The whole thing is kind of overwhelming to think about, especially given the question of what we could do about it if we knew a massive eruption was imminent.  Consider the failure of the United States government to act effectively prior to Hurricane Katrina in 2005 -- and there we had several days to do something, during which meteorologists correctly predicted the massive strengthening that would occur prior to landfall, and knew pretty accurately when and where it would occur.  With a volcanic eruption, generally geologists know one is coming at some point, but the ability to predict how big and exactly when is still speculative at best.

Imagine, for example, the reaction of the three-million-odd residents of Naples and its environs if the scientists said, "You need to evacuate the area, because there's going to be an eruption of some magnitude or another, some time in the next six months."

So the problems inherent in dealing with this threat are obvious, but (says the Mani et al. study) that's no reason to close our eyes to it, or to refuse to consider possible solutions that may seem outside the box.  "Directly affecting volcanic behavior may seem inconceivable, but so did the deflection of asteroids until the formation of the NASA Planetary Defense Coordination Office in 2016," Mani said.  "The risks of a massive eruption that devastates global society is significant.  The current underinvestment in responding to this risk is simply reckless."

****************************************


Thursday, August 4, 2022

What's bred in the bone

A friend of mine was chatting with me about irritating situations at work, and she mentioned that she'd really lost her cool with a supervisor the previous week who apparently is notorious for being a bit of an asshole.  I mentioned that I tend to put up with such nonsense and later wish I'd spoken up for myself -- that it has to be pretty bad before I'll blow up (at a supervisor or anyone else).

She laughed and said, "Of course I have a quick temper.  My family's Italian.  It's in our genes."

She was joking, of course, no more serious than my father was when he quipped that our family was "French enough to like to drink and Scottish enough not to know when to stop."  But it's a common enough view, isn't it?  We get our personality traits from some nebulous genetic heritage, despite the fact that a great many of us are pretty thorough mixtures of ancestry, and that all humans regardless of race or ethnicity are well over 99.9% similar anyhow.  As geneticist Kenneth Kidd put it, "Race is not biologically definable.  We are far too similar."

Ha.  Take that, racists.

[Image is in the Public Domain]

The whole thing gets complicated, however, because race and ethnicity certainly have a cultural reality, and that can certainly affect how your personality develops as you grow up.  If you're raised in a family where arguments are regularly settled through shouting and waving your arms around (apparently true in my friend's case), then you learn that as a standard of behavior.  (Or, sometimes, decide, "That was a miserable way to live, I'm never going to treat people that way," and swing to the opposite extreme.)  All of this is just meant to highlight that teasing apart the genetic components of behavior (and there certainly are some) from the learned ones is no simple task.

All of this just gained an additional complication with a study last week in the journal Social Cognition that looked at another factor contributing to our behavior -- how our notions about our genetic makeup influence how we think we should be acting.

The study, by Ryan Wheat and Matthew Vess (of Texas A & M) and Patricia Holte (of Wake Forest University), was simple enough.  They took a group of test subjects, gave them a (bogus) saliva test, and split the group in two.  The subjects were then given the "results" of what the sample supposedly said about their genetic makeup for a variety of characteristics.  The salient part, though, was that half were told that their genetic sample showed they had an unusually high propensity for risk-taking, and the other half were told their genes said they tended to avoid risk.

Afterward, they were given a personality test, in which only one thing was important: the questions that evaluated them for risk tolerance.  Across the board, the people who were told their genes predisposed them to taking risks scored higher on the risk-tolerance questions than did the people who were told their genes made them risk-averse.

So not only do we have how we were raised complicating any sort of understanding of the genetic component of human behavior, we have our subconscious conforming to our perception of how people with our genetic makeup are thought to behave.

So even if there is no Italian gene for quick temper, maybe my friend's short fuse comes from her belief that there is.

Coupled, of course, with having been raised in a shouty family.  The "nurture" side of "nature vs. nurture" is not inconsequential.  All the more reason that the debate over whether behavior is learned or innate has been going on for a century and still hasn't been decisively settled.

In any case, I better wrap this up.  I think I'm going to go get another cup of coffee.  It's a little early for a glass of red wine, and you know us people with French blood.  It's either one or the other.

****************************************


Tuesday, May 10, 2022

The cost of helplessness

The United Nations Office for Disaster Risk Reduction defines disaster to mean:

A serious disruption of the functioning of a community or a society at any scale due to hazardous events interacting with conditions of exposure, vulnerability and capacity, leading to one or more of the following: human, material, economic and environmental losses and impacts...  [Disasters] may test or exceed the capacity of a community or society to cope using its own resources, and therefore may require assistance from external sources, which could include neighbouring jurisdictions, or those at the national or international levels.

This comes up because the UNDRR just released its Global Assessment Report, which was (to put it mildly) not optimistic.  The rate of disasters (as defined) has been rising steadily; over the last two decades the world has averaged between 350 and 500 medium- to large-scale disasters a year, but at the current rate of increase we'll be up to an average of 560 by the year 2030.

That's 1.5 disasters a day.

The reason seems to be a combination of factors.  One, of course, is anthropogenic climate change, which is destabilizing the climate worldwide (as just one of many examples, the southeastern and midwestern United States are forecast to have record-breaking heat over the next three days, and summer hasn't even officially started yet).  Sea level rise is not only threatening coastlines; if it gets much worse (and there is no reason to think it won't), a number of island nations will simply cease to exist, with Tuvalu, Kiribati, Vanuatu, the Marshall Islands, and the Maldives topping the list.  The combined damage from all these disasters, especially in Asia and the Pacific, is predicted to cost affected nations 1.6% of their GDP every year.

You can't incur these kinds of costs and continue to function as a society.

South Tarawa Island, part of the nation of Kiribati [Image licensed under the Creative Commons Photo taken by Government of Kiribati employee in the course of their work, South Tarawa from the air, CC BY 3.0]

The UNDRR's report found that the primary culprits in our vulnerability were:

  • Optimism -- sure, we built our town on the side of a volcano, but I'm sure it won't erupt.
  • Underestimation -- if there's a flood, we'll get a bit of water in our basement, but we can manage that.
  • Invincibility -- we'll just ride this hurricane out, I'm not afraid of some wind and rain.

I think that's spot on, but I'd like to add three of my own:

  • Helplessness -- what can I do?  I'm just one person.  It doesn't matter if I continue to drive a gas-guzzler, because no one else is gonna give them up.
  • Corporate callousness and greed -- strip-mining the Amazon Basin produces valuable resources that are absolutely necessary for industry.
  • Media disinformation -- there's no such thing as human-caused climate change; Tucker Carlson said it was a myth made up by the radical Left.

Despite the odds, this is no time to give up and accept catastrophes as inevitable.  "The world needs to do more to incorporate disaster risk in how we live, build and invest, which is setting humanity on a spiral of self-destruction," said Amina J. Mohammed, Deputy Secretary-General of the United Nations.  "We must turn our collective complacency to action. Together we can slow the rate of preventable disasters as we work to deliver the Sustainable Development Goals for everyone, everywhere."

Which I certainly agree with in principle, but how do you put it into practice?  We've known about humanity's role in climate change, and the potential devastation it will wreak, for more than three decades.  I remember teaching students about it my first year as a public school teacher, which was 1986.  The people who have been the most vocal in advocating a global climate change policy -- my dear friend, the articulate and endlessly courageous Dr. Sandra Steingraber, comes to mind -- have been fighting a Sisyphean battle.

"Disasters can be prevented, but only if countries invest the time and resources to understand and reduce their risks," said Mami Mizutori, who heads the UNDRR.  "By deliberately ignoring risk and failing to integrate it in decision making, the world is effectively bankrolling its own destruction.  Critical sectors, from government to development and financial services, must urgently rethink how they perceive and address disaster risk."

Yes, but how?  Humanity is notorious for valuing short-term expediency and profit over long-term safety -- and even viability.  There are certainly days when I feel like I'm shouting into a vacuum; I've been ranting about environmental issues since I started Skeptophilia in 2011.  But giving up is exactly the wrong response, as tempting as it is some days.  Perhaps we don't know what positive effect we can have if we act, but we do know what positive effect we'll have if we throw our hands up and say, "To hell with it."

Zero.

I'll end with two quotes that I think are particularly apposite.

The first one comes from one of my personal heroes, Wangari Maathai, the amazing Kenyan activist, environmentalist, and women's rights advocate: "In order to accomplish anything, we must keep our feelings of empowerment ahead of our feelings of despair.  We cannot do everything, but still there are many things we can do."

And I'll give the last word to my friend Sandra: "We are all musicians in a great human orchestra, and it is now time to play the Save the World Symphony.  You are not required to play a solo, but you are required to know what instrument you hold and play it as well as you can.  You are required to find your place in the score.  What we love we must protect.  That's what love means.  From the right to know and the duty to inquire flows the obligation to act."

**************************************

Tuesday, September 7, 2021

Grace under pressure

In the 1992 Winter Olympics, there was an eighteen-year-old French figure skater named Laëtitia Hubert.  She was a wonderful skater, even by the stratospheric standards of the Olympics; she'd earned a silver medal at the French National Championships that year.  But 1992 was a year of hyperfocus on women's figure skating in particular, with such famous (and/or infamous) names as Nancy Kerrigan, Tonya Harding, Kristi Yamaguchi, Midori Ito, and Surya Bonaly competing.

What I remember best, though, is what happened to Laëtitia Hubert.  She went into the Short Program as a virtual unknown to just about everyone watching -- and skated a near-perfect program, rocketing her up to fifth place overall.  From her reaction afterward it seemed like she was more shocked at her fantastic performance than anyone.  It was one of those situations we've all had, where the stars align and everything goes way more brilliantly than expected -- only this was with the world watching, at one of the most publicized events of an already emotionally-fraught Winter Olympics.

This, of course, catapulted Hubert into competition with the Big Names.  She went into the Long Program up against skaters of world-wide fame.  And there, unlike the pure joy she showed during the Short Program, you could see the anxiety in her face even before she started.

She completely fell apart.  She had four disastrous falls, and various other stumbles and missteps.  It is the one and only time I've ever seen the camera cut away from an athlete mid-performance -- as if even the media couldn't bear to watch.  She dropped to, and ended at, fifteenth place overall.

It was simply awful to watch.  I've always hated seeing people fail at something; witnessing embarrassing situations is almost physically painful to me.  I don't really follow the Olympics (or sports in general), but nearly thirty years later, I still remember that night.  (To be fair to Hubert -- and to end the story on a happy note -- she went on to have a successful career as a competitive skater, earning medals at several national and international events; in fact, in 1997 she won a gold medal at the Trophée Lalique competition, bumping Olympic gold medalist Tara Lipinski into second place.)

I always think of Laëtitia Hubert whenever I think of the phenomenon of "choking under pressure."  It's a response that has been studied extensively by psychologists.  In fact, way back in 1908 a pair of psychologists, Robert Yerkes and John Dillingham Dodson, noted the peculiar relationship between pressure and performance in what is now called the Yerkes-Dodson curve: performance improves with increasing pressure (what Yerkes and Dodson called "mental and physiological arousal"), but only up to a point.  Too much pressure, and performance tanks.  There have been a number of reasons suggested for this effect, one of which is that it's related to the level of a group of chemicals in the blood called glucocorticoids.  The level of glucocorticoids in a person's blood has been shown to be positively correlated with long-term memory formation -- but just as with Yerkes-Dodson, only up to a point.  When the levels get too high, memory formation and retention crumble.  And glucocorticoid production has been found to rise in situations that have four characteristics -- those that are novel, unpredictable, contain social or emotional risks, and/or are largely outside of our capacity to control outcomes.

Which sounds like a pretty good description of the Olympics to me.
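
If you want to see the shape of the curve, here's a minimal sketch -- with the caveat that the Gaussian form and the numbers in it are my own illustrative assumptions, not anything Yerkes and Dodson actually fitted:

```python
import numpy as np

def performance(arousal, optimum=0.5, width=0.2):
    """Toy Yerkes-Dodson curve: performance peaks at moderate arousal
    and falls off on either side (the inverted U).  The Gaussian shape
    and the parameter values are illustrative, not fitted to data."""
    return np.exp(-((arousal - optimum) ** 2) / (2 * width ** 2))

# Crude text plot of the inverted U:
for a in np.linspace(0.0, 1.0, 11):
    print(f"arousal {a:.1f} | " + "#" * int(40 * performance(a)))
```

The only point of the toy model is the shape: up to the optimum, more pressure helps; past it, performance drops off fast.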

What's still mysterious about the Yerkes-Dodson curve, and the phenomenon of choking under pressure in general, is how it evolved.  How can a sudden drop in performance when the stress increases be selected for?  Seems like the more stressful and risky the situation, the better you should do.  You'd think the individuals who did choke when things got dangerous would be weeded out by (for example) hungry lions.

But what is curious -- and what brings the topic up today -- is that a study just published this week in Proceedings of the National Academy of Sciences showed that humans aren't the only ones who choke under pressure.

So do monkeys.

In a clever set of experiments led by Adam Smoulder of Carnegie Mellon University, researchers found that giving monkeys a scaled set of rewards for completing tasks showed a positive correlation between reward level and performance, until they got to the point where success at a difficult task resulted in a huge payoff.  And just like with humans, at that point, the monkeys' performance fell apart.

The authors describe the experiments as follows:

Monkeys initiated trials by placing their hand so that a cursor (red circle) fell within the start target (pale blue circle).  The reach target then appeared (gray circle with orange shape) at one of two (Monkeys N and F) or eight (Monkey E) potential locations (dashed circles), where the inscribed shape’s form (Monkey N) or color (Monkeys F and E) indicated the potential reward available for a successful reach.  After a short, variable delay period, the start target vanished, cueing the animal to reach the peripheral target.  The animals had to quickly move the cursor into the reach target and hold for 400 ms before receiving the cued reward.

And when the color (or shape) cueing the level of the reward got to the highest level -- something that only occurred in five percent of the trials, so not only was the jackpot valuable, it was rare -- the monkeys' ability to succeed dropped through the floor.  What is most curious about this is that the effect didn't go away with practice; even the monkeys who had spent a lot of time mastering the skill still did poorly when the stakes were highest.

So the choking-under-pressure phenomenon isn't limited to humans, indicating it has a long evolutionary history.  This also suggests that it's not due to overthinking, something that I've heard as an explanation -- that our tendency to intellectualize gets in the way.  That always seemed to make some sense to me, given my experience with musical performance and stage fright.  My capacity for screwing up on stage always seemed to be (1) unrelated to how much I'd practiced a piece of music once I'd passed a certain level of familiarity with it, and (2) directly connected to my own awareness of how nervous I was.  I did eventually get over the worst of my stage fright, mostly from just doing it again and again without spontaneously bursting into flame.  But I definitely still had moments when I'd think, "Oh, no, we're gonna play 'Reel St. Antoine' next and it's really hard and I'm gonna fuck it up AAAAUUUGGGH," and sure enough, that's when I would fuck it up.  Those moments when I somehow prevented my brain from going into overthink-mode, and just enjoyed the music, were far more likely to go well, regardless of the difficulty of the piece. 

One of my more nerve-wracking performances -- a duet with the amazing fiddler Deb Rifkin on a dizzyingly fast medley of Balkan dance tunes, in front of an audience of other musicians, including some big names (like the incomparable Bruce Molsky).  I have to add that (1) I didn't choke, and (2) Bruce, who may be famous but is an awfully nice guy, came up afterward and told us how great we sounded.  I still haven't quite recovered from that moment.

As an aside, a suggestion by a friend -- to take a shot of scotch before performing -- did not work.  Alcohol didn't make me less nervous, it just made me sloppier.  I have heard about professional musicians taking beta blockers before performing, but that's always seemed to me to be a little dicey, given that the mechanism by which beta blockers decrease anxiety is unknown, as are their long-term effects.  Also, I've heard more than one musician describe the playing of a performer on beta blockers as "soulless," as if the reduction in stress also takes away some of the intensity of emotional content we try to express in our playing.

Be that as it may, it's hard to imagine that a monkey's choking under pressure is due to the same kind of overthinking we tend to do.  They're smart animals, no question about it, but I've never thought of them as having the capacity for intellectualizing a situation we have (for better or worse).  So unless I'm wrong about that, and there's more self-reflection going on inside the monkey brain than I realize, there's something else going on here.

So that's our bit of curious psychological research of the day.  Monkeys also choke under pressure.  Now, it'd be nice to find a way to manage it that doesn't involve taking a mood-altering medication.  For me, it took years of exposure therapy to manage my stage fright, and I still have bouts of it sometimes even so.  It may be an evolutionarily-derived response that has a long history, and presumably some sort of beneficial function, but it certainly can be unpleasant at times.

*********************************

My friends know, as do regular readers of Skeptophilia, that I have a tendency toward swearing.

My prim and proper mom tried for years -- decades, really -- to break me of the habit.  "Bad language indicates you don't have the vocabulary to express yourself properly," she used to tell me.  But after many years, I finally came to the conclusion that there was nothing amiss with my vocabulary.  I simply found that in the right context, a pungent turn of phrase was entirely called for.

It can get away from you, of course, just like any habit.  I recall when I was in graduate school at the University of Washington in the 1980s that my fellow students were some of the hardest-drinking, hardest-partying, hardest-swearing people I've ever known.  (There was nothing wrong with their vocabularies, either.)  I came to find, though, that if every sentence is punctuated by a swear word, swear words lose their power, becoming no more than a less-appropriate version of "umm" and "uhh" and "like."

Anyhow, for those of you who are also fond of peppering your speech with spicy words, I have a book for you.  Science writer Emma Byrne has written a book called Swearing Is Good for You: The Amazing Science of Bad Language.  In it, you'll read about honest scientific studies that have shown that swearing decreases stress and improves pain tolerance -- and about fall-out-of-your-chair hilarious anecdotes like the chimpanzee who uses American Sign Language to swear at her keeper.

I guess our penchant for the ribald goes back a ways.

It's funny, thought-provoking, and will provide you with good ammunition the next time someone throws "swearing is an indication of low intelligence" at you.  

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]


Thursday, August 27, 2020

Rewarding the daredevil

There were three magic words that used to be able to induce me to do almost anything, regardless of how catastrophically stupid it was: "I dare you."

It's how I ended up walking the ridgeline of a friend's house when I was in eighth grade:
Friend: My house has such a steep roof.  I don't know how anyone could keep his balance up there.
Me:  I bet I could. 
Friend (dubiously):  You think? 
Me:  Yeah. 
Friend:  I dare you. 
Me:  Get me a ladder.
That I didn't break my neck was as much due to luck as skill, although it must be said that back then I did have a hell of a sense of balance, even if I didn't have much of any other kind of sense.

[Image licensed under the Creative Commons Øyvind Holmstad, A yellow house with a sheltering roof, CC BY-SA 3.0]

Research by neuroscientists Lei Zhang (University Medical Center Hamburg-Eppendorf) and Jan Gläscher (University of Vienna) has given us some insight into why I was prone to doing that sort of thing (beyond my parents' explanation, which boiled down to "you sure are an idiot").  Apparently the whole thing has to do with something called "reward prediction error" -- and they've identified the part of the brain where it occurs.

Reward prediction error occurs when there is a mismatch between the expected reward and the actual reward.  If expected reward occurs, prediction error is low, and you get some reinforcement via neurochemical release in the putamen and right temporoparietal junction, which form an important part of the brain's reward circuit.  A prediction error can go two ways: (1) the reward can be lower than the expectation, in which case you learn by changing your expectations; or (2) the reward can be higher than the expectation, in which case you get treated to a flood of endorphins.
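
In computational terms, this is usually expressed as a simple delta rule: the prediction error is the gap between the reward you got and the reward you expected, and your expectation shifts by some fraction of that gap.  Here's a minimal sketch -- the learning rate and the reward values are illustrative assumptions on my part, not numbers from the Zhang and Gläscher study:

```python
def update_expectation(expected, actual, learning_rate=0.3):
    """Delta-rule update: the expectation moves toward the actual
    outcome by a fraction (learning_rate) of the prediction error."""
    prediction_error = actual - expected  # positive = better than expected
    return expected + learning_rate * prediction_error, prediction_error

# I fully expected to fall off the roof (expected reward near zero),
# but I didn't (actual reward high):
new_expectation, rpe = update_expectation(expected=0.1, actual=1.0)
print(rpe)              # 0.9 -- a big positive error, hence the endorphin flood
print(new_expectation)  # 0.37 -- next "I dare you," success seems more likely
```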

Which explains my stupid roof-climbing behavior, and loads of other activities that begin with the words "hold my beer."  I wasn't nearly as fearless as I was acting; I fully expected to lose my balance and go tumbling down the roof.  When that didn't happen, and I came ambling back down the ladder afterward to the awed appreciation of my friend, I got a neurochemical bonus that nearly guaranteed that next time I heard "I dare you," I'd do the same thing again.

The structure of the researchers' experiment was interesting.  Here's how it was described in a press release in EurekAlert:
[The] researchers... placed groups of five volunteers in the same computer-based decision-making experiment, where each of them was presented with two abstract symbols.  Their objective was to find out which symbol would lead to more monetary rewards in the long run.  In each round of the experiment, every person first made a choice between the two symbols, and then they observed which symbols the other four people had selected; next, every person could decide to stick with their initial choice or switch to the alternative symbol.  Finally, a monetary outcome, either a win or a loss, was delivered to every one according to their second decision...  In fact, which symbol was related to more reward was always changing.  At the beginning of the experiment, one of the two symbols returned monetary rewards 70% of the time, and after a few rounds, it provided rewards only 30% of the time.  These changes took place multiple times throughout the experiment...  Expectedly, the volunteers switched more often when they were confronted with opposing choices from the others, but interestingly, the second choice (after considering social information) reflected the reward structure better than the first choice.
So social learning -- making your decisions according to your friends' behaviors and expectations -- is actually not a bad strategy.  "Direct learning is efficient in stable situations," said study co-author Jan Gläscher, "and when situations are changing and uncertain, social learning may play an important role together with direct learning to adapt to novel situations, such as deciding on the lunch menu at a new company."

Or deciding whether or not it's worth it to climb the roof of a friend's house.
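
The task structure, by the way, is essentially a two-armed bandit whose payoffs reverse partway through.  Here's a minimal simulation sketch of just the reward structure -- the 70%/30% probabilities come from the description above, but the round counts and the coin-flip "subject" are my own illustrative stand-ins:

```python
import random

def run_experiment(n_rounds=40, p_good=0.7, reversal_every=10):
    """Two symbols; one pays off 70% of the time, the other 30%.
    Which symbol is the good one silently reverses every few rounds."""
    good, wins = 0, 0
    for t in range(n_rounds):
        if t > 0 and t % reversal_every == 0:
            good = 1 - good                # the contingencies flip
        choice = random.randint(0, 1)      # stand-in for a subject's pick
        p_reward = p_good if choice == good else 1 - p_good
        wins += random.random() < p_reward
    return wins / n_rounds

print(run_experiment())  # a coin-flip chooser hovers around 0.5 no matter what
```

A solitary learner tracking reward rates is slow to notice each reversal; four other people's choices amount to extra data about the current reward structure, which is presumably why the volunteers' second choices tracked it better.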

We're social primates, so it's no surprise we rely a great deal on the members of our tribe for information about what we should and should not do.  This works well when we're looking to older and wiser individuals, and not so well when the other members of our tribe are just as dumb as we are.  (This latter bit explains a lot of the behavior we're currently seeing in the United States Senate.)  But our brains are built that way, for better or for worse.

Although for what it's worth, I no longer do ridiculous stunts when someone says "I dare you."  So if you were planning on trying it, don't get your hopes up.

*********************************

This week's Skeptophilia book recommendation is a brilliant retrospective of how we've come to our understanding of one of the fastest-moving scientific fields: genetics.

In Siddhartha Mukherjee's wonderful book The Gene: An Intimate History, we're taken from the first bit of research that suggested how inheritance took place: Gregor Mendel's famous study of pea plants that established a "unit of heredity" (he called them "factors" rather than "genes" or "alleles," but he got the basic idea spot on).  From there, he looks at how our understanding of heredity was refined -- from how DNA was identified as the chemical that houses genetic information, to how that information is encoded and translated, to cutting-edge research in gene-modification techniques like CRISPR-Cas9.  Along each step, he paints a very human picture of researchers striving to understand, many of them with inadequate tools and resources, finally leading up to today's fine-grained picture of how heredity works.

It's wonderful reading for anyone interested in genetics and the history of science.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]




Tuesday, August 14, 2018

Giving bad news to Pollyanna

Two months ago my younger son moved to Houston, Texas, for a new job, and although he's 27, this of course elicited all the usual parental worries from Carol and me.  But we submerged our nervousness, not to mention our awareness that this means we'll only see him once or twice a year at best, and helped him pack up and get on his way.

He spent his last night in New York at our house, and left on a sunny Sunday morning with hugs and good lucks and farewells.  Five hours later I got the kind of telephone call from him that no parent ever wants to get.

"Dad?  I need some help.  I was in an accident.  The wheel fell off my truck."

After I got past the say-whats and what-the-fucks, and returned my heart rate to as near normal as I could manage, I asked him for details.  The bare facts are as follows.

He was heading down I-90 at the obligatory seventy miles per hour, out in the hinterlands of Ohio, when there was a loud bang and his truck skidded to the right.  What apparently had happened was that three weeks earlier, when he was having the tires replaced, the mechanic had overtightened one of the bolts and cracked it.  At some point, the pressure made it give way, and the torque sheared off all four of the other bolts.

As luck would have it -- and believe me, there's a lot to credit luck with in this story -- the wheel went under his truck and got lodged, so he was skidding with the wheel and tire as padding.  He maneuvered his truck to the shoulder, miraculously without hitting anything or anyone, and without putting a scratch on his truck -- or himself.  But there he was, alongside the freeway ten miles from Ashtabula, wondering what the hell he was going to do.

The story ends happily enough; I called a tow truck and had him towed to a place where they botched a second repair job, but he figured that out before he'd gotten very far (believe me, now he's aware of every stray shudder or wobble), and we had him towed a second time to the Mazda dealership in Erie, Pennsylvania, where they completed the repair the right way.  The remainder of his journey to Houston was uneventful.

[Image is licensed under the Creative Commons Dual Freq, I-72 North of Seymour Illinois, CC BY-SA 3.0]

This all comes up because there's been a new study from University College London about our reactions to bad news, and how those reactions change when we're under stress.  The research team was made up of experimental psychologists Neil Garrett, Ana María González-Garzón, Lucy Foulkes, Liat Levita, and Tali Sharot (regular readers of Skeptophilia may recognize Sharot's name; she was part of a team that investigated why people find lying progressively less shame-inducing the more they do it, a study that I wrote about last year).

The Garrett et al. team's paper, "Updating Beliefs Under Perceived Threat," looked at why we are better at accepting positive news than negative.  It isn't, apparently, just wishful thinking, or resisting believing bad news.  The authors write:
Humans are better at integrating desirable information into their beliefs than undesirable.  This asymmetry poses an evolutionary puzzle, as it can lead to an underestimation of risk and thus failure to take precautionary action.  Here, we suggest a mechanism that can speak to this conundrum.  In particular, we show that the bias vanishes in response to perceived threat in the environment.  We report that an improvement in participants' tendency to incorporate bad news into their beliefs is associated with physiological arousal in response to threat indexed by galvanic skin response and self-reported anxiety.  This pattern of results was observed in a controlled laboratory setting (Experiment I), where perceived threat was manipulated, and in firefighters on duty (Experiment II), where it naturally varied.  Such flexibility in how individuals integrate information may enhance the likelihood of responding to warnings with caution in environments rife with threat, while maintaining a positivity bias otherwise, a strategy that can increase well-being.
In practice what they did was to induce anxiety in one group of their test subjects by telling them that as part of the experiment, they were going to have to give a public speech to a room full of listeners, and then asked them to estimate their risk of falling victim to a variety of dangers -- automobile accident, heart attack, homicide, and so on.  A second group (as the paragraph above explains) was simply exposed to anxiety-inducing situations naturally because of their job as firefighters, and then given the same questions.  Each of those two groups was again split in two; one half was given bad news (that the chance of their experiencing the negative events was higher than they thought), and the other good news (that the chance was lower than they thought).

The volunteers were then asked to re-estimate their odds of each of the occurrences.

And what they found was that the subjects who had experienced anxiety had no Pollyanna bias -- they were much more realistic about estimating their odds, and revised their estimates either upward or downward (depending on which response they'd been given).

More interesting were the people who were in a control group, and had not experienced anxiety.  The ones who were given good news readily revised their estimates of bad outcomes downward, but the ones given bad news barely budged.  It's as if they thought, "Hey, I'm feeling pretty good, I can't believe I was really that far off in estimating my risk."
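
One simple way to formalize that asymmetry is to give good news and bad news different learning rates.  The sketch below is only illustrative -- the rates are my own assumptions, not parameters estimated in the Garrett et al. paper:

```python
def revise_estimate(own_estimate, actual_risk, rate_good=0.6, rate_bad=0.1):
    """Move a risk estimate toward the feedback, but faster when the news
    is good (risk lower than feared) than when it's bad (risk higher)."""
    error = actual_risk - own_estimate
    rate = rate_bad if error > 0 else rate_good  # bad news barely registers
    return own_estimate + rate * error

print(revise_estimate(0.30, 0.10))  # good news: estimate falls from 0.30 to 0.18
print(revise_estimate(0.10, 0.30))  # bad news: estimate creeps from 0.10 to 0.12
```

In these terms, what the study found is that perceived threat roughly equalizes the two rates -- the anxious subjects updated about as readily in both directions.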

My question is whether this might be the origin of anxiety disorders, which are a little hard to explain evolutionarily otherwise.  They're terribly common, and can be debilitating.  Could this be some kind of evolutionary misfire -- that in the risk-filled environments our ancestors inhabited, keeping some background level of anxiety made us more realistic about our likelihood of harm?  And now that the world is a far safer place for many of us, that anxiety loses its benefit, and spirals out of control?

All of that is just speculation, of course.  But as far as what happened to my son, you'd be correct in surmising that it was not easy for me to hear.  My anxiety blew a hole through the roof, even though (1) he was fine, (2) his truck was fine, and (3) once we got him towed and the truck repaired, everything was likely to be fine.

I swear, I spent the next three days shaking.

Which, I guess, constitutes "integrating undesirable information."

In any case, the research by Garrett et al. gives us an interesting window into how induced anxiety alters our ability to modify our worldviews.  Myself, I'm just glad my son is settled in Houston and loves his new job.  It's not like this means I won't be anxious any more, but having one less thing to fret about is definitely a good thing.

*****************************

I picked this week's Skeptophilia book recommendation because of the devastating, and record-breaking, fires currently sweeping across the American west.  Tim Flannery's The Weather Makers is one of the most cogent arguments I've ever seen for the reality of climate change and what it might ultimately mean for the long-term habitability of planet Earth.  Flannery analyzes all the evidence available, building what would be an airtight case -- if it weren't for the fact that the economic implications have mobilized the corporate world to mount a disinformation campaign that, so far, seems to be working.  It's an eye-opening -- and essential -- read.

[If you purchase the book from Amazon using the image/link below, part of the proceeds goes to supporting Skeptophilia!]





Saturday, March 17, 2018

Preventing the unknown

Some days it's no great mystery why the general public is dubious about scientists.

I mean, a lot of it is the media, as I've discussed here at Skeptophilia ad nauseam.  But there are times that the scientists themselves put their best foot backward.  As an example, consider the announcement from the World Health Organization this week that their Research & Development Blueprint for priority diseases includes "Disease X."

A disease that is as-yet unidentified.

The blueprint itself says this:
Disease X represents the knowledge that a serious international epidemic could be caused by a pathogen currently unknown to cause human disease, and so the R&D Blueprint explicitly seeks to enable cross-cutting R&D preparedness that is also relevant for an unknown “Disease X” as far as possible.
On the one hand, there's a grain of sense there.  Recognizing that there are "emerging diseases" -- apparently new to humanity, and potentially capable of causing epidemics -- is the first step toward readying ourselves for when one of them appears.  (Recent examples are Ebola and Lassa fever, Marburg virus, Severe Acute Respiratory Syndrome (SARS), and chikungunya.)

The Ebola virus [image courtesy of the World Health Organization]

But still.  What the WHO is telling the public is that they're putting time and effort into preventing an epidemic from a disease that:
  • may not exist
  • if it does exist, has unknown symptoms, origins, and mode of transmission
  • may or may not be preventable
  • may or may not be treatable
  • may or may not be highly communicable
  • may or may not be carried by other animals
  • is of unknown duration and severity
Is it just me, or does this seem like an exercise in futility?

Like I said, an awareness of the unpredictability of disease outbreaks is a start, but this seems like trying to nail jello to the wall.  Each time humanity has been faced with a potential pandemic, we've had to study the disease and how it moves from one host to another, scramble to find treatments for the symptoms while we're searching for an actual cure (or better yet, a vaccine to prevent it), and do damage control in stricken areas.  So I can't see where the "Disease X" approach gets us, except to put everyone on red alert for an epidemic that may never happen.

I think my eye-rolling when I read about this comes from two sources.  First, I'm all too aware that life is risky, and although it's certainly laudable to reduce that risk as much as you can, the bare fact is that you can't remove it entirely.  After all, none of us is getting out of this place alive.  And second, there is an unavoidable chaotic element to what happens -- we get blindsided again and again by bizarre occurrences, and the professional prognosticators (not to mention the professional psychics) get it wrong at least as often as they get it right.

So there will probably be a new emerging epidemic eventually.  On a long enough time scale, there will probably be a true pandemic as well.  I hope that with our advances in medical research, we'll be able to respond in time to prevent a repeat of the Black Death or, worse, the Spanish flu epidemic of 1918-1919, which killed an estimated 40 million people (more than twice the battlefield casualties of World War I, which was happening at the same time).

In one sense, I take back what I said about not being able to do anything about it ahead of time.  We can give ourselves the best shot at mitigating the effects of an outbreak -- by funding medical research, and encouraging our best and brightest to go into science (i.e., education, a topic I've also rung the changes on more than once).  Other than that, I'm just going to eat right, exercise, and hope for the best.

Friday, August 11, 2017

Veterinarians and anti-vaxxers

Let's get something straight from the outset.

Vaccines don't cause autism.  They never have.  The "research" of Andrew Wakefield, which started that whole myth, was shown to be fraudulent years ago, and every study since then -- and there have been many -- has shown that vaccines have few side effects, that the vast majority of those are mild and temporary, and that their benefits far outweigh any risks they might engender.

And yes, that includes the two vaccines most often cited as being dangerous, MMR (Measles/Mumps/Rubella) and HPV (Human Papillomavirus).

This whole thing should have been laid to rest ages ago, but there's no idea so baseless and stupid that there won't be loads of people who believe it.  Which, I believe, largely explains the bizarre resurgence of the "Flat Earth" model, a claim so stupid that anyone who believes it apparently has a single Froot Loop where most of us have a brain.

But back to vaccines.  I've dealt with this topic here at Skeptophilia often enough that you might be wondering why I'm returning to it.  Well, the answer is that the anti-vaxx movement has now expanded its focus to a different target...

... pets.

[image courtesy of photographer Noël Zia Lee and the Wikimedia Commons]

I kid you not.  Veterinarians, especially in urban areas of the United States, are reporting an increasing number of pet owners who refuse to get their pets vaccinated.  Only one vaccine is mandated for dogs in the U.S. -- rabies -- but the others are critical for preventing devastating diseases.  The reason you hardly ever hear about a dog contracting (for example) canine distemper is that responsible dog owners have their dogs vaccinated against it.  The vaccine is nearly 100% effective, and (like virtually all vaccines) safe, with side effects that are rare and generally mild.

If your dog actually contracts distemper, however, he has a 50-50 chance of surviving it, even with the best veterinary care.

There's no question which option I take for my own dogs.

The anti-vaxxers, however, don't see it like this.  Recall that this is the group of people who believe that it's better to develop "natural immunity," meaning immunity from exposure to the actual pathogen.  If a child (or a pet) has a good diet and is otherwise healthy, they say, these infectious diseases aren't dangerous.  Thus the book Melanie's Marvelous Measles by Stephanie Messenger, which tells the story of little Melanie who is just thrilled to get measles and develop "natural immunity" rather than having to go through the ordeal of getting a vaccination.

For the record, I'm not making this book up.  Although I do find it heartening that of the 511 reviews it's gotten so far on Amazon, 74% of them are one-star.

The problem is twofold.  First, this "natural immunity" carries with it the risk of horrible complications from the disease itself, a few of which are shingles (chicken pox), sterility (mumps), blindness (measles), and birth defects (rubella).  That's if they don't kill you outright.  I have mentioned before my grandfather's two sisters, Marie Emelie and Anne Daisy, who died nine days apart of measles -- at the ages of 22 and 16, respectively.

The second problem is that it doesn't take all that many people choosing not to vaccinate to give infectious diseases a foothold.  Measles and mumps are both making comebacks; to return to the original topic of pets, so is distemper, to judge from a 2014 outbreak in Texas that resulted in 200 cases of the once-rare disease.
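
To put a rough number on "not all that many": the textbook herd-immunity threshold is 1 - 1/R0, where R0 is the average number of individuals (human or canine) that one infected individual goes on to infect in a fully susceptible population.  Here's a quick back-of-the-envelope sketch -- my own illustration, using the commonly quoted R0 range for measles rather than figures from any source cited here:

```python
# Herd-immunity threshold: the fraction of a population that must be
# immune before an outbreak can no longer sustain itself.

def herd_immunity_threshold(r0: float) -> float:
    """Classic threshold computed from the basic reproduction number R0."""
    return 1.0 - 1.0 / r0

# Measles is commonly quoted with an R0 somewhere around 12 to 18.
for r0 in (12.0, 18.0):
    print(f"R0 = {r0:.0f}: ~{herd_immunity_threshold(r0):.0%} must be immune")
# R0 = 12: ~92%; R0 = 18: ~94%.  A few percent of refusers is enough
# to drop coverage below that line.
```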

And why are people making this decision?  As with the anti-vaxxers who are refusing to vaccinate their children, these people are trying to protect their pets against some unspecified set of ostensible risk factors.  Stephanie Liff, a Brooklyn-based veterinarian, has reported that she has clients who elected not to vaccinate their dogs -- because they were afraid the dogs would become autistic.

"We've never diagnosed autism in a dog," Liff said.  "I don't think you could."

The bottom line here is that our pets, like our children, depend on us to make responsible decisions with regard to their health, safety, and welfare.  That people sometimes have loony ideas is unavoidable; but when those loony ideas start to endanger others -- including animals, who have no say in the matter -- it becomes reprehensible.