Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Thursday, June 8, 2017

False vacuum catastrophe

It's odd how enamored people are of things that could destroy the entire universe.

I mean, on one level I get it.  The sheer power of the natural world is pretty awe-inspiring, and as I've mentioned before, if I hadn't become a mild-mannered high school biology teacher, I definitely would have been a tornado chaser.  That same love of extreme danger (especially when it's not you experiencing it) explains shows like Deadliest Catch and the innumerable quasi-documentaries wherein divers swim around in chum-filled waters and still seem surprised when they're attacked by sharks.

But on a larger scale, there's a real curiosity about things that could wipe out pretty much everything.  A while back, I wrote a piece about people sounding gleeful that we might be looking down the gun barrel of a gamma-ray burster (we're not), and over and over we've heard alarmists suggesting that CERN was going to create a black hole that would eat the Earth (it's not).  But that doesn't begin to exhaust the ways in which we all could die in horrible agony.

Which brings us to the concept of the false vacuum.

Sounds harmless enough, doesn't it?  Well, this is in the long tradition of physicists giving seriously weird things cutesy names, like "strange quarks" and "glueballs."

The idea of the false vacuum is that the universe is currently in a "metastable state."  What this means is that right now we're in a locally stable configuration, but if something destabilizes us a little bit, we might find ourselves suddenly plunging into a more stable state -- a "true vacuum."  The situation, then, would be similar to that of the little ball in the graph below:


As long as nothing disturbs the status quo, the ball is stable; but if something gives it a push up the hill in the middle, it'll crest the hill and find itself rushing downward into a more stable position -- the "true vacuum."
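The ball-on-a-hill picture can be made concrete with a toy potential that has both a shallow local minimum (the false vacuum) and a deeper global one (the true vacuum).  A minimal sketch -- the function V(x) = x⁴ - 2x² + 0.3x is just an illustrative choice, not anything from the actual Standard Model calculation:

```python
# Toy double-well potential: a local ("false vacuum") minimum and a
# deeper global ("true vacuum") minimum.  Purely illustrative -- this
# is not the real Higgs effective potential.

def V(x):
    return x**4 - 2 * x**2 + 0.3 * x

# Crude grid search for the bottom of each well, one per side of the hump.
xs = [i / 1000 for i in range(-2000, 2001)]
false_vac = min((x for x in xs if x > 0), key=V)   # right-hand (shallow) well
true_vac = min((x for x in xs if x < 0), key=V)    # left-hand (deep) well

print(f"false vacuum: x = {false_vac:+.2f}, V = {V(false_vac):+.3f}")
print(f"true  vacuum: x = {true_vac:+.2f}, V = {V(true_vac):+.3f}")
# The ball sits stably in the right-hand well, but the left-hand well
# is lower: give it a big enough kick over the hump and down it goes.
```

The "metastable" part is exactly that gap: the ball is perfectly happy where it is until something pushes it over the hump.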

The reason this concerns anyone besides physicists is that if our region of the universe did reconfigure into a true vacuum, a bubble of the new state would form and expand outward at the speed of light, destroying everything in its path.

The Standard Model of particle physics suggests that from the masses of the Higgs boson and the top quark, an estimate can be made of just how likely this is.  Writer Robert Walker concludes, from the research of Joseph Lykken and others, that the answer is "not very:"
[I]f it could happen, then you’d expect it to have happened already in the first 1/10,000,000,000th of a second along with the other symmetry breaking when gravity split off from the other forces, when it was tremendously hot... 
Since that hasn’t happened, the false vacuum has to be very stable, or else, probably as we find new physics we find out that it is not in a false vacuum state at all. 
And yes, on the basis of the measured mass of the Higgs boson, the false vacuum has to be very stable.  Joseph Lykken says that an event that triggers a patch of true vacuum, if the theory is correct, happens on average once every 10,000, trillion, trillion, trillion, trillion, trillion, trillion, trillion, trillion years. 
That means it is nothing to be worried about.
Walker, who is a mathematician, says that the likelihood of a true vacuum bubble occurring in any given century is less than the likelihood of purchasing tickets for twelve consecutive Euromillions lotteries, and winning the jackpot for all of them.
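Walker's comparison is easy to check on the back of an envelope.  A quick sketch, assuming the publicly quoted EuroMillions jackpot odds of about 1 in 139,838,160 and taking Lykken's figure of once per 10,000 × trillion⁸ years (about 10¹⁰⁰ years) at face value:

```python
# Back-of-the-envelope check of Walker's comparison.  The EuroMillions
# odds (1 in 139,838,160) and Lykken's timescale (~1e100 years) are
# assumed from public figures, not derived here.

JACKPOT_ODDS = 1 / 139_838_160          # one EuroMillions jackpot
p_twelve_jackpots = JACKPOT_ODDS ** 12  # twelve consecutive wins

LYKKEN_TIMESCALE_YEARS = 1e4 * (1e12 ** 8)   # 10,000 x trillion^8 = 1e100
p_vacuum_per_century = 100 / LYKKEN_TIMESCALE_YEARS

print(f"P(12 consecutive jackpots)   ~ {p_twelve_jackpots:.1e}")
print(f"P(vacuum decay in a century) ~ {p_vacuum_per_century:.1e}")
# Both come out near 1e-98, with the lottery run the (slightly) more
# likely of the two -- presumably where Walker's comparison comes from.
```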

So "don't worry about it" seems to be an understatement.

However, that hasn't stopped the alarmists from freaking out about it, probably largely due to the fact that if it did happen, it would be pretty catastrophic.  Also, because a lot of them seem to feel that the physicists (for this, read "mad scientists") are actively trying to trigger the creation of a true vacuum, which would be an idiotic thing to do even if it were possible because they'd be the first ones to get vaporized, and wouldn't even have the pleasure of standing around rubbing their hands together and cackling maniacally for more than about a microsecond.

But then there are the ones who think that it could happen accidentally (again, because of CERN, of course), and the physicists are simply being reckless, not suicidal.  I tend to agree with Walker, though.  I'm way more worried about the idiotic things humans are currently doing to the environment, and our determination to slaughter each other over things like who has the best Invisible Friend, than I am about triggering the Scary Bubble of Death.

Anyhow.  That's our Terrifying Thing That Can Kill You for today, along with some soothing words about why it's not very likely.  Now you'll have to excuse me, because I'm gonna go have a pint of beer and watch Twister for the 17th time.

Wednesday, June 7, 2017

Liar liar

In my youth, I was quite an accomplished liar.

I say "accomplished" more to mean "I did it a lot" rather than "I did it well."  I honestly don't know how well I lied -- it might be that people in general didn't believe what I said and were simply too polite to call me out on it.  On the other hand, I did get away with a lot of stuff.  So apparently I was at least marginally successful.

What I lied about tended to be exaggerations about my past -- I rarely if ever lied out of malice.  But I felt my own circumstances to be boring and bland, a sense compounded by the fact that I've always suffered from serious social anxiety, so I think I felt as if building up a fictional persona who was interesting and adventurous might assuage my fear of being judged by the people I met.  Eventually, though, I realized that all I was doing was sabotaging the relationships I had, because once people found out I wasn't who I said I was, they'd be understandably pissed that I hadn't been straight with them.  So I dedicated myself to honesty, a commitment I've tried my hardest to keep ever since then.

On the other hand, I became a fiction writer, which means now I make up elaborate lies, write them down, and people pay to read them.  So maybe I haven't progressed as far as I'd thought.

Kang Lee and Victoria Talwar of the University of Toronto have been studying lying for some time, and they've found that the propensity of children to lie increases as they age.  Presumably, once they develop a sense of shame and better impulse control, they find themselves sheepish when they transgress, and lie to cover up their feelings or escape the consequences.  In a study in the International Journal of Behavioral Development, Lee and Talwar gave children of varying ages a task to perform while a music-playing toy sat behind them, and told them not to peek at the toy:
When the experimenter asked them whether they had peeked, about half of the 3-year-olds confessed to their transgression, whereas most older children lied.  Naive adult evaluators (undergraduate students and parents) who watched video clips of the children’s responses could not discriminate lie-tellers from nonliars on the basis of their nonverbal expressive behaviours.  However, the children were poor at semantic leakage control and adults could correctly identify most of the lie-tellers based on their verbal statements made in the same context as the lie.  The combined results regarding children’s verbal and nonverbal leakage control suggest that children under 8 years of age are not fully skilled lie-tellers.
Lee considers this behavior a completely normal part of social development, and in fact, says he worries about the 10% of older children in his study who could not be induced to lie -- because telling the truth 100% of the time, without regard for others' feelings or the consequences thereof, might not be the best thing, either.

But the tendency to lie doesn't vanish with adulthood.  A study by Robert Feldman, of the University of Massachusetts-Amherst, found that 60% of adults lied at least once during a ten-minute conversation.

"People tell a considerable number of lies in everyday conversation," Feldman said about his study.  "It was a very surprising result.  We didn't expect lying to be such a common part of daily life...  When they were watching themselves on videotape, people found themselves lying much more than they thought they had... It's so easy to lie.  We teach our children that honesty is the best policy, but we also tell them it's polite to pretend they like a birthday gift they've been given.  Kids get a very mixed message regarding the practical aspects of lying, and it has an impact on how they behave as adults."

Of course, not all lies are equally blameworthy.  Telling Aunt Bertha that the knitted sweater she made for your Christmas gift is lovely is probably better than saying, "Wow, that is one ugly-ass sweater, and I'm bringing it down to the Salvation Army as soon as I get a chance."

[image courtesy of Aunt Bertha and the Wikimedia Commons]

As for the kind of thing I did as a kid -- saying that I'd spent my summer vacation riding musk oxen in the Aleutian Islands -- it's kind of ridiculous and pointless, but other than distancing one from one's friends (as I described before) probably isn't really very high on the culpability scale, either.

But lying to hurt, lying for personal gain, lying to gain or retain power (I'm lookin' at you, Donald Trump) -- those are serious issues.

Unfortunately, however, even the less serious lies can cause problems, because small lies tend to lead to bigger ones.  A study by Tali Sharot of University College London found that the amygdala -- the brain structure that appears to mediate fear, shame, and anxiety -- actually fires less the more we lie.  The first lies we tell elicit a strong response; but we become habituated quickly.

The more we lie, the easier it gets.

So the old adage of "honesty is the best policy" really does seem to apply in most circumstances.

Unless, of course, you're a fiction writer.  Then the rules don't apply at all.  Now you'll have to excuse me, as I've got a herd of musk oxen to attend to.

Tuesday, June 6, 2017

The waking dream

Yesterday's post, about the generally bizarre nature of dream content, prompted a friend and loyal reader of Skeptophilia, the amazing writer A. J. Aalto, to send me a link to a study done a while back in Switzerland showing that our dream content sometimes forms a continuum with our waking experience.

The author and lead researcher, Sophie Schwartz of the Department of Neuroscience at the University of Geneva, did a clever study where volunteers were instructed to play the computer game Alpine Racer II, wherein the player stands on a movable platform that tracks his/her movements, while an avatar skis downhill on the computer screen.  To be successful in the game, the player has not only to exhibit balance, coordination, and motor skill, but to focus visually on the task and ignore any distractions.  Schwartz then had the players record their dream content, comparing it to people who had only watched the game, and control volunteers who had done an unrelated activity.


Schwartz writes:
After training on the Alpine Racer, 30% of spontaneous mentation collected at different times during pre-sleep wakefulness and light NREM sleep (up to 300 sec after sleep onset) contained imagery (of any modality, 24%) or thoughts (6%) related to the skiing game.  Wamsley et al. also found that imagery directly related to training on the game (unambiguous representations of the Alpine Racer or of skiing) declined across time.  This time-course was paralleled by a tendency for game-related incorporations to become more abstracted from the original experience.  These findings do not only provide empirical evidence for spontaneous memory replay during wakefulness and light NREM sleep (stages 1 and 2), but they show that reports of subjective experience offer valuable information about cognitive processes across changing brain states.
Schwartz acknowledges that the high rate of incorporation of skiing imagery into the players' dreams probably had to do with the degree of attention the game required:
High levels of incorporation of Alpine Racer are most plausibly related to the strong motivational and attentional involvement of the player during the game.  Consistent with this interpretation, a few participants who only observed those playing Alpine Racer also incorporated elements of the game into their sleep-onset mentation, at rates similar to the participants who were actively engaged in the game.  These effects and their time-course suggest that novelty may be a critical factor for the selection of material to be mentally replayed.  Moreover, many baseline night reports incorporated thought or imagery related to the game (compared to a control set of sleep-onset mentation reports), indicating that the mere anticipation of the task could trigger prospective memory processes that emerged at sleep onset.  It is tempting to speculate that hypnagogic imagery may contribute to the integration of recent experiences with long-term memories and future goals.
This is consistent with my wife's memories of being in graduate school and spending an inordinate amount of time avoiding doing her research by playing Tetris.  She realized she should probably stop when she started having dreams of brightly-colored blocks falling from the sky, and fortunately was able to curb her Tetris addiction before her adviser had to stage an intervention.

For myself, I can't say that I see a lot of incorporation of waking experience into my dreams.  Much of my dream content seems to fall squarely into the category of "What the fuck?", such as a recent dream wherein I was filling our bathtub with styrofoam peanuts, except they kept melting and running down the drain, which made even less sense when I looked up and realized that the bathtub wasn't in my house, it was in the middle of the Sahara Desert.

None of which, I can assure you without hesitation, was a continuation of anything I'd been doing that day.

I've also noticed a tendency in my more reality-based dreams to have more content with strong emotional charge than that with any connection to recent events.  I've been teaching for thirty years, and I still have frequent teaching-anxiety dreams -- that my students aren't listening or are misbehaving, that I get confused or off track during a lecture and can't remember what I'm supposed to be doing, even that I'm wandering around the halls in the school and can't find my classroom.  I also have dreams of losing loved ones or pets, dreams of witnessing violence, dreams of being trapped -- all of which have a powerful emotional content.

But I haven't noticed much tendency for my dream content to exhibit Schwartz's continuance from the waking state.  In fact, I can recall many times when I expected to dream about something -- when I've been involved all day in a project, or (especially) when I've watched a scary or emotionally powerful movie -- and it almost never happens.

So once more, we're back to dreams being mysterious, and any explanations we have regarding dream content being incomplete at best.  Which, of course, is part of their fascination.  I'll definitely be giving this topic more thought, once I've figured out what to do with all of these melted styrofoam peanuts.

Monday, June 5, 2017

Live your dream! Unless it's the one where you're naked on the bus.

Last night I had the strangest dream, but it wasn't about a girl in a black bikini (sorry if you're too young to get that reference).  One of my coworkers was going to be interviewed on public television by Yoko Ono.  I won't mention who the interviewee was, but trust me, if there was a list of people who were likely to be interviewed by Yoko Ono, this person would be near the bottom.  So anyway, I was being driven to this event by our school psychologist, but we were going to be late because he had the sudden overwhelming need to find a grocery store so he could buy a bag of potato chips.

I won't go any further into it, because at that point it started to get a little weird.

It is an open question why people dream, but virtually everyone does.  During the REM (rapid eye movement) stage of sleep, there are parts of the brain that are as active as they are during wakefulness.  This observation led brain scientists to call this stage "paradoxical sleep" -- paradoxical because while the body is usually very relaxed, the brain is firing like crazy.

Well, parts of it are.  While the visual and auditory centers are lighting up like a Christmas tree, your prefrontal cortex is snoozing in a deck chair.  The prefrontal cortex is your decision-making module and reality filter, and this at least partly explains why dreams seem so normal while you're in them but so bizarre when you wake up and your prefrontal cortex has a chance to reboot.

[image courtesy of the Wikimedia Commons]

The content of dreams has been a subject of speculation for years, and all available evidence indicates that the little "Your Dreams Interpreted" books you can buy in the supermarket checkout lines are unadulterated horse waste.  Apparently there is some thought that much of our dream content is involved with processing long-term memories; but equally plausible theories suggest that dreaming is a way of resetting our dopamine and serotonin receptors, or a way of decommissioning old neural pathways (so-called "parasitic nodes").  Probably, it aids in all three.  Whatever it is, however, it's important -- all mammal species tested undergo REM sleep, some for as much as eight hours a night.

Anyone who's a dog owner probably knew that already, of course.  Both of my dogs dream, as evidenced by their behavior while they're asleep.  My coonhound, Lena, has squirrel-chasing dreams, which makes sense because while she's awake two of her three operational brain cells are devoted to constant monitoring of our backyard squirrel population.  She'll be lying there, completely sacked out, then suddenly she'll woof softly under her breath, and her paws will twitch as if she were running after her prey.  Every once in a while she apparently catches one, because she'll go, "Rrrrrrr," and shake her head as if tearing a squirrel apart.

Grendel, on the other hand, tends to have happy, sweet dreams.  He'll twitch and sigh... and then his tail starts wagging.  Which is a top contender for the cutest thing I've ever seen in my life.

As far as human dreams go, it's interesting that there is a fairly consistent set of content types in dreams, regardless of your culture or background.  Some of the more common ones are dreams of falling, being chased, fighting, seeing someone who has died, having sexual experiences, being in a public place while inappropriately dressed, and being unable to attend interviews by Yoko Ono because of searching for potato chips.

A few well-documented but less common dreamlike experiences include lucid dreams (being aware that you're dreaming while it's happening), hypnagogic experiences (dreams in light sleep rather than REM), and night terrors (terrifying dreams during deep sleep).  Night terrors occur almost exclusively in children, and almost always cease entirely by age twelve.  My younger son had night terrors, and the first time it happened was truly one of the scariest things I've ever experienced.  At 11:30 one night he started shrieking hysterically, over and over.  I jumped out of bed and ran down the hall like a fury, to find him sitting bolt upright in bed, trembling, eyes wide open, and drenched with sweat.  I ran to him and said, "What's wrong?"  He pointed to an empty corner of the room and said, "It's staring at me!"

I should mention at this point that I had just recently watched the movie The Sixth Sense.

When I finished peeing my pants, I was able to pull myself together enough to realize that he was having a night terror, and that there were in fact no spirits of dead people staring at him from the corner of his bedroom.  When I got him calmed down, he went back into a deep sleep -- and the next morning remembered nothing at all.

I, on the other hand, required several months of therapy to recover completely.

Whatever purpose dreams and other associated phenomena serve, there is no evidence whatsoever that they are "supernatural" in any sense.  Precognitive dreams, for instance, most likely occur because you dream every night, about a relatively restricted number of types of events, and just by the law of large numbers at some point you'll probably dream something that will end up resembling a future event.  There is no mystical significance to the content of our dreams -- it is formed of our own thoughts and memories, both pleasant and unpleasant; our fears and desires and wishes, our emotions and knowledge; so they are at their base a reflection of the bits and pieces of who we are.   It's no wonder that they are funny, scary, weird, complex, erotic, disturbing, exhilarating, and perplexing, because we are all of those things.
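The law-of-large-numbers point about "precognitive" dreams is easy to illustrate with a little arithmetic.  A minimal sketch, under the made-up (and deliberately conservative) assumptions that dreams fall into about fifty broad themes and that you remember one dream per night:

```python
# Rough illustration of why chance alone produces "precognitive"
# dreams.  The numbers (50 dream themes, one remembered dream per
# night, 365 nights) are made-up assumptions for the arithmetic.

N_THEMES = 50   # broad categories a dream could fall into
NIGHTS = 365    # one year of remembered dreams

# Chance that any given night's dream happens to match the next
# day's events, if both are effectively drawn from the same themes:
p_match_one_night = 1 / N_THEMES

p_no_match_all_year = (1 - p_match_one_night) ** NIGHTS
p_at_least_one_hit = 1 - p_no_match_all_year

print(f"P(>= 1 'precognitive' dream per year) = {p_at_least_one_hit:.3f}")
# Even with these numbers, a chance "hit" in any given year is nearly
# certain -- no precognition required.
```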

So, next time you're in the midst of a crazy dream, you can be comforted by the fact that you are having an experience that is shared by all of humanity, and most other mammals as well.  What you're dreaming is no more significant, but also no more peculiar, than what the rest of us are dreaming.  Just sit back and enjoy the show.  And give my regards to Yoko Ono.

Saturday, June 3, 2017

Face card

I ran into an article in the New York Times a couple of days ago that begins with the line, "The brain has an amazing capacity for recognizing faces."

This made me snort derisively, because as I've mentioned before, I have prosopagnosia -- face blindness.  I'm not completely face blind, as the eminent writer and neurologist Oliver Sacks was -- Sacks, after all, didn't even recognize his own face in a mirror.  I'm not quite that badly off, but even so, I don't have anywhere near instantaneous facial recognition.  I compensate by being good at remembering voices, and paying attention to things like gait and stance.  Beyond that, I tend to remember people as lists of features -- he's the guy with the scar through one eyebrow, she's the one with black hair and three piercings in her left ear.  But it's a front-of-the-brain, conscious cognitive thing, not quick and subconscious like it (apparently) is with most people.

And even that strategy can fail, if someone changes hair styles, gets new glasses, or begins to dress differently.  Then I have to rely on my other strategies, as I did a couple of days ago in our local pharmacy.  The check-out clerk smiled at me, and I said hi and greeted her by name.  She was a former student who had taken my neuroscience class a couple of years ago, and she grinned at me and said, "I thought you didn't recognize people's faces."

"I don't," I said.  "You're wearing a name tag."

[image courtesy of the Wikimedia Commons]

Despite my scornful snort at the first line of the article in the Times, I was pretty interested in its content, not least because it gives me an insight into my own peculiar inability.  The article describes the research of Le Chang and Doris Y. Tsao (published this week in Cell), of Caltech, who, by recording from neurons in the face-selective patches of macaque brains, have begun to elucidate how the brain processes faces.  Chang and Tsao write:
Primates recognize complex objects such as faces with remarkable speed and reliability.  Here, we reveal the brain’s code for facial identity.  Experiments in macaques demonstrate an extraordinarily simple transformation between faces and responses of cells in face patches.  By formatting faces as points in a high-dimensional linear space, we discovered that each face cell’s firing rate is proportional to the projection of an incoming face stimulus onto a single axis in this space, allowing a face cell ensemble to encode the location of any face in the space.  Using this code, we could precisely decode faces from neural population responses and predict neural firing rates to faces.  Furthermore, this code disavows the long-standing assumption that face cells encode specific facial identities, confirmed by engineering faces with drastically different appearance that elicited identical responses in single face cells.  Our work suggests that other objects could be encoded by analogous metric coordinate systems.
Put more simply, the brain seems to encode facial recognition in a fairly small number of cells -- possibly as few as 10,000 -- which fire in a distinctive pattern depending on the deviation of the face being observed, on various metrics, from an "average" or "baseline" face.  This creates what Chang and Tsao call a "face space" -- a mapping between facial features and a set of firing patterns in the facial recognition module in the brain.

Chang and Tsao got so good at discerning the "face space" in a monkey's brain that they could tell which face photograph a monkey was looking at simply by watching which neurons fired!

What that means is that we don't have neurons devoted to particular faces; there is no "Jennifer Aniston cell," as the concept has often been called.  Instead, the brain responds to the dimensions and features of the face being observed and maps them into "face space," which allows us to uniquely identify a nearly infinite number of different faces.
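The linear "face space" code Chang and Tsao describe can be sketched in a few lines of linear algebra.  A hypothetical toy version -- the 50 dimensions roughly match the study's shape-and-appearance space, but the cell count and everything else here is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

DIMS = 50    # dimensions of the "face space" (roughly matching the study)
CELLS = 200  # number of simulated face cells (made-up for this sketch)

# Each cell has a preferred axis in face space; its firing rate is the
# projection of the face vector onto that axis -- the linear code the
# paper reports.
axes = rng.standard_normal((CELLS, DIMS))

face = rng.standard_normal(DIMS)   # one face, as a point in face space
rates = axes @ face                # one firing rate per cell

# Decoding: with more cells than dimensions, least squares recovers
# the face from the population response.
decoded, *_ = np.linalg.lstsq(axes, rates, rcond=None)
print("reconstruction error:", np.linalg.norm(decoded - face))
```

This is essentially why Chang and Tsao could read off which photograph a monkey was looking at from the firing pattern alone: a linear code is trivially invertible once you know the axes.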

Tsao suspects that there are other types of encoding in the brain that will turn out to work the same way.  "[There is in] neuroscience a sense of pessimism that the brain is similarly a black box," she said. "Our paper provides a counterexample.  We’re recording from neurons at the highest stage of the visual system and can see that there’s no black box.  My bet is that that will be true throughout the brain."

Which makes me wonder where this whole system is going wrong in my own brain.  I certainly see, and can recall, facial features; it is not (as I thought when I was younger) that I am simply inattentive or unobservant.  But somehow, even knowing features doesn't create any kind of recognizable image for me.  For people I know well, I could list off features -- round face, crooked nose, wavy brown hair, prominent chin -- but those don't come together in my brain into any sort of visual image.  The result is the odd situation that for people I know, I can often describe them, but I can't picture them at all.

So anyhow, if at some point I pass you on the street and don't say hi, or even make eye contact and have no reaction, I'm not being unfriendly, you haven't somehow pissed me off, and I'm not daydreaming.  I honestly don't know who you are.  It'd be nice if, like my former student, everyone went around wearing name tags, but failing that, I'll just have to keep muddling along in a sea of unfamiliar faces.

Friday, June 2, 2017

State of denial

My dad was talking about a public figure one time, and called the man "ignorant."  Then he looked thoughtful, and amended his assessment to "stupid."

I asked him what the difference was.

"Ignorance just means you don't know stuff," he explained.  "Ignorance can be cured.  Stupidity, on the other hand, means you're ignorant and you don't care.  Maybe you're even proud of it...  Put a different way, ignorance is only skin-deep.  Stupidity goes all the way to the bone."

Wise man, my dad.

I can't help but think that if he were alive today, he'd have applied the word "stupid" to the people currently determining the direction our country takes apropos of climate change.  There's a willfulness about the way they choose to ignore the consensus of close to 100% of trained, qualified climate scientists in favor of the self-serving nonsense coming from the fossil fuels industry (and the elected officials in their pay).

As urban designer Brent Toderian put it: "If 97% of structural engineers told you that a bridge was unsafe, would you still drive across it?"

That kind of argument doesn't resonate with the people currently running our government, unfortunately.  I woke up to the news yesterday morning (buried amongst hundreds of pieces speculating on the meaning of "covfefe") that Trump was almost certain to pull the United States out of the Paris Accord, and sure enough, yesterday afternoon Trump himself confirmed it.

Which, by the way, would throw us in with only two other countries in the world -- Syria and Nicaragua.

Because the leadership of those two countries is clearly what we want to emulate.

[image courtesy of the Wikimedia Commons]

But there's an added twist to the climate change denialism in the United States government, and that has come about because of the Trump administration's bizarre, if wildly successful, courting of the Religious Right.  Now, there is an increasing message coming from evangelical Christian politicians and spokespeople that okay, maybe the climate is changing, but we shouldn't worry about it...

... because god's gonna fix it.

I kid you not.  Let's start with Michigan Representative Tim Walberg, who said in a town hall meeting that he's not at all concerned:
I believe there’s climate change.  I believe there’s been climate change since the beginning of time.  I believe there are cycles.  Do I think man has some impact?  Yeah, of course.  Can man change the entire universe?  No. 
Why do I believe that?  Well, as a Christian, I believe that there is a creator in God who is much bigger than us.  And I’m confident that, if there’s a real problem, he can take care of it.
Okay, first, does this guy really think that scientists are saying that climate change will affect the entire universe?  Like, if we cut down the forests and pollute the atmosphere and burn up all the coal and oil here on Earth, some alien civilization in the Andromeda Galaxy will die a horrible death?  Because that goes way beyond stupid, into that rarefied atmosphere called "Holy fuck, that's idiotic."

But a deeper problem, of course, is that such a stance absolves us of any need to change our ways now.  We can continue to burn fossil fuels like there's no tomorrow, continue to give nothing more than lip service to renewable energy, continue to allow our elected officials to sit in the deep pockets of the petroleum industry.

Pretty convenient, that.

Then there's right-wing radio host Erick Erickson, who said pretty much the same thing in a series of tweets, which I string together here for the sake of space:
I worship Jesus, not Mother Earth.  He calls us all to be good stewards of the planet, but doesn't mean I have to care about global warming...  100000000% sure my kids will have a habitable planet.  This sort of hysteria is exactly why I couldn't care less about global warming...  The tweets of those upset with me on global warming have a religious fervor to them because by faith they believe so much of the doom&gloom...  Dammit, I'm gonna be drunk off the tears of people crying over the Paris Accord before my show starts.
What, do you think that the people who understand climate science want the Earth's ecosystems to destabilize?  Nutjobs like Erickson act as if coming to a conclusion and liking the conclusion are the same thing.  And now, we're supposed to take his "100000000%" assurance that everything is fine over the knowledge, expertise, and data of trained scientists?

In any case, don't worry about it, because Jesus.

Oh yeah, and liberal tears, har-de-har-har, and all that sorta stuff.

This kind of nonsense would be comical if it weren't for the fact that people like Walberg and Erickson are currently in the driver's seat with regards to our entire country's climate policy.  So that moves it from the "comical" column to the "scary" column.

Worse, it means that the people who are making decisions for us are not just ignorant, but willfully ignorant.  I.e., what my dad would have called "stupid."  And since stupidity is so seldom limited to one subject, that should be profoundly scary to all of us, because we're all going to have to live with the consequences of where these nimrods are dragging us.

Thursday, June 1, 2017

Going to the dogs

A week ago, I wrote about a fake academic paper (on the topic of how the "concept of the penis" is responsible for climate change, among other things) that got into an allegedly peer-reviewed journal.  Following up on that general train of thought, today we have: a dog who is on the review boards of not one, nor two, but seven medical journals.

In fact, this dog, a Staffordshire terrier named Olivia, is now listed (under the name "Olivia Doll") as an associate editor of the Global Journal of Addiction & Rehabilitation Medicine.  Olivia's CV is pretty intriguing; she lists among her "research interests" the "avian propinquity to canines in metropolitan suburbs" and "the benefits of abdominal massage for medium-sized canines."

[image courtesy of the Wikimedia Commons]

Which, you would think, would have been a dead giveaway.  If the people running the journals in question cared, which they probably don't.  Olivia's owner, Mike Daube, is a professor of health policy at Curtin University (Australia), and he signed Olivia up for her first position as reviewer as a joke, never expecting anyone to take it seriously.

They did.  And Olivia started getting mail from other journals, requesting her participation in reviewing papers.  The next thing Mike knew, Olivia Doll was listed as a reviewer for seven different medical journals.  (One of them lists Olivia as a member of the editorial board, and along with her biographical information Daube submitted a photograph of Kylie Minogue.  Even so, apparently people still didn't realize it was a joke, and Minogue's photo sits next to Olivia's CV on the webpage listing board members.)

"What makes it even more bizarre is that one of these journals has actually asked Ollie to review an article," Daube said in an interview with the Medical Journal of Australia’s InSight Magazine.  "The article was about nerve sheath tumors and how to treat them.  Some poor soul has actually written an article on this theme in good faith, and the journal has sent it to a dog to review...  Every academic gets several of these emails a day, from sham journals. They’re trying to take advantage of gullible younger academics, gullible researchers."

So all of this delivers another blow to public confidence in the peer review process.  Which is sad; my sense is that most of the time, peer review works just fine, and is the best thing around for winnowing out spurious results.  For the best academic journals -- Nature and Science come to mind -- the likelihood of a hoax paper getting past review, or someone unqualified (or even a different species) sneaking his/her way onto an editorial board is slim to none.

I get why Daube did what he did.  He was trying to point a finger (or paw, as the case may be) at predatory journals that will publish damn near anything if you pay them, and for which the review board is simply a list of names of random people.  But right now -- with a government administration here in the United States that is making a practice of ignoring and/or casting doubt on legitimate scientific research -- the last thing we need is something to make academics look like a bunch of gullible nimrods.

Which, of course, isn't Daube's fault; it's the fault of journals like the Global Journal of Addiction & Rehabilitation Medicine.  Daube is simply acting as a whistleblower, assisted by his faithful hound.  Even so, I still couldn't help but wince when I read this.  I can just hear the next salvo from people like Senator James "Snowball" Inhofe: "Why the hell should we listen to scientists?  Their research gets reviewed by dogs."

So it'll be interesting to see where this goes.  As of the writing of this post, Olivia is still listed as an editor and reviewer for seven journals, further reinforcing my sense that the journals in question don't give a damn who is on their review staff.  As far as Olivia goes, I hope that she's getting well rewarded for her service to the academic world.  Maybe Daube can list her as a graduate student, and have her doggie biscuits paid for by his research grants.