Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Tuesday, June 6, 2017

The waking dream

Yesterday's post, about the generally bizarre nature of dream content, prompted a friend and loyal reader of Skeptophilia, the amazing writer A. J. Aalto, to send me a link to a study done a while back in Switzerland that showed that our dream content sometimes forms a continuum with our waking experience.

The author, Sophie Schwartz of the Department of Neuroscience at the University of Geneva, describes a clever study (run by Wamsley and colleagues, whom she cites) in which volunteers were instructed to play the computer game Alpine Racer II, wherein the player stands on a movable platform that tracks his or her movements while an avatar skis downhill on the screen.  To be successful in the game, the player must not only exhibit balance, coordination, and motor skill, but also focus visually on the task and ignore any distractions.  The researchers then had the players record their dream content, comparing it to that of people who had only watched the game, and of control volunteers who had done an unrelated activity.


Schwartz writes:
After training on the Alpine Racer, 30% of spontaneous mentation collected at different times during pre-sleep wakefulness and light NREM sleep (up to 300 sec after sleep onset) contained imagery (of any modality, 24%) or thoughts (6%) related to the skiing game.  Wamsley et al. also found that imagery directly related to training on the game (unambiguous representations of the Alpine Racer or of skiing) declined across time.  This time-course was paralleled by a tendency for game-related incorporations to become more abstracted from the original experience.  These findings do not only provide empirical evidence for spontaneous memory replay during wakefulness and light NREM sleep (stages 1 and 2), but they show that reports of subjective experience offer valuable information about cognitive processes across changing brain states.
Schwartz acknowledges that the high rate of incorporation of skiing imagery into the players' dreams probably had to do with the degree of attention the game required:
High levels of incorporation of Alpine Racer are most plausibly related to the strong motivational and attentional involvement of the player during the game.  Consistent with this interpretation, a few participants who only observed those playing Alpine Racer also incorporated elements of the game into their sleep-onset mentation, at rates similar to the participants who were actively engaged in the game.  These effects and their time-course suggest that novelty may be a critical factor for the selection of material to be mentally replayed.  Moreover, many baseline night reports incorporated thought or imagery related to the game (compared to a control set of sleep-onset mentation reports), indicating that the mere anticipation of the task could trigger prospective memory processes that emerged at sleep onset.  It is tempting to speculate that hypnagogic imagery may contribute to the integration of recent experiences with long-term memories and future goals.
This is consistent with my wife's memories of being in graduate school and spending an inordinate amount of time avoiding doing her research by playing Tetris.  She realized she should probably stop when she started having dreams of brightly-colored blocks falling from the sky, and fortunately was able to curb her Tetris addiction before her adviser had to stage an intervention.

For myself, I can't say that I see a lot of incorporation of waking experience into my dreams.  Much of my dream content seems to fall squarely into the category of "What the fuck?", such as a recent dream wherein I was filling our bathtub with styrofoam peanuts, except they kept melting and running down the drain, which made even less sense when I looked up and realized that the bathtub wasn't in my house, it was in the middle of the Sahara Desert.

None of which, I can assure you without hesitation, was a continuation of anything I'd been doing that day.

I've also noticed a tendency in my more reality-based dreams to have more content with strong emotional charge than that with any connection to recent events.  I've been teaching for thirty years, and I still have frequent teaching-anxiety dreams -- that my students aren't listening or are misbehaving, that I get confused or off track during a lecture and can't remember what I'm supposed to be doing, even that I'm wandering around the halls in the school and can't find my classroom.  I also have dreams of losing loved ones or pets, dreams of witnessing violence, dreams of being trapped -- all of which have a powerful emotional content.

But I haven't noticed much tendency for my dream content to exhibit Schwartz's continuance from the waking state.  In fact, I can recall many times when I expected to dream about something -- when I've been involved all day in a project, or (especially) when I've watched a scary or emotionally powerful movie -- and it almost never happens.

So once more, we're back to dreams being mysterious, and any explanations we have regarding dream content being incomplete at best.  Which, of course, is part of their fascination.  I'll definitely be giving this topic more thought, once I've figured out what to do with all of these melted styrofoam peanuts.

Monday, June 5, 2017

Live your dream! Unless it's the one where you're naked on the bus.

Last night I had the strangest dream, but it wasn't about a girl in a black bikini (sorry if you're too young to get that reference).  One of my coworkers was going to be interviewed on public television by Yoko Ono.  I won't mention who the interviewee was, but trust me, if there was a list of people who were likely to be interviewed by Yoko Ono, this person would be near the bottom.  So anyway, I was being driven to this event by our school psychologist, but we were going to be late because he had the sudden overwhelming need to find a grocery store so he could buy a bag of potato chips.

I won't go any further into it, because at that point it started to get a little weird.

It is an open question why people dream, but virtually everyone does.  During the REM (rapid eye movement) stage of sleep, there are parts of the brain that are as active as they are during wakefulness.  This observation led brain scientists to call this stage "paradoxical sleep" -- paradoxical because while the body is usually very relaxed, the brain is firing like crazy.

Well, parts of it are.  While the visual and auditory centers are lighting up like a Christmas tree, your prefrontal cortex is snoozing in a deck chair.  The prefrontal cortex is your decision-making module and reality filter, and this at least partly explains why dreams seem so normal while you're in them but so bizarre when you wake up and your prefrontal cortex has a chance to reboot.

[image courtesy of the Wikimedia Commons]

The content of dreams has been a subject of speculation for years, and all available evidence indicates that the little "Your Dreams Interpreted" books you can buy in the supermarket checkout lines are unadulterated horse waste.  Apparently there is some thought that much of our dream content is involved with processing long-term memories; but equally plausible theories suggest that dreaming is a way of resetting our dopamine and serotonin receptors, or a way of decommissioning old neural pathways (so-called "parasitic nodes").  Probably, it aids in all three.  Whatever it is, however, it's important -- all mammal species tested undergo REM sleep, some for as much as eight hours a night.

Anyone who's a dog owner probably knew that already, of course.  Both of my dogs dream, as evidenced by their behavior while they're asleep.  My coonhound, Lena, has squirrel-chasing dreams, which makes sense because while she's awake two of her three operational brain cells are devoted to constant monitoring of our backyard squirrel population.  She'll be lying there, completely sacked out, then suddenly she'll woof softly under her breath, and her paws will twitch as if she were running after her prey.  Every once in a while she apparently catches one, because she'll go, "Rrrrrrr," and shake her head as if tearing a squirrel apart.

Grendel, on the other hand, tends to have happy, sweet dreams.  He'll twitch and sigh... and then his tail starts wagging.  Which is a top contender for the cutest thing I've ever seen in my life.

As far as human dreams go, it's interesting that there is a fairly consistent set of content types in dreams, regardless of your culture or background.  Some of the more common ones are dreams of falling, being chased, fighting, seeing someone who has died, having sexual experiences, being in a public place while inappropriately dressed, and being unable to attend interviews by Yoko Ono because of searching for potato chips.

A few well-documented but less common dreamlike experiences include lucid dreams (being aware that you're dreaming while it's happening), hypnagogic experiences (dreams in light sleep rather than REM), and night terrors (terrifying dreams during deep sleep).  Night terrors occur almost exclusively in children, and almost always cease entirely by age twelve.  My younger son had night terrors, and the first time it happened was truly one of the scariest things I've ever experienced.  At 11:30 one night he started shrieking hysterically, over and over.  I jumped out of bed and ran down the hall like a fury, to find him sitting bolt upright in bed, trembling, eyes wide open, and drenched with sweat.  I ran to him and said, "What's wrong?"  He pointed to an empty corner of the room and said, "It's staring at me!"

I should mention at this point that I had just recently watched the movie The Sixth Sense.

When I finished peeing my pants, I was able to pull myself together enough to realize that he was having a night terror, and that there were in fact no spirits of dead people staring at him from the corner of his bedroom.  When I got him calmed down, he went back into a deep sleep -- and the next morning remembered nothing at all.

I, on the other hand, required several months of therapy to recover completely.

Whatever purpose dreams and other associated phenomena serve, there is no evidence whatsoever that they are "supernatural" in any sense.  Precognitive dreams, for instance, most likely occur because you dream every night, about a relatively restricted number of types of events, and just by the law of large numbers at some point you'll probably dream something that will end up resembling a future event.  There is no mystical significance to the content of our dreams -- it is formed of our own thoughts and memories, both pleasant and unpleasant; our fears and desires and wishes, our emotions and knowledge; so they are at their base a reflection of the bits and pieces of who we are.   It's no wonder that they are funny, scary, weird, complex, erotic, disturbing, exhilarating, and perplexing, because we are all of those things.
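The law-of-large-numbers argument is easy to make concrete with a toy simulation.  All the numbers below -- the size of the dream repertoire, the dreams per night -- are invented purely for illustration:

```python
import random

random.seed(42)

EVENT_TYPES = 200        # assumed size of the repertoire of dreamable events
NIGHTS = 365 * 20        # twenty years of dreaming
DREAMS_PER_NIGHT = 4     # rough number of remembered dream episodes per night

# Each night, draw the remembered dreams and the next day's salient event
# independently from the same restricted repertoire; count how often a
# dream happens to "predict" the following day's event by pure chance.
hits = 0
for _ in range(NIGHTS):
    dreams = {random.randrange(EVENT_TYPES) for _ in range(DREAMS_PER_NIGHT)}
    tomorrow = random.randrange(EVENT_TYPES)
    if tomorrow in dreams:
        hits += 1

print(hits, "chance 'precognitive' dreams in twenty years")
```

Even with these modest made-up numbers, the expected hit count is roughly NIGHTS × (1 − (1 − 1/EVENT_TYPES)^DREAMS_PER_NIGHT) ≈ 145 -- well over a hundred "prophetic" dreams per dreamer, no mysticism required.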

So, next time you're in the midst of a crazy dream, you can be comforted by the fact that you are having an experience that is shared by all of humanity, and most other mammals as well.  What you're dreaming is no more significant, but also no more peculiar, than what the rest of us are dreaming.  Just sit back and enjoy the show.  And give my regards to Yoko Ono.

Saturday, June 3, 2017

Face card

I ran into an article in the New York Times a couple of days ago that begins with the line, "The brain has an amazing capacity for recognizing faces."

This made me snort derisively, because as I've mentioned before, I have prosopagnosia -- face blindness.  I'm not completely face blind, as the eminent writer and neurologist Oliver Sacks was -- Sacks, after all, didn't even recognize his own face in a mirror.  I'm not quite that badly off, but even so, I don't have anywhere near instantaneous facial recognition.  I compensate by being good at remembering voices, and paying attention to things like gait and stance.  Beyond that, I tend to remember people as lists of features -- he's the guy with the scar through one eyebrow, she's the one with black hair and three piercings in her left ear.  But it's a front-of-the-brain, conscious cognitive thing, not quick and subconscious like it (apparently) is with most people.

And even that strategy can fail, if someone changes hair styles, gets new glasses, or begins to dress differently.  Then I have to rely on my other strategies, as I did a couple of days ago in our local pharmacy.  The check-out clerk smiled at me, and I said hi and greeted her by name.  She was a former student who had taken my neuroscience class a couple of years ago, and she grinned at me and said, "I thought you didn't recognize people's faces."

"I don't," I said.  "You're wearing a name tag."

[image courtesy of the Wikimedia Commons]

Despite my scornful snort at the first line of the article in the Times, I was pretty interested in its content, not least because it gives me an insight into my own peculiar inability.  The article describes the research of Le Chang and Doris Y. Tsao of Caltech (published this week in Cell), who used fMRI to locate the face-processing regions in monkeys' brains and then recorded from individual neurons there, and in doing so have begun to elucidate how the brain encodes faces.  Chang and Tsao write:
Primates recognize complex objects such as faces with remarkable speed and reliability.  Here, we reveal the brain’s code for facial identity.  Experiments in macaques demonstrate an extraordinarily simple transformation between faces and responses of cells in face patches.  By formatting faces as points in a high-dimensional linear space, we discovered that each face cell’s firing rate is proportional to the projection of an incoming face stimulus onto a single axis in this space, allowing a face cell ensemble to encode the location of any face in the space.  Using this code, we could precisely decode faces from neural population responses and predict neural firing rates to faces.  Furthermore, this code disavows the long-standing assumption that face cells encode specific facial identities, confirmed by engineering faces with drastically different appearance that elicited identical responses in single face cells.  Our work suggests that other objects could be encoded by analogous metric coordinate systems.
Put more simply, the brain seems to encode facial recognition in a fairly small number of cells -- possibly as few as 10,000 -- which fire in a distinctive pattern depending on how the face being observed deviates, along various metrics, from an "average" or "baseline" face.  This creates what Chang and Tsao call a "face space" -- a mapping between facial features and a set of firing patterns in the facial recognition module in the brain.

Chang and Tsao got so good at discerning the "face space" in a monkey's brain that they could tell which face photograph a monkey was looking at simply by watching which neurons fired!

So what that means is that we don't have neurons devoted to particular faces; there is no "Jennifer Aniston cell," as the concept has often been called.  We simply respond to the dimensions and features of the face we're observing and map them into "face space," and that allows us to uniquely identify a nearly infinite number of different faces.
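For the curious, here is a minimal numerical sketch of the kind of linear population code the paper describes.  The dimensions, cell count, and noise level are invented, and real face cells are vastly messier, but it shows why a "firing rate = projection onto an axis" code can be decoded so precisely:

```python
import numpy as np

rng = np.random.default_rng(0)

n_dims = 50      # dimensions of the hypothetical "face space"
n_cells = 200    # model face cells (far fewer than a real brain's)

# Each cell has a preferred axis; its firing rate is the projection of
# the face vector onto that axis, plus a little noise.
axes = rng.normal(size=(n_cells, n_dims))

def population_response(face, noise=0.01):
    return axes @ face + noise * rng.normal(size=n_cells)

# Encode a face, then decode it back from the population response by
# least squares -- the sort of "precise decoding" the paper reports.
face = rng.normal(size=n_dims)
rates = population_response(face)
decoded, *_ = np.linalg.lstsq(axes, rates, rcond=None)

print(np.allclose(face, decoded, atol=0.05))  # True: the linear code is invertible
```

The key point the sketch illustrates: because the code is linear, a modest ensemble of cells suffices to pin down any face in the space, with no need for one dedicated cell per identity.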

Tsao suspects that there are other types of encoding in the brain that will turn out to work the same way.  "[There is in] neuroscience a sense of pessimism that the brain is similarly a black box," she said. "Our paper provides a counterexample.  We’re recording from neurons at the highest stage of the visual system and can see that there’s no black box.  My bet is that that will be true throughout the brain."

Which makes me wonder where this whole system is going wrong in my own brain.  I certainly see, and can recall, facial features; it is not (as I thought when I was younger) that I am simply inattentive or unobservant.  But somehow, even knowing features doesn't create any kind of recognizable image for me.  For people I know well, I could list off features -- round face, crooked nose, wavy brown hair, prominent chin -- but those don't come together in my brain into any sort of visual image.  The result is the odd situation that for people I know, I can often describe them, but I can't picture them at all.

So anyhow, if at some point I pass you on the street and don't say hi, or even make eye contact and have no reaction, I'm not being unfriendly, you haven't somehow pissed me off, and I'm not daydreaming.  I honestly don't know who you are.  It'd be nice if, like my former student, everyone went around wearing name tags, but failing that, I'll just have to keep muddling along in a sea of unfamiliar faces.

Friday, June 2, 2017

State of denial

My dad was talking about a public figure one time, and called the man "ignorant."  Then he looked thoughtful, and amended his assessment to "stupid."

I asked him what the difference was.

"Ignorance just means you don't know stuff," he explained.  "Ignorance can be cured.  Stupidity, on the other hand, means you're ignorant and you don't care.  Maybe you're even proud of it...  Put a different way, ignorance is only skin-deep.  Stupidity goes all the way to the bone."

Wise man, my dad.

I can't help but think that if he were alive today, he'd have applied the word "stupid" to the people currently determining the direction our country takes apropos of climate change.  There's a willfulness about the way they choose to ignore the consensus of close to 100% of trained, qualified climate scientists in favor of the self-serving nonsense coming from the fossil fuels industry (and the elected officials in their pay).

As urban designer Brent Toderian put it: "If 97% of structural engineers told you that a bridge was unsafe, would you still drive across it?"

That kind of argument doesn't resonate with the people currently running our government, unfortunately.  I woke up to the news yesterday morning (buried amongst hundreds of pieces speculating on the meaning of "covfefe") that Trump was almost certain to pull the United States out of the Paris Accord, and sure enough, yesterday afternoon Trump himself confirmed it.

Which, by the way, would throw us in with only two other countries in the world -- Syria and Nicaragua.

Because the leadership of those two countries is clearly what we want to emulate.

[image courtesy of the Wikimedia Commons]

But there's an added twist to the climate change denialism in the United States government, and that has come about because of the Trump administration's bizarre, if wildly successful, courting of the Religious Right.  Now, there is an increasing message coming from evangelical Christian politicians and spokespeople that okay, maybe the climate is changing, but we shouldn't worry about it...

... because god's gonna fix it.

I kid you not.  Let's start with Michigan Representative Tim Walberg, who said in a town hall meeting that he's not at all concerned:
I believe there’s climate change.  I believe there’s been climate change since the beginning of time.  I believe there are cycles.  Do I think man has some impact?  Yeah, of course.  Can man change the entire universe?  No. 
Why do I believe that?  Well, as a Christian, I believe that there is a creator in God who is much bigger than us.  And I’m confident that, if there’s a real problem, he can take care of it.
Okay, first, does this guy really think that scientists are saying that climate change will affect the entire universe?  Like, if we cut down the forests and pollute the atmosphere and burn up all the coal and oil here on Earth, some alien civilization in the Andromeda Galaxy will die a horrible death?  Because that goes way beyond stupid, into that rarefied atmosphere called "Holy fuck, that's idiotic."

But a deeper problem, of course, is that such a stance absolves us of any need to change our ways now.  We can continue to burn fossil fuels like there's no tomorrow, continue to give nothing more than lip service to renewable energy, continue to allow our elected officials to sit in the deep pockets of the petroleum industry.

Pretty convenient, that.

Then there's right-wing radio host Erick Erickson, who said pretty much the same thing in a series of tweets, which I string together here for the sake of space:
I worship Jesus, not Mother Earth.  He calls us all to be good stewards of the planet, but doesn't mean I have to care about global warming...  100000000% sure my kids will have a habitable planet.  This sort of hysteria is exactly why I couldn't care less about global warming...  The tweets of those upset with me on global warming have a religious fervor to them because by faith they believe so much of the doom&gloom...  Dammit, I'm gonna be drunk off the tears of people crying over the Paris Accord before my show starts.
What, do you think that the people who understand climate science want the Earth's ecosystems to destabilize?  Nutjobs like Erickson act as if coming to a conclusion and liking the conclusion are the same thing.  And now, we're supposed to take his "100000000%" assurance that everything is fine over the knowledge, expertise, and data of trained scientists?

In any case, don't worry about it, because Jesus.

Oh yeah, and liberal tears, har-de-har-har, and all that sorta stuff.

This kind of nonsense would be comical if it wasn't for the fact that people like Walberg and Erickson are currently in the driver's seat with regards to our entire country's climate policy.  So that moves it from the "comical" column to the "scary" column.

Worst, it means that the people who are making decisions for us are not just ignorant, but willfully ignorant.  I.e., what my dad would have called "stupid."  And since stupidity is so seldom limited to one subject, that should be profoundly scary to all of us, because we're all going to have to live with the consequences of where these nimrods are dragging us.

Thursday, June 1, 2017

Going to the dogs

A week ago, I wrote about a fake academic paper (on the topic of how the "concept of the penis" is responsible for climate change, among other things) that got into an allegedly peer-reviewed journal.  Following up on that general train of thought, today we have: a dog who is on the review boards of not one, nor two, but seven medical journals.

In fact, this dog, a Staffordshire terrier whose name is Olivia, is now listed (under the name "Olivia Doll") as an associate editor of the Global Journal of Addiction & Rehabilitation Medicine.  Olivia's CV is pretty intriguing; she lists under "research interests" the "avian propinquity to canines in metropolitan suburbs" and "the benefits of abdominal massage for medium-sized canines."

[image courtesy of the Wikimedia Commons]

Which, you would think, would have been a dead giveaway.  If the people running the journals in question cared, which they probably don't.  Olivia's owner, Mike Daube, is a professor of health policy at Curtin University in Australia, and he signed Olivia up for her first position as reviewer as a joke, never expecting anyone to take it seriously.

They did.  And Olivia started getting mail from other journals, requesting her participation in reviewing papers.  Next thing Mike knew, Olivia Doll was listed as a reviewer for seven different medical journals.  (One of them lists Olivia as a member of the editorial board, and with her biographical information Daube submitted a photograph of Kylie Minogue.  Even so, apparently people still didn't realize that it was a joke, and Minogue's photo is next to Olivia's CV on the webpage listing board members.)

"What makes it even more bizarre is that one of these journals has actually asked Ollie to review an article," Daube said in an interview with the Medical Journal of Australia’s InSight Magazine.  "The article was about nerve sheath tumors and how to treat them.  Some poor soul has actually written an article on this theme in good faith, and the journal has sent it to a dog to review...  Every academic gets several of these emails a day, from sham journals. They’re trying to take advantage of gullible younger academics, gullible researchers."

So all of this delivers another blow to public confidence in the peer review process.  Which is sad; my sense is that most of the time, peer review works just fine, and is the best thing around for winnowing out spurious results.  For the best academic journals -- Nature and Science come to mind -- the likelihood of a hoax paper getting past review, or someone unqualified (or even a different species) sneaking his/her way onto an editorial board is slim to none.

I get why Daube did what he did.  He was trying to point a finger (or paw, as the case may be) at predatory journals that will publish damn near anything if you pay them, and for which the review board is simply a list of names of random people.  But right now -- with a government administration here in the United States that is making a practice of ignoring and/or casting doubt on legitimate scientific research -- the last thing we need is something to make academics look like a bunch of gullible nimrods.

Which, of course, isn't Daube's fault; it's the fault of journals like the Global Journal of Addiction & Rehabilitation Medicine.  Daube is simply acting as a whistleblower, assisted by his faithful hound.  Even so, I still couldn't help but wince when I read this.  I can just hear the next salvo from people like Senator James "Snowball" Inhofe: "Why the hell should we listen to scientists?  Their research gets reviewed by dogs."

So it'll be interesting to see where this goes.  As of the writing of this post, Olivia is still listed as an editor and reviewer for seven journals, further reinforcing my sense that the journals in question don't give a damn who is on their review staff.  As far as Olivia goes, I hope that she's getting well rewarded for her service to the academic world.  Maybe Daube can list her as a graduate student, and have her doggie biscuits paid for by his research grants.

Wednesday, May 31, 2017

The fact of the matter

A couple of days ago I made the mistake of participating in that most fruitless of endeavors: an online argument with a total stranger.

It started when a friend of mine posted the question of whether the following quote was really in Hillary Clinton's book, It Takes a Village:

[image of the alleged quote]
It isn't, of course, and a quick search was enough to turn up the page on Snopes that debunks the claim.  I posted the link, and my friend responded with a quick thanks and a comment that she was glad to have the straight scoop so that she wasn't perpetuating a falsehood.  And that should have been that.

And it would have been if some guy hadn't commented, "Don't trust Snopes!!!"  A little voice in the back of my head said, "Don't take the bait...", but a much louder one said, "Oh, for fuck's sake."  So I responded, "Come on.  Snopes is one of the most accurate fact-checking sites around.  It's been cross-checked by independent non-partisan analysts, and it's pretty close to 100% correct."

The guy responded, "No, it's not!"

You'd think at this point I'd have figured out that I was talking to someone who learned his debate skills in Monty Python's Argument Clinic, but I am nothing if not persistent.  I found the analysis I had referred to in my previous comment, and posted a clip from a summary of it on the site Skeptical Science:
Jan Harold Brunvand, a folklorist who has written a number of books on urban legends and modern folklore, considered the site so comprehensive in 2004 as to obviate launching one of his own.[10] 
David Mikkelson, the creator of the site, has said that the site receives more complaints of liberal bias than conservative bias,[23] but insists that the same debunking standards are applied to all political urban legends.  In 2012, FactCheck.org reviewed a sample of Snopes’ responses to political rumors regarding George W. Bush, Sarah Palin, and Barack Obama, and found them to be free from bias in all cases.  FactCheck noted that Barbara Mikkelson was a Canadian citizen (and thus unable to vote in US elections) and David Mikkelson was an independent who was once registered as a Republican.  “You’d be hard-pressed to find two more apolitical people,” David Mikkelson told them.[23][24]  In 2012, The Florida Times-Union reported that About.com‘s urban legends researcher found a “consistent effort to provide even-handed analyses” and that Snopes’ cited sources and numerous reputable analyses of its content confirm its accuracy.[25]
And he responded, "I disagree with you, but I respect your right to your opinion."

At that point, I gave up.

But I kept thinking about the exchange, particularly his use of the word "opinion."  It's an odd way to define the term, isn't it?  It's an opinion that I think single-malt scotch tastes good with dark chocolate.  It's an opinion that I detest the song "Stayin' Alive."

But whether Snopes is accurate or not is not an opinion.  It is either true, or it is not.  It's a little like the "flat Earth" thing.  If you believe, despite the overwhelming evidence, that the Earth is anything but an oblate spheroid, that is not "your opinion."

You are simply "wrong."

Now, I hasten to add that I don't think all of my own beliefs are necessarily correct.  After all, I haven't cross-checked Snopes myself, so I'm relying on the expertise of Brunvand et al. and trusting that they did their job correctly.  To the best of my knowledge, Snopes is accurate; and if anyone wants me to think otherwise, they need to do more than say "No, it isn't" every time I open my mouth.

But to call something like that an "opinion" implies that we all have our own sets of facts, even though many of them contradict each other, with the result that we all do what writer Kathryn Schulz calls "walking around in our little bubbles of being right about everything."  It's a little frightening how deep this mindset goes -- up to and including Donald Trump's shrieking "Fake news!" every time he hears something about him or his administration that he doesn't like.

I can understand wanting reality to be a different way than it is.  Hell, I'd rather teach Defense Against the Dark Arts at Hogwarts than biology in a public high school.  But wishin' don't make it so, as my grandma used to say, and once you grow up you need to face facts and admit it when you're wrong.  And, most importantly, recognize that the evidence won't always line up with your desires.  As John Adams put it, "Facts are stubborn things; and whatever may be our wishes, our inclinations, or the dictates of our passion, they cannot alter the state of facts and evidence."

Tuesday, May 30, 2017

One language to rule them all

The aphorism "No matter what you know, there's always more to learn" is something you'd be likely to see on one of those cheesy "motivational posters" that cheery type-A personalities like to pin up on office walls, but there's a lot of truth to it.  I rather prefer the formulation credited to Socrates -- "The more I know, the more I realize how little I know."

I ran into a fun example of this principle yesterday, when a member of the online linguistic geekery group Our Bastard Language posted an article from The Public Domain Review called "Trüth, Beaüty, and Volapük," about a constructed language (or "conlang," in the lingo of the field) called Volapük that I had never heard of before.

My M.A. is in linguistics, but my field of study was historical/reconstructive linguistics (my thesis was about the effects of the Viking invasions on Old English and Old Gaelic, and should have won some kind of award for research that has absolutely no practical application).  But even though conlangs aren't my specialty, I've always had a fascination with them.  There are a remarkable number out there, from the familiar (Esperanto, Klingon, Elvish) to the obscure but fascinating (such as John Quijada's Ithkuil, which attempts to express concepts in a combinatory way from the smallest possible number of root words).

A sample of Tolkien's lovely Elvish script [image courtesy of the Wikimedia Commons]

But despite my interest in conlangs, I had never run across Volapük, which is strange because next to Esperanto, it's apparently one of the most studied constructed languages ever created.  It was the invention of a German priest named Johann Schleyer, who not only wanted to create a regularized speech that came from familiar roots (to Europeans, anyhow) and was easy to learn, but was also "beautiful sounding."  Schleyer had an inordinate fondness for umlauts, which he added because he thought that "A language without umlauts sounds monotonous, harsh, and boring."

Which reminds me of the credits in Monty Python and the Holy Grail, especially the "A mööse once bit my sïster" part.  One of Schleyer's contemporaries couldn't resist poking some gøød-natured fün at him over his umlautophilia, and published the following limerick in the Milwaukee Sentinel:
A charming young student of Grük
Once tried to acquire Volapük
But it sounded so bad
That her friends called her mad,
And she quit it in less than a wük.
To my ears, it doesn't sound bad at all, and kind of has a Scandi-Slavic lilt to it.  Here's a sample:


The author of the article in The Public Domain Review, Arika Okrent, attributes the relative failure of Volapük to its plethora of umlauts and the easier word roots of its competitor Esperanto, which currently has about two million fluent speakers (an estimated 1,000 of whom learned it as their first language).  I'm a little doubtful about that; certainly umlautiness hasn't discouraged anyone from learning Finnish.  I think it's more that the idea of a universal language is one of those high-flown ideals that won't ever catch on, because most people are resistant to giving up their native tongue in favor of an invented system of speech, however easy it is to learn.  Language is such a deep part of culture that to jettison our own mode of communication runs counter to every social instinct we have.  (Note that one of the most common things conquerors do to conquered peoples is to outlaw the speaking of the native language -- it's a sure way to deal a death blow to the culture.)

Even so, I find the whole conlang thing fascinating, and was tickled to run across one I'd never heard of.  Every so often I have students who participate in an independent study class I teach in introductory linguistics, and the final project is to invent the framework of a language -- a phonetic and phonemic structure, morphological scheme, and syntax, along with a lexicon of at least a hundred words.  They then translate a passage from English into their language.  (One of the best ones I've ever seen did a charming translation of Eric Carle's The Very Hungry Caterpillar.)

The result of this project is twofold -- students find out how hard it is to create a realistic language, and they learn a tremendous amount about the structure of our own language.  And that's just from producing a rudimentary skeleton of a language.  For people like Schleyer, who created a rich and fully functional language, it was the result of many years of devotion, hard work, and love for language itself.

So it's kind of a shame that people didn't appreciate Volapük more.  Schleyer's dream of having a language that would bring the entire world together in a common mode of communication may be as far off as ever, but even so, it's a beautiful dream.  Even if it would mean making friends with the mäjestïc ümlaüt.