Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Sunday, December 5, 2010

Accentuating the positive

Having grown up in the Deep South (as my dad used to say, any Deeper South and your hat would be floating), I'm frequently asked why I don't have more of an accent.  I think there are several answers.  First, my dad was a career Marine, and retired when I was seven, so I spent the first few years of my life moving from military base to military base, amongst people who came from all parts of the United States.  Second, although my mom was what they call "full-blooded Cajun," my dad was a complete mutt -- his father was born in Louisiana and was of French, German, Scottish, and Dutch descent, and his mother was a Scotch-Irish Yankee from southwestern Pennsylvania.  The third reason, though, I think is the most interesting: when I moved north (to Seattle) when I was 21, I got teased out of my accent.  To this day my voice can assume the south Louisiana Cajun swing in no time at all -- all I have to do is talk to one of my cousins on the phone, or better yet, go back down to visit.  It's like I never left.

To this day I still find it rather appalling that I was teased for having a southern accent, but I've found (having lived in YankeeLand USA for almost thirty years) that the perception of southern accents as being comical, or worse yet, a sign of ignorance, is common across the north.  Of course, the media is partially to blame; witness television shows like The Beverly Hillbillies, Green Acres, and Petticoat Junction, and the comic strip and Broadway show Li'l Abner -- all four of which, I must point out, were produced and written by Northerners, and all of which portray Southerners as ignorant, backwards bumpkins.  However, if that stereotype had not already existed, no one would have found them funny.  The South was already considered an uneducated backwater beforehand.

The fact that the Southern accent is considered a sign of ignorance was highlighted a few years ago with an experiment in which groups of college students were shown different video clips of a pre-recorded speech.  It turned out that the content of the speech in each clip was identical; the only thing that differed was the accent.  The students were then asked to rate the speaker on articulateness, presentation, and content, and to guess the speaker's educational level.  Across the board, the clip that featured someone speaking with a Southern accent was rated lower -- even when the experiment was performed in Georgia, and the students themselves were from the South!

I recall some years ago hearing students in the high school where I teach talking about watching some clips from Ken Burns' The Civil War, and they referred to one of the historians interviewed as "that hillbilly dude."  "That hillbilly dude" turned out to be the late Shelby Foote, a highly educated man whose expertise on the Civil War allowed him to author a number of outstanding books, both fiction and non-fiction, on the subject.  To my ears, his graceful Mississippi accent sounds cultured; to my students', it apparently sounded foolish enough that they hardly listened to what he said.

All of this is just a preface to my telling you about a study recently released by Portfolio magazine, identifying the ten brainiest cities, and the ten least brainy cities, in the United States.  (The rankings were based on the average number of years of education among adults in each city.)  While the brainiest cities were scattered about fairly randomly -- the five highest were Boulder, Colorado; Ann Arbor, Michigan; Washington DC; Durham, North Carolina; and Bridgeport, Connecticut -- the ten least brainy showed a distinct grouping.  Anyone care to guess what state hosts four of Portfolio's least-brainy cities in the United States?

California.

Interesting, no?  Furthermore, while a couple of the least-brainy cities were in Texas, none of them were in the states of the "Old South" -- Louisiana, Mississippi, Alabama, Florida, Georgia, South Carolina, North Carolina, Tennessee, and Virginia.

It's nice to know that I have a little more hard data to use when I lambaste my students for laughing when I say "y'all."

I guess it's time to revise some stereotypes, eh, Yanks?

Friday, December 3, 2010

Awoo

Yesterday I was asked by a student of mine if I'd ever heard of Florida Swamp Apes.  After a brief moment in which I wondered if he were asking about a sports team, I answered in the negative.  He brought out his cellphone, on which he had downloaded an admittedly creepy image, which I include below:



Imagine my surprise when I found out that there's a whole site devoted to this odd beast, also called the "Florida Skunk Ape" for its strong smell.  (Visit the site here.)  Considered to be the "southernmost Bigfoot species in the United States," the Florida Skunk Ape has been sighted all over southern Florida, but most commonly in the Everglades region.  (And, if you're interested, this website also offers collectible Florida Skunk Ape silver coins for sale.)

I thought I had heard of most of the cryptozoological claims from the United States, but this one was new to me.  Of course, the Sasquatch of the Pacific Northwest is so familiar by now as to elicit yawns, and many of us know of the Boggy Creek Monster of Fouke, Arkansas, which generated not one but two truly dreadful movies.  I've posted before about the Connecticut Hill Monster, which is veritably in my own back yard, and roams the wild hills of the southern Finger Lakes.  But the Skunk Ape is one I'd never heard of before, and I'm finding myself wondering how I missed it.  It did cross my mind briefly that perhaps the Skunk Ape sightings were merely elderly Bigfoots from the north who had moved to Florida when they retired, but apparently this is incorrect, as one site talks about a sighting of a "young and vigorous animal, probably an adolescent" and another refers to "Skunk Ape mating season" (May, if you're curious).  All these years as a cryptozoologist, and you still keep coming across new and outlandish stories.  But isn't that what pseudoscience is all about?

As with most of these alleged animals, the claims of sightings are numerous and varied, and the hard evidence essentially non-existent.  There are a lot of photographs, but to borrow a line from the astronomer Neil deGrasse Tyson, there probably is an "Add Bigfoot" button on Photoshop, so we shouldn't consider the photographic evidence to be evidence at all.  Also on the website is an audio clip of a Skunk Ape's howls, which to my ear sounded more like a distant dog, or possibly a guy going "Awoo."  We also have an interview with Dave Shealy, who seems to be one of the people responsible for the whole Skunk Ape phenomenon (he is the director of the Skunk Ape Research Center of Ochopee, Florida, open 7 AM to 7 PM, admission $5, which I am definitely going to visit next time I'm in Florida).  Lastly, we are informed that Skulls Unlimited, a company which sells a virtually unlimited number of skulls (thus the name), is now offering resin models of Bigfoot skulls.  One has to wonder what they cast the mold from, but in the field of cryptozoology it is sometimes best not to ask too many questions.

In any case, the Florida Skunk Ape gives us yet another line in the ledger of Extraordinary Claims Requiring Extraordinary Evidence Of Which There Seems To Be None.  Too bad, because winter's coming on up here in the Frozen North, and I'd have welcomed a reason to assemble my research team and head on down to sunny Florida, as conducting cryptozoological research in shorts and a t-shirt certainly seems more inviting than stomping around Connecticut Hill in two feet of snow, dressed in seven layers of clothing and still feeling like I'm freezing off valuable body parts.  But I'm willing to make those kinds of sacrifices to bring this sort of quality research journalism to your doorstep.  Don't thank me; the privilege of listening to the lonesome howls of the Florida Skunk Ape during mating season is thanks enough.

Wednesday, December 1, 2010

May you stay forever young

This month's issue of Nature magazine includes a paper by Dr. Ronald DePinho of the Dana-Farber Cancer Institute of Harvard University, describing how he and his team were able to reverse aging in mice.

Now, first, it must be stated that these mice were genetically engineered to have a faulty gene for telomerase, so they aged much faster than normal.  Telomerase is an enzyme that rebuilds the protective end caps (telomeres) on the chromosomes, which otherwise shorten with each cell division; the shortening of these telomeres is thought to be behind a lot of the less-popular symptoms of aging, including graying of hair, wrinkling of skin, dementia, organ degeneration, and the wearing of knee-high socks with plaid shorts and black leather shoes.  The mice, whose telomerase gene didn't work in the first place, were given injections that reactivated the gene, and the expected result -- that any further aging would slow or stop -- didn't happen.  What happened, surprising DePinho's team, was that the aging symptoms actually reversed -- the mice began to repair damaged organs, increase in fertility, and manufacture new brain cells.  Some of them even stopped insisting that total strangers look at photographs of their grandbabies.

The conventional wisdom had always been that once the damage was done to the telomeres, reactivating the gene for telomerase was unlikely to rebuild them -- potentially, it could stop further damage, but wouldn't repair the damage already done.  Now, it appears that DePinho's team has shown that this is incorrect.  And this, of course, immediately raises hopes for the development of an anti-aging therapy in humans.

I find this interesting from two standpoints.  First, I'm fascinated with genetics, and anything that further elucidates how genes work is bound to be cool.  Second, I'm 50, and am beginning to experience a few of those aging symptoms myself, and I don't like it.  I'm not wearing knee socks with shorts yet, I'm glad to say; and I still remember where my reading glasses are most of the time.  On the other hand, the laugh lines, stiff joints, and graying hair are a little troublesome.  I remember the first time I noticed the gray -- my wife and I were in Iceland, and I decided to forgo shaving for the duration of the trip, and my facial hair grew in gray.  Carol's comment was, "You look like Kenny Rogers."  The facial hair was gone within ten minutes of our arrival back home.  Kenny Rogers, indeed.

In any case, the idea of being able to maintain the vigor and appearance of youth is certainly appealing.  However, consider the implications.  Suppose they really could drastically slow the aging process, without any untoward results (and it must be said, in the interest of honesty, that one concern about telomerase reactivation is that it might increase the likelihood of cancer).  Suppose human life span were drastically lengthened -- you might live to 500 or 600 years, barring an accident.

Since growth in size, sexual maturity, emotional/intellectual maturity, and aging are all controlled by different genetic constructs, and only the last-mentioned is being altered, there's no reason to believe that the others would be affected.  Therefore, you would go through puberty at 13 or 14, reach your adult height at 17 or so, reach emotional and intellectual maturity in your early 20s -- and then, you'd simply go into stasis.  For five hundred years.  You'd stand a good chance of meeting your great-great-great-great grandchildren, and when you did, you'd look pretty much like you did when you were raising their great-great-great grandparents.  Women would probably still hit menopause in their 50s -- the whole genetic control of sex would, once again, likely be unaffected -- but men would remain fertile indefinitely.  (Think of the effect on the population, which is already huge enough as it is.)

Then, there are the cultural effects -- people would not just have one career, or two, but twelve or thirteen -- imagine doing the same job not for thirty years, but for four hundred.  (The phrase "shoot me now" comes to mind.)  Still, would you want to retire at 65 if you still felt like a twenty-year-old?  And even if you did, could the current retirement system handle paying out retirement checks to you for 450 years?  You think the Social Security System has problems now...

While all of this has the sound of science fiction, Dr. DePinho's team has taken the first steps toward making it possible.  And every time I've tried to predict the timing of breakthroughs, I've always been wrong, and the error has usually been an overestimate of how long the breakthrough would take.  (I'm the one who told my AP Biology class "adult-tissue cloning won't be possible for another ten years or more" -- one week before the news of Dolly the Cloned Sheep hit the newspapers.)  I'll be watching for further developments, but I'd say that we're not far away from being able to address the whole issue of human life span.  I just hope that when it becomes possible, we are careful to consider the implications -- but given our track record of thinking things through, that may be a forlorn hope.

Tuesday, November 30, 2010

iPads and the war on morality

Once again, conservative columnist Brent Bozell is after (1) the high tech industry, and (2) the purveyors of popular media.

Anyone who reads Bozell's column regularly is probably wondering why this is even deserving of mention, as it seems to be about all he ever talks about.  If you simply wrote out the phrase, "The Internet and the entertainment industry are destroying the morals of America's youth!" and read it every week, it would save you reading his column, which takes valuable minutes of your life that you'll never get back.

Be that as it may, I read Bozell's column every once in a while, and this morning perused his latest screed, titled "New Gadgets, New Worries."  This article was a response to the statement by Mike Elgan of PC World magazine that Apple's iPad is going to be "the children's Toy of the Year."  Predictably, Bozell treated this statement as if it were a coded way of saying, "Parents are once again tossing their children into the maw of hell."  Elgan himself refers to the iPad as a "kid pacifier" (in situations such as long car rides), and states that inevitably kids will end up monopolizing an iPad if the parent owns one: "The path of least resistance is for the parent to get the kids an iPad of their own."  Bozell responds that this will open up another avenue of assault on children, with graphically violent or sexual video material (including not only games, but actual television shows viewable on the iPad) as the weapon.

There are a variety of grounds on which I question Bozell's arguments (and Elgan's, too, as you'll see).

First, one has to wonder if Bozell has ever heard of the term "source bias."  Of course Elgan thinks that the iPad is going to be the Toy of the Year; he works for PC World magazine, for cryin' out loud.  It's hardly likely that an article in PC World is going to claim that this year's Toy of the Year is the Frisbee.  Further, Elgan's statement about parents buying their kids iPads sounds like a lot of wishful thinking, in my opinion.  Market prices (I looked around and they seem to start at around $500) are simply more than most families can afford.

But Elgan's bias and pipe dreams are minor sins against the gods of logic as compared to Bozell's.  As usual, he paints children as helpless dupes, pawns in the entertainment industry's war on morals, and parents as even more helpless -- weak-spined jellyfish who cave at every whine our kids emit.  Well, listen up, Mr. Bozell (okay, I know he probably doesn't read my blog, but just humor me here) -- I teach morals and ethics as part of my class in Critical Thinking, and in my experience the high school students I deal with have a sophisticated, well-considered sense of right and wrong.  They may be more forgiving of transgressions that were taboo when you and I were teenagers (e.g. sex before marriage), but by and large, they are respectful of authority, understand and honor commitments, and believe that telling the truth matters.  They may watch South Park and Family Guy, but they know the difference between fiction or satire and real life, seemingly better than you do.

And another thing, Mr. Bozell; you seem to have the attitude that if a kid begs for something, parents have no other choice than to acquiesce.  Let me suggest a radical proposal: if the regular readers of your column are alarmed at the sex and violence on television, they should turn the damn thing off.  That's what I did, when my kids were little.  Actually, I went a step further.  We live in the middle of nowhere (the original Podunk is about five miles from my home -- so I don't even live in Podunk, U.S.A.; I live in the outskirts of Podunk).  This means that without a satellite dish, we have no television reception at all.  My solution:  no dish.  The question of what the kids were watching when I wasn't around became a non-issue; they could watch the TV all the time, if they for some reason enjoyed static, snow, and white noise.  We had videos and DVDs, of our choice, for them to watch on occasion.  And you know what?  My now twenty-year-old thanked me a while back for limiting their access to television when they were young -- for having the guts to make that choice, for not letting the television become a babysitter, for not exposing them to the rampant commercialism of public media, which to me is more of an issue than the sexuality and violence.

The bottom line is: kids are smarter and more moral than you think, and parents not quite the hapless dimwits you claim they are.  Give credit to someone other than yourself for some brains and ethical standards, and for the love o' Pete, find a new topic to blather on about.  This one's getting old.

Friday, November 26, 2010

Black Friday blues

Today is Black Friday, the day on which I will not go within ten miles of the nearest mall or department store.

Please understand that I mean no disrespect to people who love shopping.  Everyone has their hobbies, and I wouldn't expect others necessarily to participate in, or even understand, mine.  Take birdwatching, for example.  I zoomed out the door just before 8 AM on Thanksgiving Day, drove almost 30 miles, and stood on the lake shore in the freezing wind clutching my binoculars, because there'd been a report of a King Eider (a rare species of duck) at Myer's Point on Cayuga Lake.  Me and two other equally insane birdwatchers shivered in the cold for a half hour, scanning all of the hundreds of ducks bobbing out there in the lake, and finally, after all that work and discomfort... we didn't see the bird.

And, oddly, none of us felt like we'd wasted our time.  "Ha ha, these things happen, if you're a birdwatcher," was our basic response, and I've no doubt if the King Eider suddenly reappears, all three of us will rush right back without a second thought.

So people, in the throes of a pastime, will do some pretty odd things.  Add to that the bonus of getting a good deal, money-wise, and you've got a combination that leads people to engage in behavior that under normal circumstances would be grounds for a psychiatric evaluation.

The news reports are already beginning to come in... apparently the parking lot of the Toys "R" Us in Nanuet, New York was already full by 10 PM on Thanksgiving night.  This means that these people are going to sleep in their cars, or (more likely) stand in line in the cold and dark, to be amongst the first to be able to shop.  Myself, I'd choose the King Eider over that in a heartbeat.  I might even choose a root canal.

What I find most amusing about this is how we as a society let ourselves be drawn into media-driven rituals.  I'm not even talking about Christmas and Easter and so on, because those were holidays of long standing, with religious significance and replete with traditions, long before the media got involved.  I'm thinking more of the ones that the media and corporations either created (e.g. Black Friday) or morphed so drastically from their original versions and purpose that they're virtually unrecognizable (e.g. Halloween).  And we allow ourselves to be drawn in.  We dress our kids up as Batman, Superman, the Little Mermaid, and so forth, with the traditional plastic masks with poorly-lined-up eyeholes, on October 31 because that's what the media says we should do.  As an experiment to support this, I challenge you to dress your kid up as, say, Shrek on April 17, and send him out to knock on your neighbors' door and say "Trick or Treat."  Odds are, it won't work.  Odds also are that your neighbors will begin to wonder if you yourself need to up the dosage on one or more of your prescriptions.

Once again, I'm not questioning the motives of people who participate in these activities because they think they're fun; I'm more thinking about the folks like myself who actually loathe shopping but go out on Black Friday anyhow, because "that's just what you do."  For myself, I can't imagine allowing myself to be coerced into shopping.  I can barely even tolerate grocery shopping -- my idea of the proper technique for grocery shopping is to zoom down the aisles at 45 miles per hour, knocking over small children and little old ladies, while hurling various grocery items into the cart after barely looking at them to see whether they're what I actually wanted to purchase.  Every once in a while this will mean that I buy something I really didn't intend to.  "Gerber Mashed Carrots?" Carol will ask, while putting away groceries.  "Our kids are 20 and 22 years old.  And you hate carrots."  But I consider this a small price to pay, if it allows me to beat my previous record time for completing my shopping list.

In any case, if you love shopping and deals and Black Friday, I hope you enjoy yourself.  Me, I'm sticking close to home today.  Unless that King Eider comes back.

Thursday, November 25, 2010

Some call me the pumpkin of love

So here I sit, in a food-induced coma, and I was reading the news (being that that's about the most active thing I could consider doing at the moment).  And lo, I ran across a story in which researchers have found that men consider the smell of pumpkins sexually arousing.

I am not making this up, and if you don't believe me, go here.  Apparently, Dr. Alan Hirsch of the Smell & Taste Treatment and Research Foundation in Chicago decided to do a study to, and I quote, "investigate the impact of ambient olfactory stimuli upon sexual response in the human male."  And after much research, they found that the smell that ranked number one in the, um, ready-to-party department was... pumpkin.

Apparently, the response was especially pronounced when the pumpkin smell was combined with lavender.  And when you added the smell of doughnuts... well, it caused horniness levels that pegged the meters.  All of which made me think, "Were the guys just hungry?"

You may be wondering how they measured all of this stuff.  I know I did, so I did a little research into it.  It turned out that while the guys in the study were breathing air infused with various scents, the researchers were measuring blood flow into their naughty bits.  Blood flow increases during sexual arousal, so there you are.  And lemme tell you, that pumpkin/lavender/doughnut combination really did the trick.

My next question was, who thought of that combination?  It seems like a pretty weird trio to put together.  Did the researchers try various other combinations that didn't work so well, combining random scents until they found one that caused the test subjects to get an erection?  "Let's see... bologna/caramel/anchovy... nope.  Cinnamon/shrimp/peanut butter... nope.  Vinegar/chocolate/bacon... nope."  Until they finally happened to hit on pumpkin/lavender/doughnut, and found that one was, as it were, hard to beat.

The thing I found the funniest was that although the Triple Threat of pumpkin/lavender/doughnut worked the best, none of the scents turned guys off.  The reason I found this funny is that most guys could have told you that without lots of expensive research.  If a pretty, willing young woman wanted to get seriously amorous in, say, a sewage treatment facility, I suspect that most guys would not be dissuaded by a trivial little thing like an odor so bad that it's actually visible.  Now, the women, on the other hand... in my experience, women are thousands of times more sensitive to odors than guys are.  My wife will come home, and will immediately wrinkle her nose and say, "What in god's name is that smell?" and it will turn out that one of the cats puked up bits of dead rodent in five separate locations in the living room, and I didn't notice a thing.  Now, I'm willing to entertain the possibility that I am simply oblivious, but I know that other guy friends have corroborated my experience -- women are just more sensitive to smells.  This, I suspect, also explains why guys' locker rooms smell, by and large, like your face is wrapped in a bundle of dirty sweat socks, and nobody seems to mind it all that much.  I can't vouch for what the ladies' locker room smells like, having never been in there, but I don't think I'm going out on a limb in speculating that it's better.

In any case, I can't wait to see what's going to happen when the perfume manufacturers get a hold of this research.  We'll have a whole new line of ladies' scents, with names like "Chanel Eau de Pumpkinne."

It's also not without irony, nor is it probably a coincidence, that this article came out on Thanksgiving.  So enjoy your pumpkin pie... and its aftereffects.

Wednesday, November 24, 2010

Vaa vaa vlack sheep

We humans are laboring under the sometimes false impression that our sensory organs, and the brain integrative centers that interpret the input they provide, are reliable.  We hope that they are reliable most of the time -- after all, science itself would be a massive self-delusion if the fraction of wrong data our senses provide were much larger than 1%.  The error, of course, is in assuming that because our sensory organs and brain are reliable much of the time, they are reliable all of the time.

The most common example of sensory misinterpretation is the optical illusion.  There are a number of weird, and largely unexplained, optical illusions on this page (I use several of the ones on this site in my Brain & Senses class when we discuss the visual integrative systems in the brain).  And while these illusions are charming and fascinating, there's a lesser-known one that I want to consider today.

Called the McGurk effect, this phenomenon is a type of brain confusion that occurs when what your ears are telling you and what your eyes are telling you are at odds.  Think, for example, of how much easier it is to understand someone's speech in a crowded, noisy pub if you're looking at him while he talks.  (The same thing in part explains why it's so easy to mishear someone on the telephone, when you have no visual cues to support what you're hearing.)

What if, however, what you're hearing and what you're seeing don't line up?  Common sense might dictate that since what we're talking about is the interpretation of sounds, hearing would win -- that if your ears told you that you were hearing one phoneme, and your eyes told you you were hearing another, your brain would give more credence to your ears.

This, in fact, isn't what happens, and thus the McGurk effect.  Watch the following video (here) if you don't believe me.  In it, psychologist Dr. Lawrence Rosenblum forces your brain into a perceptual no-win situation by saying the syllable "ba" while overlapping it with a video of him saying "va."  Amazingly enough, the brain hears "va."

And the effect is quite robust -- you can't make it go away once you've understood what's going on, the way you can with many simple optical illusions.  It's instantaneous and quite unambiguous.  While watching the clip, in the segment where he's saying "ba" and we're seeing "va" over and over, I tried shutting my eyes and blocking the visual input for every other syllable.  And what I heard was... "ba va ba va ba va."  As soon as my eyes were open, my visual cortex overrode my auditory cortex, even as my prefrontal cortex was shouting at it, "Hey!  You!  You're being tricked!  Don't believe it!"

You might think that rationalists like myself would be dismayed at this, relying as we do on our brains' ability to distinguish fact from fiction, perception from illusion.  My reaction is quite the opposite.  Our brains are generally so good at picking up, sorting out, and making sense of the chaotic mishmash of sensory input we get bombarded with that the few instances that they don't work stand out.  Optical illusions, and such sensory-clash phenomena as the McGurk effect, want explanation precisely because our brains are amazingly good at detecting, interpreting, and storing information.

And that's probably why we find them so fascinating.  It's definitely why I've watched the McGurk effect video clip three times, and each time I've tried unsuccessfully to get the effect to vanish with a variety of tricks -- deliberately blurring my vision, concentrating on his forehead instead of his mouth, and so on.  It's also why I'll be paying closer attention to my friends' faces the next time I'm chatting with them in a crowded pub.