Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Monday, January 31, 2011

The sins of the fathers

Christine Weston's novel Indigo, set in pre-World War I colonial India, chronicles the coming of age of three very different characters -- Jacques de St.-Rémy, the French son of an indigo planter; Hardyal Rai, the son of a wealthy Indian lawyer; and John Macbeth, a wry, tough Englishman, son of a colonel in the British army.  Weston does a masterful job of describing the slow-motion train wreck of the British occupation of India in an even-handed fashion, presenting the native Indians neither as noble savages nor as helpless victims, and their British overlords neither as evil exploiters nor as the emissaries of civilization.  Her characters are complex, three-dimensional entities, not easily pigeonholed.

The most interesting of the three is John Macbeth, who as a young teen at the beginning of the story mistrusts all native Indians, but through his friendship with Hardyal grows past his bigoted, black-and-white view of the world.  Nevertheless, when as a young man he joins the police force, he and Hardyal end up on opposite sides of the growing revolutionary movement, and Macbeth has no choice but to do the job his superiors expect of him and support the cause of the British occupiers.  Although he remains a man of his time and context, he represents the potential we all have for doing the best we can with what we're given.

Which brings me to Haley Barbour.

For those of you who are not newshounds, Barbour is the governor of Mississippi.  He gained accolades for his handling of the Katrina tragedy, and easily won a second term.  He is now considered a front-runner for the Republican nomination for president in 2012.

The pundits are already speculating about the extent to which his past, and his state's, will weigh him down if he does make a bid for the nomination, and as a result he has been peppered with questions about the South's racist past.  His replies thus far have seemed disingenuous at best.  When asked what he remembered about the civil rights era in Mississippi, he said, "Not much."  When a reporter mentioned the riots of 1961, he said that mostly what he recalled about that year was being on Yazoo City's winning baseball team as a thirteen-year-old.

"What I remember was more Mayberry than it was Mississippi Burning," the governor said.

Predictably, he's come under fire for these comments.  Some have referred to him as a racist who is attempting to whitewash the history of a state that saw some of the bloodiest violence of the entire civil rights movement. 

Barbour has tried, with little success, to counter these perceptions.  "I went to an integrated college," he said, when asked about his earlier comments.  "I never thought twice about it."  He told a story about sitting in a literature class next to a pleasant young African-American woman who let him borrow her notes.

None of this has done much to alter the views of his critics.

My question is:  what did you want him to say?

Barbour is 63; he was born in 1947.  He was a privileged white boy in the Jim Crow South, and went to a segregated high school.  He was seventeen when the Civil Rights Act was passed, twenty-one when Martin Luther King, Jr. was assassinated.  Many young whites, growing up in that era, accepted without question the prevailing attitudes of the time -- just as we accept today's.

However, even if he once believed that whites were superior -- a claim that seems to have no particular basis in fact, by the way -- there is no indication whatsoever that he still does.  Of all of the news stories and editorials written about him and his past, no one seems to be able to find a single thing he has said or done that is explicitly (or even implicitly) racist.  That he's unwilling to engage in a dialogue about Mississippi's violent history may seem suspicious, but to me it speaks of nothing more than political expediency.  Barbour's claim "not to remember much" about the bloody protests of the civil rights era seems to be more wishful thinking than it is racism.  That all happened a long time ago; I was young then.  It's over.  Let's move on.

For the record, I don't particularly like Barbour's politics; I very much doubt I'd vote for him.  But to cast him as a racist for his reluctance to discuss something that happened fifty years ago when he was a teenager is ridiculous.  Like John Macbeth in Indigo, he is a person of his time and place -- as we all are.  But any time and place produces good people and bad, people who rise above the prejudices of their fellows and those who swallow them and sink.  It remains very much to be seen whether Barbour is one of the latter.

Sunday, January 30, 2011

Anti-science in the science classroom

In a recently released study, 13% of high school biology teachers in the United States admitted that they advocated young-earth creationism in class.

The study, conducted by Michael Berkman and Eric Plutzer of Penn State, surveyed 926 biology teachers from a variety of areas, asking about their approach to teaching evolution, how strongly they advocated for its factual basis, and their own personal views.  To quote a Minnesota teacher, "I don't teach the theory of evolution in my life science classes, nor do I teach the Big Bang Theory in my [E]arth [S]cience classes.... We do not have time to do something that is at best poor science."

It is not necessarily a coincidence that you can receive a bachelor's degree from the University of Minnesota without taking a class in evolutionary biology.  Randy Moore, a science and education specialist at UMinn, was quoted as saying, "We let that go in the name of religious freedom."  (Advocating, apparently, the "freedom to remain ignorant.")

A full 60% "don't take a stance on the subject."  Berkman and Plutzer call this the "cautious 60%" who are either conflicted about teaching a topic that they themselves have doubts about (or don't fully understand), or are anxious to avoid controversy with students and parents.

This leaves somewhat under 30% who teach the topic as valid science.

Guess which slice I fall in?

What it boils down to is, to quote Daniel Moynihan, "You are entitled to your own opinions, but you are not entitled to your own facts."  The theory of evolution is supported by a vast amount of incontrovertible evidence, from every sphere of science even vaguely related to biology.  In fact, we know far more about the mechanisms involved in evolution than we do about the mechanisms involved in gravity.  If you are a young-earth creationist, you simultaneously have to discount the following:
  • our understanding of the physics of radioactive decay, which is how rocks and fossils are dated.
  • pretty much the entire field of geology, which has demonstrated the antiquity of the earth in about a hundred different ways.
  • the field of genetics, which has provided evidence of common ancestry between (for example) birds and reptiles.
  • all of astronomy, for a multitude of reasons.
Which leaves you in a bit of a quandary as to how to explain:
  • how we developed nuclear reactor technology.
  • why Iceland is getting bigger, why California is an earthquake zone, and why there are extinct fossil animals on the top of Mount Everest.
  • why the DNA of chickens has a gene, now inactive, for producing teeth.
  • why, if the Earth is only six thousand years old, we can see things that are more than six thousand light years away.
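For the radiometric-dating point in particular, the arithmetic is simple enough to sketch.  Here's a minimal Python example -- the half-lives are standard published values, but the "measured" fractions are hypothetical, chosen purely for illustration:

```python
import math

def age_from_fraction(fraction_remaining, half_life_years):
    """Solve N/N0 = (1/2)**(t / t_half) for t, the sample's age in years."""
    return half_life_years * math.log(1.0 / fraction_remaining) / math.log(2.0)

# Carbon-14 (half-life ~5,730 years): a sample retaining half its C-14
# is exactly one half-life old.
print(round(age_from_fraction(0.5, 5730)))        # 5730

# Potassium-40 (half-life ~1.25 billion years): a hypothetical rock retaining
# 70% of its original K-40 solidified over 600 million years ago --
# vastly older than any six-thousand-year timeline allows.
print(f"{age_from_fraction(0.7, 1.25e9):.2e}")
```

A young-earth position requires rejecting not just this one formula, but the nuclear physics that makes decay rates constant in the first place.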
The objection to evolution is not based on any concerns about its being "poor science," as the unnamed Minnesotan biology teacher claimed; it is, pure and simple, a conflict between verified science and the desperate adherence of 13% of biology teachers to their views about how the world works, despite the fact that those views are themselves anti-science and have no basis whatsoever in fact.  Let's face it; there is no objection to evolutionary theory except the religious one.

Oh, and don't even start with me about how "evolution is just a theory."  They call it "music theory," and that's not because they think that music may not exist, okay?

Science is a way of knowing.  It requires the ability to draw inferences based upon facts and evidence.  If a particular hypothesis does not fit the available evidence, it must be discarded.  The religious way turns this on its head; it gives you the conclusion, and asks you to discard any evidence that doesn't fit the conclusion you've already accepted.  Much has been made of the idea that "there is no necessary conflict between science and religion," but as far as I understand it, I don't think this is possible.  They are, at their basis, mutually contradictory algorithms.

Note that by this I do not mean that you can't both trust science and believe in a deity; simply that if at some point these two different worldviews are in conflict, you have to choose one or the other.  There is no reconciling them.

And therefore, the 13% of biology teachers who advocate young-earth creationism have no business being in a science classroom.  What they are teaching is not science.  It is at its basis a non-scientific viewpoint that ignores a quantity of evidence which would be overwhelmingly convincing in any other realm.  If they cannot accept this, they should keep their views confined to the proper venue -- Sunday school.

Friday, January 28, 2011

Giving credit where credit is due

The latest Flavor-of-the-Month at the New York State Education Department is called "credit recovery."  Here's a quote from NYSED's proposal regarding this provision:

"Sometimes students may come close to passing a course and may have deficiencies only in certain clearly defined areas of knowledge and skill. In those cases, it may not be necessary for the student to retake the entire course. Instead, the student might be permitted to make up those deficiencies, master the appropriate standards, and receive credit. Of course, this should only be allowed under carefully controlled conditions to ensure that the student does receive the opportunity to learn and does meet the required standards...  In order to receive credit, the student must receive equivalent, intensive instruction in the deficiency areas of the course by a teacher certified in the subject area...  The provisions above do not require specific seat time requirements for the make-up opportunity since the opportunity must be tailored to the individual student’s need. There is precedent for allowing a reduced amount of seat time in the context of summer school."

I find this troubling.  The concept of credit recovery may be well-meaning -- although cynics, present company excepted of course, make a plausible case that the only impetus for this provision was to boost graduation rates.  But however well-intentioned the policy is, its implementation presents considerable problems.

Consider, for example, a student who is failing my Regents (Introductory) Biology class.  Let's say that this student has reached April, currently has an average of 31%, and suddenly has the realization that he's headed toward failing for the year.  Under the provisions of credit recovery, I could be required to give him an opportunity to make up the work that he'd failed, so that he'd have a chance of passing for the year.

While the provision as drafted by NYSED states that he should receive "equivalent, intensive instruction," practically speaking, there is no way to do that.  The school district has neither the funds nor the facilities to hire another teacher to go back and reteach this kid; the duties would necessarily fall on me, as the subject teacher.  During what time would I do this?  I already teach a full schedule - in fact, in my case, I am a section over the contractual limit.  Further, could this provision require that I put together activities that he'd missed, failed, or simply not turned in - including labs?  Lab activities almost always require the preparation of chemicals, equipment, and supplies, which would all have to be redone for the sake of a single kid.  I believe that under this provision, teachers could be required to do exactly that.

Of course, in practice, that's not what would happen.  Besides labs, what about activities that can't be replicated, such as in-class discussions, group activities, and so on?  Between the time constraints and the simple impossibility of recreating a curriculum, sometimes months after it was initially presented, teachers will inevitably be forced into developing worksheets, problem sets, and other "seat work."  In other words -- whether or not we feel it's justified or even educationally sound -- we'll be in the situation of being coerced by state mandate to provide inferior delivery of instruction just so students can receive credit.

Lest you think that this is just a case of yours truly being a hysterical alarmist, there are places in the state where this is already being done, and it's playing out exactly this way.  A teacher at Jamaica High School (New York City School District) quipped, "You shouldn't drive by our school with your window rolled down, because someone will toss a diploma in."  Students there were being awarded credit for an entire course they'd failed by showing up for nine hours, total, during winter and spring break.

It's a case where everyone loses; the school districts' feet are being held to the fire by NYSED to develop some kind of policy, but at the same time they have no money to hire additional staff, and the current staff are already stretched to the limit.  The kids figure out very quickly how to game the system -- you can take a whole year off, fail a course, and then get credit for putting in nine hours of busy work the following year.  Tell me that won't be taken advantage of.

Myself, I have a philosophical problem with this, and one that goes deeper than the practical issue of how to implement the policy fairly.  My feeling is that there's nothing inherently wrong with failing at something; it's a sign that you need to get your ass in gear and work harder the next time.  If you're learning to ride a horse, and you fall off, the only thing that can fix your problem is getting back up on the horse and figuring out what you did that led to your falling off, and making sure you don't do that again.  What credit recovery does is a little like your trainer saying, "Oh, you fell off?  Well, no problem.  Get up on this merry-go-round horse for a few minutes, and we'll all pretend that you can ride."

Thursday, January 27, 2011

The circle game

Yesterday, the news carried reports of an eye-opening first-ever event, an occurrence which some were attributing to supernatural forces, others calling a Sign of the End Times, and most scientists dismissing as a hoax:  Ann Coulter said something nice about a Democrat.

No, not really, I wouldn't expect anyone to believe that that had happened.  What actually occurred is that the country of Indonesia had its first recorded crop circle.  And, predictably, the alien-invasion crowd immediately converged upon the spot, claiming that this was conclusive proof at last.

Villagers in Sleman, Yogyakarta, woke last Sunday morning to find that a rice field had been adorned with a pattern of circles and triangles seventy meters in diameter.  The stalks were flattened in the "combed-down" fashion typical of earlier crop circles, and the symmetrical pattern soon became a magnet for gawkers.

(Check out a photograph of the circle here.)

The Jakarta Post quotes a local, Cahyo Utomo, as saying, "I think they were left by an alien spacecraft, like I saw on TV."

Well, far be it from me to contradict Mr. Utomo or his television, but it's already been demonstrated that crop circles can be made fairly quickly by a couple of guys with nothing more than a board, a spotting scope, and some rope; a couple of old English dudes, Doug Bower and Dave Chorley, even demonstrated back in 1991 how they had made a few themselves.  Shortly after that, a couple of high school kids in Hungary were actually arrested for crop damage after making one, and a guy named Matt Ridley published an article in none other than Scientific American describing how he'd made several single-handed.  You'd think that at that point, people would go, "Oh.  Humans make these.  I see now.  How silly of me to think it was aliens."

You'd be wrong.

Since Bower and Chorley confessed on the BBC back in '91, the crop circle phenomenon has exploded, and the theories about what is making them have progressively gotten wilder and wilder.  The most prosaic-minded theorists -- and this isn't saying much -- suggest that they're caused by some sort of localized, extremely symmetrical weather phenomenon.  Basically, what they describe is sort of an OCD tornado.  From there, the hypotheses sail on out into the void, and include visitations by aliens, signs left by secret societies as messages to other, even more secret societies, and (my personal favorite) attempts at communication with humans by Mother Earth herself.

My problem with all of these explanations, besides the obvious one that even writing them down makes me want to take Ockham's Razor and slit my wrists with it, is that if crop circles represent some sort of communique -- whether from aliens, the Illuminati, or Gaia -- they're a pretty obscure communique.  Some of them are quite beautiful -- in fact, I've got a photograph of a crop circle as the desktop background of my computer at school, much to the wry amusement of my students.  However, if they mean anything, it certainly isn't immediately obvious what that might be.

My general thought is, if aliens were trying to announce their presence, there are more direct ways to do it.  Landing a spaceship in Times Square, for example, would certainly do the trick.  Why a highly-developed, technological race would take the time and trouble to fly across the light years of interstellar vacuum, and then get to Earth, flatten a bunch of cornstalks, and fly away, I have no idea.

In any case, the woo-woos have been so stirred up by this incident that officials in Indonesia were prompted to take action -- the Indonesian National Atomic Energy Agency was so flooded by phone calls demanding that they investigate the site that they reluctantly sent someone out with a Geiger counter, which (surprise!) didn't register anything.  Once again, what you'd hope would be the response -- "Oh, okay, I guess it wasn't an atomic-powered alien spacecraft" -- didn't happen.  Most folks seemed to say, "Wow!  Those aliens sure are pretty tricky, to come and go and leave no traces of radiation!"

Anyhow, as with all of these events, sooner or later the hype will fade, and the woo-woos will return to their crystal-lined, pyramid-shaped houses, and all will quiet down until the next time some college kids get into a field with a board and some rope.  Maybe eventually, people will begin to see that these really are human-generated pranks, and not of paranormal origin, and will begin to take a more skeptical view of these sorts of things.  Or maybe Ann Coulter will say something nice about a Democrat.  Given a choice, I'd put my money on the latter happening first.

Wednesday, January 26, 2011

When the volcano blows

The latest from the "News That Isn't Actually News" department is:  We are all going to be killed in a massive eruption of the Yellowstone Supervolcano!  It could happen tomorrow!  Giant ash clouds!  Searing bursts of gas vaporizing the entire state of Wyoming!  We should prepare for the worst!  Or at least run about, making flailing arm gestures, writing overhyped articles and webpages, and overusing exclamation points!

For some reason, recently this non-story seems to be all over the news.  I've seen more than one reference to this geologic hotspot just in the last couple of days, usually accompanied by photos of the geysers and hot springs, or (in one case) by a photo of Yellowstone Lake, captioned, "It SEEMS peaceful... but hidden beneath its pristine beauty is a RED HOT MAGMA CHAMBER JUST WAITING TO BLOW."

Well, yeah, okay, technically I have to admit that they're correct.  The Yellowstone Supervolcano is a pretty scary place.  The last time it erupted, about 640,000 years ago, it produced about two thousand times the volume of ash that Mt. St. Helens did.  It is reasonable to find the prospect of this happening again terrifying.  The direct damage from the blast, the secondary damage from the ash cloud, and the climate changes which would ensue would be devastation on a level humanity has never seen before.  (The eruption of Mt. Tambora in Indonesia in 1815, which killed an estimated 71,000 people and led to the "Year Without a Summer," in which there were hard freezes in July across Europe and North America, would be a mere firecracker by comparison.)

However, the hysterical tone of some of these articles, which imply that we're "overdue for an eruption" of the Yellowstone Supervolcano, is completely unwarranted.  For example, one source I read stated that the ground was rising over the magma chamber at "a rate of three inches a year," that "new geysers were forming," and that this was indicative of an imminent eruption.  This is ridiculous.  The source conveniently omitted the fact that some areas over the magma chamber are actually subsiding; and in any volcanically active area, new geysers form and others cease to flow all the time, which indicates nothing beyond the movement of magma underneath -- something we already knew, because that's what "volcanically active" means.

The whole idea of "overdue for an eruption" implies that volcanoes erupt on some kind of schedule, which is nonsense.  The three known eruptions of the Yellowstone Supervolcano occurred 2.1 million, 1.3 million, and 640,000 years ago -- gaps of 800,000 and 660,000 years, respectively.  Even presuming that there was some kind of pattern, we're still 20,000 years shy of the previous gap, and 160,000 years shy of the longer one.  But, of course, a headline that says, "Massive Volcano Could Erupt Now or 160,000 Years From Now!" doesn't make people read any further.
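Since the argument here is just subtraction, it's easy to check.  A quick sketch using the eruption dates quoted above:

```python
# Known eruption dates, in years before present, as given above.
eruptions = [2_100_000, 1_300_000, 640_000]

# Gaps between successive eruptions.
gaps = [older - newer for older, newer in zip(eruptions, eruptions[1:])]
print(gaps)                                    # [800000, 660000]

# How far short of each historical gap the current 640,000-year lull falls.
print([gap - eruptions[-1] for gap in gaps])   # [160000, 20000]
```

Two intervals, of course, aren't remotely enough data to establish a "schedule" in the first place -- which is rather the point.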

And I'm not even going to go into the websites that claim that the Yellowstone Supervolcano is connected to (1) the 2012 lunacy, (2) the prophecies of the Book of Revelation, (3) conspiracy theories, or (4) all of the above.  If you Google "Yellowstone Supervolcano" you can find plenty of those sites for yourselves, but if you read them you have to promise me you'll try your best not to find them plausible.

In any case, if you have a vacation planned to Yellowstone, it's probably a bit premature to cancel it.  With apologies to Jimmy Buffett, I don't know where I'm-a-gonna go when the volcano blows, because chances are I'll be dead and gone before anyone has to worry about it.

Tuesday, January 25, 2011

Alone again, naturally

Today's London Telegraph features a story in which Dr. Howard Smith, senior astrophysicist at Harvard University, has stated that alien life is almost certainly impossible based upon the conditions on the exoplanets so far discovered.

"We have found that most other planets and solar systems are wildly different than our own," Smith was quoted as saying.  "They are very hostile to life as we know it."

So according to Smith, we're... all alone.  *cue sad music*

Hang on a second.  I think, Dr. Smith, that we may not want to resign ourselves to being Lost in Space quite yet.

We have, at last check, found about five hundred exoplanets.  Most of them are of the "hot Jupiter" variety -- large, probably gaseous planets, orbiting very close to their sun.  The reason that we've preferentially found those has nothing to do with their being likely to be more common in the universe; it's that they're easier to find, because they create a greater gravitational perturbation of the star they orbit.  Small, rocky worlds, such as the Earth, are harder to detect, although we're beginning to be quite good at that, too.  When the data from NASA's Kepler satellite is released in a few weeks, it is expected to include information about hundreds of additional, newly-discovered exoplanets.

However, there's a far bigger problem with Dr. Smith's statement than this.

I must say that I would not have expected a prominent scientist to make quite such a catastrophically faulty inference.  In order to make an inference, you're supposed to take into account a very important factor -- what your sample size is.  Dr. Smith's mistake is analogous to someone going through my house, finding that I own twelve flutes, recorders, tinwhistles, and so on, and concluding that there must be 80 billion wind instruments on the planet Earth.

"Wait," you might be saying.  "That's a bad comparison -- no one would be so foolish as to take a sample size of One Person and extrapolate it to all 6.7 billion people on Earth."

Okay, fine.  Point made.  Let's see how Dr. Smith's inference compares to that one.

There are something on the order of a hundred billion galaxies in the universe, and each of those has maybe a billion stars.  (This information is from Cornell's astrophysics FAQ website, if you're wondering about my sources.)  This means that there are potentially one hundred billion billion -- that's a one followed by twenty zeroes -- stars in the observable universe...

... and we've surveyed about a thousand of them.

So Dr. Smith's sample size is a thousand, out of a hundred billion billion.  This comes to not one out of 6.7 billion, which was the sampling fraction in my ridiculous inference about wind instruments, but one out of one hundred million billion.

There are so many stars in the universe that if we surveyed one star system every second of every day; 365 days a year; no time off for holidays; no potty breaks, for cryin' out loud -- it would still take on the order of three trillion years to finish.
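The star-count arithmetic can be sanity-checked in a few lines (using the hundred-billion-galaxies and billion-stars-per-galaxy figures quoted above; more recent estimates of stars per galaxy run considerably higher, which only strengthens the point):

```python
# Figures as quoted above: ~1e11 galaxies, ~1e9 stars apiece.
total_stars = 1e11 * 1e9      # ~1e20 stars in the observable universe
surveyed = 1e3                # stars surveyed for exoplanets so far

print(f"sampling fraction: 1 in {total_stars / surveyed:.0e}")  # 1 in 1e+17

# Surveying one star per second, around the clock, no breaks:
seconds_per_year = 365 * 24 * 3600
print(f"{total_stars / seconds_per_year:.1e} years to survey them all")
```

That last figure comes out in the trillions of years -- hundreds of times the current age of the universe.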

And Dr. Smith thinks a thousand stars is an adequate sample size to conclude that we're alone in the universe?  Oh, the pain, the pain...!

It's a little like my climbing on to my roof, and looking around, and saying, "No wombats in my back yard!  No wombats in my neighbor's yard!  In fact, no wombats anywhere to be seen!  I guess wombats don't exist."

I know it's tempting to draw conclusions quickly; patience is not a notable human trait.  However, a mark of a skeptical mind is the willingness to suspend belief and disbelief -- to be completely comfortable with saying, perhaps indefinitely, "the jury's still out on this one."  I am frequently asked by students if I "believe" in various things -- UFOs, bigfoot, ghosts, the Loch Ness Monster, god...  and my usual answer is, "I neither believe in, nor disbelieve in, anything for which I have no concrete evidence of any kind.  If you want, however, we can discuss how likely I think those things are."

All of which brings me to a comment I've made before: the world would be a far better place if people had more facts and fewer beliefs.

Monday, January 24, 2011

Even if we're just dining in the dark

Last night, Carol and I participated in an event called "Dark Dining."  The sixty participants were blindfolded, and then led in groups into a dining room, seated, and served a five-course meal.  At no time were we allowed to remove our blindfolds, and in fact at the conclusion we were led out of the room in groups back to the room where we started -- never to see the place where we ate dinner (unless at some time in the future, we cheat and go back there).

The whole idea was to heighten our sensory experience by depriving us of one of our senses -- the one that, in fact, we require the least in order to enjoy a good meal.  You might think that it would have been a messy affair, but to my knowledge there was not a single spill the whole evening.  We did look kind of silly, however, to judge by a photograph taken by one of the wait staff.

Some of my impressions:

1)  The thing I found the most disorienting was not having any idea of the physical space I was in.  My sense of hearing is quite good, and by the middle of the evening I was fairly certain that we were in a long, narrow room, but it was a very weird feeling not having any real sense of where I was in relation to the objects and other people in the room.  I did figure out that I was at the end of the table -- I suspected it fairly quickly from the pattern of sounds, but I didn't want to reach out my hand to check and accidentally gut-punch someone.

2)  My sense of taste is really quite inaccurate.  The big shocker of the night was dessert, which took me several bites to identify as chocolate.  My first sense was that it was something a little sweet and a little bitter, but I believe that one of my neighbors said "chocolate" before I had decided that was what it was.  This makes me wonder to what extent the visual sense does contribute to your sense of taste, priming the brain for what it will be experiencing.

3)  It's almost impossible to turn off the brain's determination to create visual images.  Terry and Kornelia, our dining companions across the table, were complete strangers to us, and Carol and I both found ourselves creating strong visual impressions of them despite having exactly zero hard evidence to go on.  Need I add that neither of our mental images were even close?

4)  It's really, really difficult to eat pot roast when you can't see what you're doing.  The best technique is to spear a big piece and gnaw, caveman-style, chunks off of it.  I figured, "what the hell, no one can see me but the wait staff, and they're probably used to this sort of thing."

5)  My wife has a warped sense of humor.  At several points, we were supposed to stop talking and listen to some music, and during that time the wait staff would do things like brush a hand across our shoulders, fan us with something, or walk around clinking glasses -- all to make us more aware of our other senses.  So the next time the music played, Carol touched the back of my neck, and I thought, "ah, the wait staff is up to their tricks again."  Then she stuck her finger in my ear.  I have to admit, her aim was impeccable.  Either that, or I have big ears.

All of this put me in mind of the Ganzfeld Experiments, done in the 1930s by psychologist Wolfgang Metzger.  He had his subjects blindfolded and placed in sensory deprivation, or (in one variation) staring at a field of uniform color (the "ganzfeld," German for "complete field," that gives the experiments their name).  In all cases, subjects reported heightened awareness, their electroencephalogram outputs changed, and some subjects hallucinated.  (One wonders if those subjects thought a finger was being put into their ears.)  The Ganzfeld Experiments were extended in the late 1970s by Dean Radin and Daryl Bem, who reported that subjects in sensory deprivation were capable of telepathy -- a finding that has been called seriously into question by skeptics, but remains an intriguing claim of what is supposedly "the strongest quantifiable evidence of telepathy to date."

Be that as it may, it was a fascinating evening, and one which forced me to slow down (physically and mentally) and really focus on what I was experiencing.  And the food and wine were delicious.  If you're interested in finding out more, or seeing if there's a dark dining experience near you, here's a link to the Dark Dining Project.  Give it a try -- it'll be an unforgettable evening.

Saturday, January 22, 2011

Haute wackiness

I am probably identifying myself as a philistine by saying this, but I just have to ask:

Are modern haute couture clothing designers kidding?  Or what?

I was sitting at my computer this morning, reading the news, and I happened to notice a photograph (under "This Week in Photos") of a woman who appeared to be encased in a giant pot-scrubber.  I clicked on the link, and was brought here.  (Do click through the slides, although you might want to be aware that a few of them border on the Not Safe For Work, not only because some of them involve a lot of skin showing, but because you will probably laugh out loud and attract the attention of your boss.)

My overall impressions:

"Haute couture" must (contrary to my knowledge of French) mean "clothing that no one in his or her right mind would ever dream of wearing in public, for fear of being arrested for (1) indecent exposure, (2) striking an innocent bystander with a protruding garment part, (3) looking completely ridiculous, or (4) all of the above."  Several of the models in the photographs look like they went to the Princess Amidala School of Design -- encase yourself in folds of starched cloth to the point that it becomes almost impossible to walk, and layer on the makeup with a mortar trowel.  Others go for the minimalist approach; one of them is clothed in a tight fitting, brightly colored knit body-sock, but makes up for it by wearing an enormous, comical-looking sombrero.  The male models, on the other hand, look like escapees from an Alternative Lifestyles Parade in San Francisco, and favor extremely tight, Speedo-style thongs that would leave most guys singing soprano.  I also noticed that many of these models look extremely sullen.  Now that I come to think of it, if I were forced to wear clothes like that, and then appear in public and have my photograph taken, I'd look sullen, too.

Then, I wondered:  how much does this clothing cost?  So I did some research, and I found out that the average haute couture outfit costs $20,000.  That's right; it will set you back twenty grand, or more, to look either like an alien hooker or a Village People wannabe.  With apologies to Billy Joel: you can't dress wacky till you spend a lot of money.

I have to wonder, country-boy uncultured hillbilly that I am: is this all some kind of massive joke?  I wonder if Christian Dior and all of the other haute couture designers sit around with their design committees late at night, swigging Absolut straight from the bottle and saying, "Hey!  I know!  We could make a dress out of an old refrigerator carton!  Just cut a hole in the top for her head, and two holes in the sides for her arms!  She could wear an orange traffic cone on her head!  Let's charge $30,000 for that one!"  And then they all laugh like goons.

It may well be that I'm missing something here.  I'm as much of a connoisseur of the female form as the next red-blooded male, so it leaves me a little mystified when I look at a shapely woman strutting her stuff and my only reaction is, "Huh?"  It could be that you have to be at a certain level of sophistication, of savoir faire, to appreciate this sort of thing.

But no matter how hard I try, I can't imagine finding giant pot scrubbers and sombreros sexy.

Friday, January 21, 2011

Your lying eyes

"I'll believe it if I see it with my own eyes."

How many times have you heard someone say that?  The implication, of course, is that if you see it (or hear it) yourself, you can't be fooled.  What your senses tell you, and how your brain interprets those inputs, are supposedly pretty reliable.

Enter Kokichi Sugihara of the Meiji Institute for Advanced Study of Mathematical Sciences, who is the master of creating illusions that do things your eyes and brain say are impossible -- and all with no trickery, no CGI, using only cardboard, glue, and other ordinary items.  Take a look at this video, in which marbles seem to roll uphill -- until he turns his little structure around and shows you that it's a trick of perspective.  (For those of you who usually aren't inclined to check out links in posts, this one and the others in this post are a must-see.)

The thing that makes me watch that clip over and over is how absolutely convincing it is, even when you know what's going on.  Something happens in your brain when you see his little cardboard channels and platforms from one angle that makes it impossible to interpret the scene any other way than that the marbles are defying gravity.  "Stop it," I tell myself.  "First, you know that the Law of Gravity is strictly enforced in most jurisdictions, and second, you know how he did this!"  But my brain stubbornly refuses to cooperate, preferring instead its impossible explanation of anti-gravity.

For more of Sugihara's fantastic structures, go here and here -- I find the second of these so brain-bending that it almost makes me a little seasick. 

All of this vividly illustrates a point I've made before: our sensory organs and brain are easily fooled.  Just as in my earlier post regarding visual/auditory conflict and the McGurk effect, there are times when our brains can't handle the sensory input they're being given, and amazingly, the brain's response is to admit defeat immediately and say, "Okay, then, I guess the world doesn't work the way I thought it did."  Given how easily the brain can be tricked into giving up something it's always been sure of -- gravitation, or in the case of the last video, the properties of structures lying in a plane -- is it any wonder that skeptical people disbelieve eyewitness testimony of the paranormal?

"It was a UFO!" someone says.  "I saw it!"  Or, "I saw the ghost come into the room and float across the floor and finally disappear through the wall."  Well, as Sugihara shows, I might believe that you saw something.  But whether your brain was correctly interpreting what your eyes detected is another matter entirely.  So don't get grumpy with me if I ask for hard evidence of your UFO or ghost.  It's just too simple to trick the human brain -- and scientific measuring devices are a heck of a lot less easy to fool.

Thursday, January 20, 2011

The meaning of "bark bark bark bark"

An article in the Seattle Times last week tells the story of Chaser, the dog who has a vocabulary of over a thousand words.

Chaser belongs to John W. Pilley, a psychology teacher at Wofford College.  Pilley had read an article in Science about Rico, a dog who had been taught to recognize the names of two hundred different objects, and he set out to better that.  Working with Chaser four or five hours a day, Pilley showed Chaser objects, stating their names, then hiding them, then doing it again, up to forty times, with treats and other reinforcements for identifying things correctly.  He tried to add two new words a day, and also spent time reviewing ones learned earlier.

According to Pilley, Chaser "loves her drills."  She demands the four or five hours of work, gets fretful if she doesn't get it, and sometimes, Pilley says, he "has to go to bed to get away from her."

At this point, it will come as no surprise to you dog owners that Chaser is a border collie.

Border collies are not, in my opinion, dogs.  They are doglike entities created by aliens from the planet Neurotica-6, which were then put on earth to infiltrate the ranks of real dogs and learn to emulate their ways.  This effort has been only partially successful.  I say this because I own a border collie, Doolin, who is the single oddest animal I've ever owned, and she's had some stiff competition in that regard.  Doolin learned how to unlatch our fence gates by watching us do it.  She has no concept of the word "play;" when she chases a frisbee and brings it back, you can tell that what she's thinking is, "Didn't I do a good job retrieving this frisbee?  I did notice, however, that I was 5.8 milliseconds shy of my previous record time.  Next time, I will beat my old record!  You'll see!"  Thus the joke:

Q:  How many border collies does it take to change a light bulb?
A:  Only one!  And then he will rewire the electrical system to bring it up to code!

Contrast this to my other dog, Grendel, who is a mutt to the extent that he looks like the result of someone putting random body parts from about seven different dog breeds together with superglue.  All Grendel thinks about is playing, food, and sleeping.  To say that he and Doolin don't understand each other is a vast understatement.  Mostly when they interact, they seem to regard each other with mild puzzlement.  Grendel seems to be thinking, "It looks like a dog, and smells like a dog.  But it never wants to play.  Oh, well, I do!  Where's my rope toy?"  Doolin, on the other hand, thinks, "Dear lord, that's a funny-looking sheep.  No matter, I can still herd it.  There's a job to be done here, and I'm the one to do it!"

In any case, back to Chaser.  After teaching Chaser over a thousand nouns, Pilley went on to teaching her some verbs -- touch, paw, fetch, nose, and so on.  And even more amazingly, Chaser understands categories; each of her frisbees (for example) has its own name, and she knows them all, but given the command "fetch a frisbee" she will pick out any one of them.  Now, Pilley is working on trying to teach Chaser syntax -- the idea that changing the order of the words changes the meaning of the command; that "touch the red frisbee, then fetch the green ball" means something different than "fetch the green ball, then touch the red frisbee."

I find this absolutely fascinating.  I wonder if what is going on in Chaser's brain is the same as what happens when children learn words -- i.e. if there are analogous language-learning centers in dogs' brains and in humans'.  I wonder, too, what the limit of her understanding is -- if she could be taught to understand conditionals, for example -- "if I touch the red toy, you touch the blue one; if I touch the blue toy, you touch the green one."  (I know some students who still haven't mastered simple conditionals such as, "If you don't turn in your homework, you will get a bad grade.")  Lastly, I wonder if this is a skill unique to border collies, or if other dog breeds, or other animal species, might have the same skill.  I doubt seriously whether Grendel's vocabulary, for example, could ever be extended past "rope toy," "dinner time," and "youwannagoforawalk?"  And our cats are hopeless -- not that I don't think they have adequate brainpower, but because any time I try to train them to do something, they respond with scorn.  "Stay off the dinner table?" they seem to say, their expressions dripping sarcasm.  "Maybe for the moment.  But you have to stop watching me at some point, you know."

But keep your eye on Chaser.  She's going places.  I'm sure she'll be making the rounds of the talk shows, and after that, the next logical step is the political arena.  Wouldn't you like to see Sarah Palin and Chaser debate the merits of Universal Veterinary Health Care?  I know I would.

Wednesday, January 19, 2011

Six degrees of cousinhood

Allegedly we're connected to anyone in the world through six degrees of separation.  That contention usually uses the criterion of "knowing someone" as the connector.  But what about being actually blood-related to the people we bump into?

I've had three instances of finding I'm related to someone to whom, by all odds, I shouldn't have any particular connection.  The most recent, and to my mind most amazing, example came to light on the way to a gig in Rochester with my band Crooked Sixpence; Kathy, the fiddler, was carpooling with me and our friend Pamela, who calls many of the dances we play for.

We were just chatting idly when the subject of family came up.  Kathy, who was born and raised in southeastern England, was telling us that while she seems thoroughly British, actually one side of her family were French Jews who came over to England from Alsace in the late 1800s.

"That's interesting," I said.  "My family is mostly Cajun French, and they were Catholics who came over to Nova Scotia from France in the 17th century; but one branch of my family were part of a small Jewish group in Donaldsonville, Louisiana -- and my forebears on that side of the family came over from Alsace in the 1800s, too.  What was your Jewish ancestor's last name?"

"It's a pretty odd name," Kathy said.  "Godchaux."

Well, my jaw dropped; my great uncle, Lehmann Meyer, married a Godchaux.  This spurred Pamela to announce at the dance that Kathy and I were cousins, which got a good laugh because we look nothing alike, and there it rested.

Well, a couple of days ago, I decided to do a little digging, and found that someone had posted online records of a Godchaux family that had gone from Alsace to England.  I sent the link to Kathy, and she responded, "Je suis GOBSMACKED!  C'est ma famille!"  In fact, the records included the names of her grandmother and grandfather!  Intrigued, I started to examine the information on the link I'd sent more carefully.

Not only did Kathy's family and mine emigrate from the same area of France at the same time; the names of her ancestors and of the folks they married overlap with mine in not one or two but eight family names.  On the Meyer branch of my family there are (by blood and by marriage) the names Bloch, Godchaux, Levy, Solomon, Kahn, Weill, and Dreyfus.  Amongst Kathy's Godchauxs are... Meyer, Bloch, Levy, Solomon, Kahn, Weill, and Dreyfus.

I still haven't found our common ancestor, but if we're not cousins, I'll be astonished.

What's the likelihood?  A French guy from Louisiana and an Englishwoman from near London, and we're probably cousins within six generations or so.  And, as I said, this isn't the first time this has happened to me: a former student, born and bred here in upstate New York, turned out to be a third cousin once removed; and a woman who sat near us at Cornell hockey games for years, a third cousin.  In each case, the link turned up through casual conversation.

I know we're all related -- it's become almost a cliché.  But what has left me, like Kathy, gobsmacked is that we may be more closely related to some of our friends and casual acquaintances than we would have dreamed.  I wonder: if we're connected to everyone in the world by six degrees of separation, what is the average degree of cousinhood we share with those around us?  It's probably impossible to figure that out, given the paucity of genealogical records prior to 1800, but as my experiences show, it may well be a smaller number than any of us would have guessed.
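Out of curiosity, the back-of-envelope version of that question is easy to sketch.  Going back n generations you have 2^n ancestral "slots," so two people whose families both drew on the same smallish community rack up overlap chances surprisingly fast.  Here's a toy calculation -- the community size and the assumption of independent, uniformly random ancestor draws are mine, not anything from real genealogy (real pedigrees collapse because of intermarriage):

```python
# Toy model: how likely are two strangers to share an ancestor within
# n generations, if both families drew their ancestors from the same
# small community?  Community size and independence are illustrative
# assumptions, not real genealogy.

def ancestor_slots(generations: int) -> int:
    """Ancestor positions n generations back: 2 parents, 4 grandparents, ...
    Real trees contain fewer distinct people than this ("pedigree
    collapse"), so these are upper bounds."""
    return 2 ** generations

def chance_of_shared_ancestor(generations: int, community_size: int) -> float:
    """Rough probability that two people, each filling their ancestor slots
    uniformly at random from the same community, share at least one."""
    k = ancestor_slots(generations)
    p_miss = ((community_size - k) / community_size) ** k  # P(no overlap)
    return 1 - p_miss

for g in (4, 6, 8):
    print(f"{g} generations: {ancestor_slots(g):3d} slots, "
          f"P(shared ancestor) ~ {chance_of_shared_ancestor(g, 2000):.2f}")
```

With a community of two thousand (my pure guess at the size of a tight-knit Alsatian Jewish community), six generations already gives better-than-even odds of a shared ancestor -- which makes the Godchaux coincidence feel a good deal less spooky.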

Tuesday, January 18, 2011

Send in the clones

A group of scientists at Kyoto University is currently working on resurrecting the woolly mammoth.

The method is simple in principle; they will use tissue from a frozen mammoth carcass found in Siberia.  Nuclei will be removed from cells in the tissue, and those nuclei will be inserted into the egg cells of an elephant which have had their own nuclei removed.  The engineered egg cells will be inserted into the uterus of a female elephant, and if all goes well, the elephant will give birth to a baby mammoth.

Of course, in practice, it's quite a bit more difficult than that.  Differentiated tissue (i.e. just about all the tissue in an adult organism) has undergone genetic changes that have to be undone, returning the cells to totipotence -- the state of being able to re-differentiate into all the kinds of tissue the animal produces.  Put simply, if this isn't done, skin cells can only produce more skin cells, muscle cells more muscle cells, and so on.  The trick is to return the cell to the capacity it had very early in development.

It's been done before, not that it's easy; consider Dolly the Sheep, the first animal cloned from adult tissue.  Some animals have proven very difficult to clone -- to my knowledge, monkeys have never been cloned from adult tissue, although my source for that particular piece of information is two years old, and in this field things change nearly on a daily basis.  And the idea of using cloning to produce endangered (or extinct) animals has already been tried; a gaur, an endangered species of wild ox, was produced that way in 2001, but the baby lived only two days.

And this brings us to the risks.

First, there's the risk of it being a big waste of money -- and I hope my readers know me well enough by now to realize that I'm not saying this from any sort of anti-science stance.  The gaur that lived only two days died of dysentery, but some scientists believe that it was felled by the cloning process itself.  Recall that Dolly the Sheep lived only half the lifespan of a normal sheep.  Cells seem to retain a "memory" of the age of the animal they were taken from, and so if someone cloned me (heaven forfend), the baby thus produced would be normal in all respects except for two -- first, it would look like me, which is unfortunate but not fatal; and second, its cells would very likely retain the genetic memory of having been taken from a fifty-year-old, and therefore the cloned Gordon would probably die of old age by thirty or so.  So one has to wonder if a mammoth born from this process would live long enough to make it worth it.

Second, there are the vaguer fears of resurrecting extinct animals, fears which have of course only been made worse by Jurassic Park.  Many folks seem to be reacting to the mammoth-cloning project by saying, "Don't you people ever watch science fiction movies?  Inevitably, the scientists plunge right on ahead with their experiments, ignoring the people who are worried about the risks, and then next thing you know there are herds of giant, malevolent mammoths destroying Tokyo."

Well, maybe.  I think the former problem -- that it will be unsuccessful, and therefore something of a waste of time, effort, and money -- is far more likely.  Nevertheless, I think it should proceed.  There's just the coolness factor of getting to see, finally, what an animal looks like that went extinct long ago.  Myself, I'd love to see them bring back a few others -- how about the dodo?  Or the moa?  (For those of you who don't know what a moa is, picture a badass ostrich on steroids, and you have the idea.)  The Tasmanian wolf would also be near the top of my list, as would the saber-toothed tiger, although I suspect that last one went extinct long enough ago that it might be impossible to find intact cell nuclei.  All animals with high awesomeness factor, however.

I recognize that even if the cloning project succeeds, it is a long way from producing a single individual to producing enough individuals to form a self-sustaining population -- one whose birth rate can keep pace with its death rate, the threshold ecologists call the "minimum viable population."  The question comes up, of course, of where we would put a herd of mammoths once we got one -- heaven knows I don't want them around here; we have enough trouble with deer eating our gardens.  But that's a question to be resolved later.

In any case, we'll keep our eye on the team at Kyoto University, which is predicting success within five years.  Whatever happens, they are sure to learn a great deal about the cloning process from this study.  All Jurassic Park-style fears aside, it's a pretty amazing thing to attempt, and I wish them success in this mammoth undertaking.

Monday, January 17, 2011

Swamp thing redux

A friend of mine, knowing both my Louisiana origins and my passion for all things cryptozoological, sent me the following clip of a news broadcast alleging that some hunters caught a photograph of a zombie-like creature that had trashed their hunting camp in Berwick, Louisiana.  A security camera at the camp caught the image right before the camera was destroyed.  (See it here.)

Well, first I must comment upon the totally, like, you know, amazing articulateness of the, like, newscasters, both of whom like made me totally wonder how anyone would, like, you know, hire them to do the news.  Secondly, the word "Photoshop" screamed itself across my brain as soon as I saw the photograph.

Among the many problems with this photo, the most important was that the creature's eyes were glowing.  This only happens when you take a flash photograph at night -- the glow is the reflection of the flash from the tapetum lucidum, a reflective membrane behind the retina in certain animals.  Security cameras, not having flashes, wouldn't create this effect.  So unless you believe that a zombie's eyes glow from the Fire of Their Inner Evil, this one seems to be a non-starter.

Watching this clip of course started me on a veritable orgy of monster-watching, and though (predictably) I found none of them convincing, a few were worthy of honorable mention.  My favorite, in the chills department, is this one, which supposedly captures images of a "shadow creature" taken by some hikers who were using a videocamera.  The videocamera was later found, Blair-Witch-style, and "no one has ever come to claim it."  It's definitely creepy -- not to be watched at night.  However, to point out only the main problem with it: it is supposed to come from "Emerson County," where mysterious disappearances had occurred in 1957, and the camera was found only two miles from where those disappearances had taken place.  Unfortunately for the creators of this video, there is no Emerson County in any state in the United States -- surprising, I know, but I just checked the official Index of Counties, and it goes from Emanuel County, Georgia to Emery County, Utah with nary an Emerson to be found.  So the First Rule of Creepy Video Creation is:  If you expect your viewers to believe that your video is real, don't state that it was taken in a place that does not, in fact, exist.

Here's another, with a bit of a frame from "The Paranormal Report."  Besides the obvious problem with Clayton Morris' statement that the reporter who sent it in had no reason to fake it because he didn't want his name mentioned, there's another, subtler problem that makes me certain it's faked.  Watch it and see if you can figure it out.

Ready for the answer?

The creature's shadow points in the wrong direction.  Look at the shadow on the man's face; it's clear that the sun is coming from the right side of the frame (from the viewer's perspective), i.e. from behind the cameraman's right shoulder.  The creature's shadow should therefore point up and to the left (again, from the viewer's perspective).  It points up and to the right.  Unless "interdimensional creatures" block sunlight in a different fashion than we ordinary, plain-old-dimensional creatures, it's a fake.

This next one has all of the classic elements: hikers in a remote area, videotaping just for the hell of it, and accidentally capturing a clip of a bigfoot.  "Did you see that... in the clearing!" is also required dialogue to insert somewhere in there.  Even the site is well-chosen: Mt. St. Helens is supposedly the epicenter of Pacific Northwest bigfoot sightings.  (There's a lava tube on the side of the mountain called "Ape Cave," which some claim is named for the prevalence of sasquatches in the area -- but the real explanation is more prosaic.  It was named after a hiking group called the "St. Helens Apes" in the 1950s.)  I don't have any particular reason to claim that this one's a hoax, except that I tend to instinctively doubt clips like this because of the obvious likelihood of fakery; but to my eyes, it does look a little more like a guy in a monkey suit than like my personal conception of bigfoot.

Lastly, I have to give some credit to the makers of this video.  Myself, I didn't know that bigfoot wears Nike tennis shoes, but I guess even cryptozooids have to cave in to fashion trends in athletic wear.  And the tag line "The people in the car and the cameraman were never heard from again" is pure brilliance.

Friday, January 14, 2011

This is the dawning of the Age of... Capricorn?

The hottest news today, for those who believe that their personalities, destinies, and love lives are controlled by the positions of distant planets relative to arbitrary patterns of even-more-distant stars, is: you're not the astrological sign you think you are.

The ancient Greeks are the ones who are responsible for a lot of the names we use for constellations today.  They looked up into the night sky, probably after having tanked up on ouzo and retsina, and instead of seeing what most of us do -- a completely random arrangement of stars -- they saw patterns that reminded them of people, animals, and objects from their myths and folk tales.  Thus we have a vague, wandery curve of faint stars that is Draco the Dragon, a pair of bright stars that is Canis Minor the Little Dog, a crooked zigzag that is Cassiopeia the Celestial Queen, and a little group of six stars that is Waldo the Sky Wombat.

Okay, I made the last one up.  But some of them are equally weird.  There's Coma Berenices, "Berenice's Hair;" Fornax the Furnace; Volans the Flying Fish; for people who like things simple and obvious, Triangulum the Triangle; and for people in the southern hemisphere who like things simple and obvious, Triangulum Australe the Southern Triangle.

Even earlier, Babylonian astronomers had noticed that the sun and the planets seemed to trace a path against the stars, and that path is the zodiac.  The twelve zodiac constellations are the ones the sun seems to move through as the earth travels around the sun; and your sign is supposed to be the constellation in which the sun seemed to reside at the moment of your birth.

But now, astronomers with the Minnesota Planetarium Society have released a bombshell.  Because the Earth's axis precesses, the constellations of the zodiac aren't lined up the way they were during the time of the ancient Greeks.  Precession happens because the Earth wobbles like a top as it spins, and the axis of the earth traces out a circular path every 26,000 years (meaning that Polaris won't be the North Star forever).  As a result, the whole zodiac has slipped by nearly thirty degrees -- almost a full sign -- and most likely you aren't the sign you think you are.  You are the one immediately preceding it, or possibly even the one before that.
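The arithmetic behind that drift fits on the back of an envelope: one full 360-degree circuit every 26,000 years comes to about one degree every 72 years, which adds up over the two-or-so millennia since the Greek zodiac was fixed.  A quick sketch, using those round numbers (both are approximations, of course):

```python
# Back-of-envelope precession arithmetic, using round numbers.
PRECESSION_PERIOD_YEARS = 26_000    # one full wobble of Earth's axis
YEARS_SINCE_GREEKS = 2_000          # roughly, since the zodiac was fixed
DEGREES_PER_SIGN = 360 / 12         # each zodiac sign spans 30 degrees

degrees_per_year = 360 / PRECESSION_PERIOD_YEARS       # ~1 degree per 72 years
total_shift = degrees_per_year * YEARS_SINCE_GREEKS    # ~28 degrees
signs_shifted = total_shift / DEGREES_PER_SIGN         # ~0.9 of a sign

print(f"total shift: {total_shift:.1f} degrees "
      f"(about {signs_shifted:.2f} of a sign)")
```

The shift comes out to nearly one full sign's width, which is why just about everyone slides back to the sign before the one they grew up with.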

Worse news still if you're a Sagittarius; not only are you not a Sagittarius, your sign is likely to be a constellation that isn't even part of the standard zodiac.  During Greek times, the sun's path actually passed briefly through the constellation Ophiuchus, the Snake Handler, but because thirteen seemed an unpropitious number for the zodiac constellations, and also because "Ophiuchus" sounds like the scientific name of an intestinal parasite, they threw it out.  Now, however, because of the precession of the Earth, the zodiac spends a lot longer in Ophiuchus, and it's no longer possible to ignore it.  So if you were a Sagittarius, you're probably now an Ophiuchus, and might want to consider a career as a herpetologist, or at least a snake charmer.

And I guess I'm not really a Scorpio.  This is too bad.  I kind of liked being a Scorpio.  They're supposed to be deep, intense, passionate, secretive, and a little dangerous, which I always thought was cool.  Now, I guess I'm a Virgo, which means I'm weak, stubborn, and petulant.  So I've gone from being James Bond to being George Costanza.  It figures.

Of course, I console myself with the knowledge that astrology is pretty silly anyhow; one has to wonder why anyone ever found it plausible that the fact that Saturn was in Capricorn at the moment of your birth is why you like cottage cheese.  (Okay, I made that up because I don't feel like researching what it really means if Saturn is in Capricorn.  But my point stands.)  Right now, I'm mostly curious to see what the astrologers will do -- if they will revise their astrological charts to reflect the actual positions of the sun and planets relative to the stars, or if they'll keep doing what they've always done.

My money is on the latter.  I'm guessing that they'll figure that they've never worried about a minor issue like whether their predictions have any basis in reality, so why start now?

Wednesday, January 12, 2011

Tell me suttin good

What do you call a long sandwich, on a French bread roll, usually with meat, shredded lettuce, and some kind of sauce?

What name you use for that delicious creation tells you a lot about what region of the country you grew up in.  Philadelphians call 'em "hoagies."  Here in upstate New York, they're "subs," but New York City folks call 'em "heroes."  In New England, they're "grinders."  And in my home state of Louisiana -- "po' boys."

Regional accents abound in the United States, some different almost to the point of mutual incomprehensibility.  Thus the joke:

A New York City guy was on vacation, and was driving with his girlfriend through rural Maine.  Struck by a sudden romantic impulse, he pulled the car over, got out, hopped the low fence, and began to pick a bouquet of flowers from the field on the other side.

He'd not gotten very far when he noticed that he wasn't alone -- there was a bull staring at him, murder in his eye, pawing the ground.  The poor city boy looked around frantically -- he was too far from the fence to get there first if the bull charged, and no trees nearby to climb.  That was when he noticed an old farmer, leaning on the fence and watching the proceedings.

"Hey!  Mister!" the guy yells.  "That bull... is that bull safe?"

The farmer took his pipe out of his mouth, and thought for a moment.  "Oh, ayuh," the farmer said.  "He's safe."  He thought for a minute more, and then added, "Can't say the same for you, howevuh."

And now a study by Jacob Eisenstein of Carnegie Mellon University has shown that regional dialects aren't just limited to our speech -- they are developing in our tweets and texts as well.

Eisenstein and his group analyzed the words used in 380,000 tweets -- a total of 4.5 million words.  And they found that the origin of the tweet seemed to be strongly correlated with the presence of certain items of "text-speak" (the linguistic purist in me can't really call them "words").

Some weren't surprising; "yall" in the South, "yinz" in Pittsburgh, for the second person plural pronoun.  Others, however, were strange, and were evidence that text-speak is developing its own regional character, independent of the dialect of the speaker.  For example, "suttin" (for "something") was found all over New York City; "coo" or "koo" (for "cool") in California, with "koo" replacing "coo" and becoming progressively more common as you move northward through the state; and "hella" (for "very," as in "hella tired") in northern California through the Pacific Northwest.

I find this phenomenon fascinating, and also surprising.  Regional dialects in American speech developed primarily because of two things.  First, there are differences in the primary country of origin of the people who settled an area (e.g. France in southern Louisiana, Scotland and Northern Ireland in Georgia, Mississippi, and Alabama, England in Massachusetts, Connecticut, and Maine, and so on).  Second, the lack of mobility in most populations prior to 1940 or so meant that any linguistic conventions that arose were unlikely to spread very far.

Now, however, with texting, emails, Twitter, and so forth, you'd think that any spelling conventions and slang that arose would not be confined to one geographic region -- they'd spread so rapidly that either they'd catch fire and everyone would start using them, or they'd dilute out and vanish.  Apparently, this isn't the case -- Eisenstein's study indicates that even though we're communicating more quickly, and over far greater distances, than ever before, we still tend to communicate like the folks we live with.

So, to paraphrase Mark Twain, the rumors of the death of regional culture are a great exaggeration.  That even applies, apparently, to text-speak and tweets.  And given that these sorts of things are what give different parts of the USA their local color, I don't know about yinz, but I'm hella glad about that.

Tuesday, January 11, 2011

Lost among the familiar

I have this peculiar inability.  I seem to be entirely unable to form mental maps.  I can, thank heaven, follow a regular old map, but without one, I'm sort of perpetually lost.

We visited family in Northampton, Massachusetts over the holidays, a town I've been to many times.  No matter how many times we go, I don't seem to be able to figure the place out.  I was driving on our way home, and as we were winding through the streets of Northampton, I was completely relying on my wife (the woman has an internal GPS system, I swear) to get me back to I-91.

I can't even begin to estimate the number of times I've been lost.  My usual method when I'm lost is to drive in a straight line until I see something familiar, which works okay around Ithaca but would not work so well in, say, Nebraska.  And your definition of "familiar" and mine probably differ somewhat.  You'd think that the stores and so forth in Northampton would be familiar by now, and in one sense they are; in fact, they're too familiar.

All physical landmarks pretty much look the same to me.  In Northampton, there are lots of brick buildings and 19th-century wood frame houses in pretty pastel colors.  Around here, there are fields and cows and houses and silos and so forth.  It's not that nothing looks familiar; everything does.  So, in the previous paragraph, by "familiar" I mean "so weird and stand-out that it's the only landmark of its kind in this entire time zone."  I only know that I'm approaching our exit from I-88, for example, because there is this huge structure -- I think it must be a radio transceiver or something -- that has been dressed up to look like a tree.  It is about twice as tall as all the other, real organic trees in the area, so the effect is not so much "Natural" as it is "Mutant Redwood from Outer Space."  It's unmistakable, and can be seen from about ten miles away.  That is the kind of landmark I need.

So, I constantly feel like I'm lost among the familiar.  When I'm in Manhattan it's especially bad, because almost all the streets meet at perfect right angles, and everywhere there are stores and businesses and people.  And they all look alike.  I think the only two sufficiently stand-out landmarks in Manhattan are Times Square and the Public Library, but if you only have two reference points and are not even all that sure where those are, it's really not all that helpful.

There's also the problem, when I'm on foot, of never knowing which direction I'm facing.  At least when I see the mutant redwood I'm always coming at it from the same direction.  If I'm seeing Times Square, and I'm trying to find my hotel, I have to know (a) what direction I'm seeing Times Square from, (b) what direction the hotel is from Times Square, and (c) what direction I have to turn to be pointed in the direction referenced in (b).  Usually, my choice is (d), walk in a straight line in some random direction and hope the hotel magically appears.  So far, I've been lucky, but mostly that's because when I'm in an unfamiliar place, I make sure to keep Carol less than five feet away from me at all times.  Occasionally, however, Carol will let me walk a little ahead, and just watch to see what I do when I get to a street corner.  This is when things get interesting.

On one visit to Manhattan, we were returning to our hotel from a night at the theater, and I asked Carol what direction the hotel was from our current position.

"North," she said.

So when we got to the next street corner, she informed me that we needed to turn right.  So I did.  And then I said, "Now what direction is the hotel from where we are?"

She looked at me like I'd lost my mind.  "It's still north," she said.

"It can't be," I said, adopting the really annoying Patient Teacher Voice I bring out when dealing with an especially slow student.  "You said it was north before, and then we turned a corner.  It can't still be north."

Carol stared at me, open-mouthed, and finally said, "Um, Gordon?  The position of the North Pole does not change every time you go around a corner."

Oh.  Right.  I guess I knew that.

It's a little frustrating that I was seemingly born without the Directionality Brain Module, but I guess I make up for it by my extra-special Tune-Remembering Brain Module and Name-Recall Brain Module.  All in all, I can't complain.  But if I ever turn up missing, don't be surprised.  You might suggest searching in Nebraska.

Monday, January 10, 2011

Music, emotion, and sex

I discovered the Bach Mass in B Minor when I was a teenager, and vividly remember the first time I listened to it -- I put the LP record on my dad's turntable, turned the volume up to 11 (that's for you fans of This Is Spinal Tap) and lay on my back on the floor.  The work moves from dark to light, from driving rhythms to delicate sweetness, and I drowned myself in baroque counterpoint -- a wonderful way to die, I think.

Then came the bass aria, "Quoniam Tu Solus Sanctus."  It's a wandery little bit, quiet and mellow.  I almost drifted off to sleep.  And then, suddenly, the full chorus and orchestra explode into "Cum Sancto Spiritu."  I'll never forget that moment -- I felt like I had been physically lifted off the floor -- a shiver ran through my whole body.  It was one of the most visceral responses I've ever had to a piece of music.

Now, lest you think I'm some kind of classical music snob, I have to state for the record that I don't just have this kind of reaction to classical music.  Which music will send me into a state of rapture is a question I've pondered frequently, because there seems to be no particular rhyme nor reason to it.  I had similar reactions to Imogen Heap's "Aha," Collective Soul's "Shine," Iron & Wine's "Boy With a Coin," the Harlem Shakes' "Sunlight," OneRepublic's "Everybody Loves Me," Overtone's South African chant "Shosholoza," and the wild, spinning Finnish waltz "Kuivatusaluevalssi" as recorded by Childsplay.

All of which, by the way, you should immediately download from iTunes.

While I still don't understand why certain songs or pieces of music create this reaction in me, Robert Zatorre and Valorie Salimpoor of McGill University have now explained how the reaction happens.  In an article published Sunday in Nature Neuroscience, Zatorre and Salimpoor explain that what happens in the brains of music lovers when hearing favorite pieces of music is similar to what happens during sex -- there is a sudden release of the chemical dopamine.  This chemical is a neurotransmitter, and is part of what creates the rush of pleasurable sensation not only while doing the deed, but while listening to Bach -- or whatever music turns you on.  As it were.

Participants in the study underwent PET scans while listening to favorite pieces of music, and researchers found that dopamine was released in large amounts in a region of the brain called the striatum, which is part of the limbic system's pleasure-and-reward center.  Interestingly, the dopamine release started about fifteen seconds prior to a "peak moment" in the music in a part of the striatum associated with tension and anticipation, and then when the climax of the music came, there was a sudden rush of dopamine in a different part of the striatum, one connected to physical pleasure.

Myself, I don't find this surprising at all.  For me, music is all about emotion.  I can appreciate technically fine playing (or singing), but if a song or piece of music evokes no emotional reaction in me, it's not worth listening to.  In teaching music lessons, I've always tried to impress upon my students that when you can play the notes and rhythm correctly, up to the correct speed, you're halfway there; the other half is learning how to express feeling through the music.

I still find it a fascinating, and unanswered, question why certain pieces resonate with one person, and leave another completely cold.  I know that although we like the same basic musical styles, my wife and I have very different taste when it comes to specific songs, and neither of us can really put our finger on why a particular song blows us away, and another leaves us shrugging our shoulders.  I suspect that that is a question that will never be resolved -- it's as personal, and as mysterious, as one's favorite food, favorite color, or (more to the point, apparently!) what one finds sexually arousing.

There's also the question of what possible evolutionary purpose this reaction could have.  Something so powerful, and so universal, must provide some kind of evolutionary advantage, but I'm damned if I can see what it might be.

Despite the fact that there are still questions -- and in science, there always are -- at least now there's a physiological explanation of what's going on in the brain when this reaction occurs.  I find this fun and fascinating, and am glad to finally have an understanding of something I've always experienced, and always wondered about.

And now, I think I'm going to go listen to the Mass in B Minor.

Saturday, January 8, 2011

Words, words, words

In Dorothy Sayers' novel Gaudy Night, set (and written) in 1930s England, a group of Oxford University dons are the targets of threats and violence by a deranged individual.  The motive of the perpetrator (spoiler alert!) turns out to be that one of the dons had, years earlier, caught the perpetrator's spouse in academic dishonesty, and the spouse had been dismissed from his position, and ultimately committed suicide.

Near the end of the novel, the main character, Harriet Vane, experiences a great deal of conflict over the resolution of the mystery.  Which individual was really at fault?  Was it the woman who made the threats, a widow whose grief drove her to threaten those she felt were smug, ivory-tower intellectuals who cared nothing for the love and devotion of a wife for her husband?  Or was it the don who had exposed the husband's "crime" -- which was withholding evidence contrary to his thesis in an academic paper?   Is that a sin that's worth a life?

The perpetrator, when found out, snarls at the dons, "... (C)ouldn't you leave my man alone?  He told a lie about somebody who was dead and dust hundreds of years ago.  Nobody was the worse for that.  Was a dirty bit of paper more important than all our lives and happiness?  You broke him and killed him -- all for nothing."  The don whose words led to the man's dismissal, and ultimately his suicide, says, "I knew nothing of (his suicide) until now...  I had no choice in the matter.  I could not foresee the consequences... but even if I had..."  She trails off, making it clear that in her view, her words had to be spoken, that academic integrity was a mandate -- even if that stance left a human being in ruins.

It's not, really, a very happy story.  One is left feeling, at the end of the book, that the incident left only losers, no winners.

The same is true of the tragic shooting today of Rep. Gabrielle Giffords of Arizona. 

At the writing of this post, Rep. Giffords is still alive, but an innocent child and a federal judge are both dead because of the shooting.  The shooter, Jared Loughner, is clearly mentally ill, to judge by the YouTube video he had posted (now taken down) and posts on his MySpace page (now also gone).  But at the center of his rage was nothing more than words.  Words, words, words.

His video clip rails against the government, posits conspiracy theories about mind control, claims that America is a "terrorist nation."  He didn't come up with those words himself; others put them there.  Others fed him those distortions, and in his twisted, faulty logic he bought them wholesale.  Loughner himself is, of course, responsible for the shootings; but what blame lies with the ones who, whatever their motives, broadcast the ideologies he espoused?

Sarah Palin's website posts a map of vulnerable Democratic members of Congress -- and identifies them on the map with rifle crosshairs.  (See the map here.)  And she's not the only one.  How about Ann Coulter:  "It's the Christmas season, so godless liberals are citing the Bible to demand the redistribution of income by government force."  Or Pat Buchanan:  "If the left hasn't realized it yet, Obama has: liberals have lost the country.  The liberal hour is over in America and the West."  And lest you think that the inflammatory rhetoric comes only from the right, how about Ted Rall:  "Like Jon Stewart's Million Moderate March, No Labels is meant 'not to create a new party, but to forge a third way within the existing parties, one that permits debate on issues in an atmosphere of civility and mutual respect,' say organizers.  Sweet.  Because, you know, you should always be civil and respectful to people who think torture and concentration camps are A-OK."

And, of course, all of these folks want to accomplish two things: to use emotionally-charged language in order to make their own opinions sound unassailable, and to generate such a negative spin on their opponents' thinking that readers are left believing that only morons could possibly agree with them.  The most appalling thing about the coverage of the shooting of Giffords and today's other victims was the immediate volcanic eruption of posts and tweets -- half of them labeling the shooter Loughner as a Tea-Party Ultra-Right-Winger who had attacked Giffords because she was too liberal (based upon his anti-federal statements and his identification of Mein Kampf on his MySpace page as one of his favorite books), and the other half identifying him as a loony leftist who had attacked Giffords because she was too conservative (based upon his stated atheism and his identification of The Communist Manifesto as one of his favorite books).  A frighteningly small number stated the truth: that Rep. Giffords is a devoted, hard-working woman who wants only the best for her country, and her attacker is simply crazed and delusional.

I'm appalled not just because these political hacks are using this tragedy to hammer home their own views to an increasingly polarized citizenry; but because they are doing this, blind to the end results of their words, just like the Oxford don in Gaudy Night whose dedication to the nth degree of academic integrity made her blind to the human cost of her actions.  Words are tools, and they are using them with as much thought and responsibility as a five-year-old with a chainsaw.

I will end with a devout hope that Rep. Giffords and the other wounded individuals in today's attacks will pull through and eventually be healed completely of their injuries, and that the families of those who died will be able to find consolation in the outpouring of sympathy from the vast majority of Americans who still value compassion over political rhetoric.  And to the ideologues who are using this tragedy as a platform to trumpet their views, I can only say:  shut the hell up.

Friday, January 7, 2011

Her tears, like diamonds on the floor

Crying is one of the weirdest biological phenomena.  Try to think about it from a non-human perspective, as if some benevolent alien scientist came to earth to study humanity.  So picture yourself being interviewed by the scientist, as you are clearly one of the more intelligent native life-forms:

Dr. Xglork:  "So, this crying thing I've heard of.  What is 'crying' and why do you do it?"

You:  "Well, when humans get sad, they start breathing funny, in little fits and starts, and water comes from their eyes."

Myself, I think that our Dr. Xglork would be justifiably mystified at how that sort of reaction makes any sense.  "How does that make you feel better?" he'd probably ask, looking at you quizzically from seven of his twelve eyes, while making notes on a clipboard held in his tentacles.

And yet, it does, doesn't it?  I'll admit, I cry easily.  Somehow guys aren't supposed to be that way, but there's no use denying it.  I cried my way through the last third of The Return of the King, embarrassing my older son to the point that for two years after that he refused to sit next to me in the theater.  I've cried over songs, television shows, and books (I almost had to wring out my friend's copy of Marley and Me before I could return it).

And after you cry, you feel better.  You don't look better, unless you somehow find red eyes and a snotty nose sexy; but you do somehow feel more relaxed and centered.  This universal reaction led scientists to surmise that crying was doing something to the levels of chemicals in the blood, so they did a study in which volunteers were put in a variety of situations that made them cry, and were asked to collect their tears in a vial.  Some were just exposed to irritants, like onions; others were shown sad movies (I'd have needed a bucket).  Then they chemically analyzed the tears to see if there were differences.

And there were.  There were proteins present in the tears we cry when we're sad that are absent in the ones we cry because our eyes are irritated.  This implies an interesting function for crying -- ridding our blood (and therefore presumably our brain as well) of chemicals which are making us feel sad or stressed.  Crying therefore does serve an important function, as our emotional reaction afterwards would suggest.

And just a few days ago a new study became public that sheds even more light on the whole thing.  Friday's issue of the journal Science included an article by Noam Sobel of Israel's Weizmann Institute of Science.  Sobel and his team took the crying study one step further -- they wanted to find out the effects of crying not on the person who was doing the crying, but anyone nearby.

So they collected tears from female volunteers (it being difficult, according to Sobel, to get a guy to cry in a lab; maybe they should have flown me over there).  They then allowed male volunteers to smell the vials of tears, including some vials of saline solution (as a control).

The team's hypothesis -- that there was a pheromone in tears that elicited empathy in others -- turned out to be incorrect.  When shown photographs of sad or tragic events, the men who'd smelled the actual tears didn't rate them as any sadder, or their emotional reaction to them as any stronger, than the guys who'd smelled the saline solution.

The real surprise came when the guys in the study were asked to rate various women's photographs for sexual attractiveness, and they found out that the guys who'd smelled the tears rated all the photographs lower than the guys who'd smelled saline did.  And -- most amazingly -- when given a quick saliva test for testosterone levels, the guys who'd smelled the tears showed lower levels of testosterone than the control group, and when given an MRI, lower activity in the parts of the brain associated with sexual arousal.

So crying, it seems, has a chemical "not NOW, honey!" feature.  This whole thing opens up a variety of questions, however.  First, it makes you wonder how the writers of Seinfeld ever came up with the idea of "make-up sex."  Second, do male tears have a pheromone as well?  Apparently Sobel's team has now found a "good male crier" and is going to see if there's any kind of reciprocal reaction in women -- and I'll bet there is.  And third, and most important -- does this explain the phenomenon of the "chick flick?"  I'll leave that one for you to decide.

Thursday, January 6, 2011

I felt the earth move under my feet

I'll bet that you think you know what causes earthquakes.

You probably learned a lot of stuff from your Earth Science teacher in ninth grade about plates and rifts and trenches and magma and so on, and you think that an earthquake occurs when the plates are pushing against each other, and one of them slips a little.

Ha.  A lot you know.

A new study by a fellow named Patrick Regan has found that earthquakes are, in fact, caused by UFOs.

Why should you believe Patrick Regan, you might ask?  Well, to start with, he's the founder of the Northwest (England) UFO Research Society, and has written two authoritative books, UFO: The Search for Truth and The New Pagan Handbook.  (I didn't even know that there was an old pagan handbook, did you?  I always figured that in the olden days, pagans just sort of capered about naked in the woods, sacrificing goats and worshiping oak trees and so forth.  I never knew they had a handbook, although I admit that must have made it easier to figure out if they were doing it right.  "Hey, Prolix!  This is the rain ritual!  After sacrificing the goat, you're supposed to caper about the oak tree in a counterclockwise direction, not clockwise!"  "Dammit, I knew I should have looked it up in the handbook.  What does the ritual mean if you caper in a clockwise direction?"  "Let me look it up."  *brief pause*  "Well, Prolix, if you have erectile dysfunction, you're in luck!")

Anyhow, Pat Regan noted a sudden spate of UFO sightings in Cumbria, in northern England, and predicted that the Brits should be on their toes for earthquakes.  And lo, on December 21, there was a magnitude 3.5 earthquake centered in Coniston, in the Lake District.

Note, too, that this earthquake happened on the Winter Solstice.  Don't expect me to believe that's a coincidence.  Pat either.  You can read about his ideas, if I can use that word rather loosely, here. (One warning for the faint of heart, however; this web page has a very scary photo of Pat holding his UFO book, in which he looks like the scraggly, unwashed, beater-clad, wild-eyed dude you avoid sitting next to on the subway.  Don't say I didn't warn you.)

So, what do we have here?  Well, nonsense, but besides that?  What this seems to be is a guy with a fairly weak grip on reality whose hobby is collecting unsubstantiated anecdotes from credulous folks who think they've "seen something weird in the sky," and he's even cherry-picked that data (again, to use the word fairly loosely) by selecting the "UFO sightings" that occur in proximity to a measurable earthquake.  And since both measurable earthquakes and UFO sightings occur every day somewhere, they're bound to occur near each other sometimes.  Aha!  There's a correlation!  Not to mention causation!  Let's write a book about it!

On a more serious note, what bothers me about all this is not that some wacko has a theory.  Wackos always have theories; it's what wackos do.  What bothers me is when, as happened yesterday, something like this gets picked up by the popular media, and it becomes "news."  I'm sorry, Purveyors of Popular Media:  this is not news.  This is at best laughable fiction, and at worst publishing the rantings of someone who is delusional, and encouraging people who are easily duped to believe it just because they've seen it on the Yahoo! news, or the like.  It's hard enough to get people to think critically without the press printing stories like this.  I know; I teach critical thinking, and it's an uphill struggle, sometimes.

So I'd really appreciate it if the media, when it wants "odd news" or "local color" stories, would stick with cute video clips of cats who like to sit in boxes and stories about would-be suicides jumping off buildings and being saved because they landed in a pile of garbage that the city garbage collectors had neglected to take away.  Stories about woo-woos who've discovered a connection between UFOs and earthquakes just make my job harder, and it's hard enough as it is.  I thank you, and so do the ninth grade earth science teachers.

Wednesday, January 5, 2011

Curses! Foiled again!

New from the "You'll Think I'm Making This Up, But I'm Not" Department, witches in Romania are up in arms about a new law that requires them to pay income tax on their earnings.

A rewrite of the tax code has included "witch, fortuneteller, and astrologer" as professions that are recognized as generating taxable income.  Now, like any other self-employed person, the wand-and-broomstick contingent will owe taxes on the fees they charge (the tax rate is 16% in Romania).  This, as you might imagine, has caused the Witches' Association of Romania, Local Kollective (WARLocK) to flip their tall pointy hats. 

And you can bet they aren't just going to take this lying down.  They threatened serious action.  Romania's head witch, Bratara Buzea, concocted a magic potion made of cat feces and a dead dog.  Besides the obvious deterrent effect that anything made of cat feces and dead dog would have, apparently this particular potion was meant to bring evil fortune to the lawmakers who voted for the new law.

"My curses always work," Buzea is quoted as saying.  (One source stated that she "cackled" the words in a "smoky voice."  I thought that was worth throwing in there, just for the added color.)  Other witches hurled poisonous mandrake plants into the Danube River, chanting magic spells in the direction of Bucharest.

The lawmakers, of course, couldn't tolerate these kinds of threats.  You don't just aim blobs of cat crap and dead dog at a congressperson, or throw random plants into a river, and somehow get away with it.  So the elected officials who were thus threatened took immediate and direct action; they all came to work wearing purple.

Wearing purple, as we all know, wards off evil.  (It's probably how the Queen Mum lived to such a ripe old age.)  So the government officials came through the dangerous ordeal unscathed after all, which I know will come as a great relief to us all.

It's no wonder there was such a hue and cry.  Creatures of darkness have a long history in Romania, and superstitions run rampant.  Remember that this is where Dracula got his start (so the tax officials might have some competition in the bloodsucking business).  Former Romanian president Nicolae Ceausescu even had his own personal witch, not that it did him much good; he was overthrown and he and his wife executed by firing squad.  Maybe his witch ran out of mandrakes or something. 

And in the interest of fairness, it bears keeping in mind that not every Romanian witch was angered at being expected to pay her fair share.  Mihaela Minca, a witch in the town of Mogosoia, supports paying taxes.  "It means that our magic gifts are recognized," she said.  "Now I can open my own practice."

So, all in all, things seem to be settling down.  This is good.  I can imagine that it'd be hard to get anything done with cat poop, dead dogs, and plants being hurled about in the halls of government, not to mention hexes and so forth. (It would, however, make it much more interesting to watch C-Span.)

One has to wonder where all of this will end, however.  It's a slippery slope.  If "witches, fortunetellers, and astrologers" are now considered professionals, pretty soon talk-radio hosts, advertising executives, and the members of the cast of Jersey Shore will expect to be recognized as productive members of society.  After that, it's only a matter of time before Ann Coulter is granted "human being" status.

Can the death of civilization be far behind?

Monday, January 3, 2011

Four and twenty blackbirds

It sounds like something from The X-Files.

Shortly before midnight on New Year's Eve, about three thousand Red-winged Blackbirds started falling from the sky, near the town of Beebe, Arkansas.  They were apparently dead before impact; one hit a police car, and another struck a woman out for a late-night walk with her dog.

The types of things this would immediately bring to mind -- poison, for example -- make no sense here.  A poison that only affects blackbirds is ridiculous (although, to be honest, apparently a few Common Grackles were also killed; but still).  If there'd been some kind of aerial spraying of a quick-acting toxin, you'd expect that lots of other animals would also have been killed.  Once this was ruled out, other theories began to circulate -- that the birds had been awakened, and startled into flight, by fireworks, and had flown into buildings; that they had been killed by a weather-related event, such as a high altitude hailstorm; that they had been struck by lightning.

None of these seem to hold water.

The frightened-into-collision hypothesis doesn't match the scatter pattern made by the carcasses; I've seen video clips and still photos (check out a video here) and many of the birds didn't land anywhere near buildings.  There aren't any tall buildings in Beebe, anyhow; and from the apparently random way they had fallen, they look to me like they were killed while still aloft and dropped to the ground, landing wherever they happened to land.  The hailstorm and lightning-strike explanations don't line up with the fact that the birds showed no sign of external injuries; hail strikes hard enough to kill would break bones, and lightning would singe feathers.  They seem to have simply... died, suddenly, mid-flight, and plummeted to the ground.

Necropsies performed today showed that many of the birds had internal blood clots sufficient to kill them; but this is by itself only a proximate cause.  What caused the clots to form?  It's hard to imagine anything that could happen to a bird in flight that could cause internal bleeding, much less something that could happen to cause internal bleeding in three thousand birds more-or-less simultaneously.

All of this has wildlife biologists scrambling for answers, and the townsfolk of Beebe are understandably spooked.  One man, interviewed by the local news, said he's not going to let his children play outside until this is solved.  One might accuse him of overreacting -- but honestly, isn't his fear justified?  I know if I woke up one morning to find my front and back yards littered with dead birds, I'd be more than a little skeeved out.

Me, I'm wondering where this will all go.  I'd lay even odds that we'll never figure out what caused the deaths, and it will be filed amongst scientists as one of those oddball phenomena which were never adequately explained -- and will become fuel for the fire for the conspiracy theorists and the-end-of-the-world-is-nigh types.  Already there are websites claiming that this is a sign of the approaching End Times -- although I don't recall from my reading of Revelation anything about birds dying en masse.

In any case, keep your eye on the news, not to mention the sky.  I'd imagine getting beaned by a dead blackbird would smart a little.