Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Saturday, April 29, 2017

Awoo

Yesterday I was asked by one of my Critical Thinking students if I'd ever heard of Florida Swamp Apes.  After a brief moment in which I wondered if he were asking about a sports team, I answered in the negative.  He brought out his cellphone, on which he had downloaded an admittedly creepy image, which I include below:



Imagine my surprise when I found out that there's a whole site devoted to this odd beast, also called the "Florida Skunk Ape" for its strong smell.  Considered to be the "southernmost Bigfoot species in the United States," the Florida Skunk Ape has been sighted all over southern Florida, but most commonly in the Everglades region.

As with most of these alleged animals, the claims of sightings are numerous and varied, and the hard evidence essentially non-existent.  There are a lot of photographs, but to borrow a line from the astronomer Neil deGrasse Tyson, there's probably an "Add Bigfoot" button in Photoshop, so we shouldn't consider the photographic evidence to be evidence at all.  Also on the website is an audio clip of a Skunk Ape's howls, which to my ear sounded more like a distant dog, or possibly a guy going "Awoo."  We also have an interview with Dave Shealy, who seems to be one of the people responsible for the whole Skunk Ape phenomenon (he is the director of the Skunk Ape Research Center of Ochopee, Florida, open 7 AM to 7 PM, admission $5, which I am definitely going to visit next time I'm in Florida).  Lastly, we are informed that Skulls Unlimited, a company which sells a virtually unlimited number of skulls (thus the name), is now offering resin models of Bigfoot skulls.  One has to wonder what they cast the mold from, but in the field of cryptozoology it is sometimes best not to ask too many questions.

I thought I had heard of most of the cryptozoological claims from the United States, but this one was new to me.  Of course, the Sasquatch of the Pacific Northwest is so familiar by now as to elicit yawns, and many of us know of the Boggy Creek Monster of Fouke, Arkansas, which generated not one, nor two, but five truly dreadful movies.  There's Mothman and the Flatwoods Monster in West Virginia, the Dover Demon of Massachusetts, the Enfield Monster of Illinois, Goatman of Maryland, and dozens of others.  But the Skunk Ape is one I'd never heard of before, and I'm finding myself wondering how I missed it.  It did cross my mind briefly that perhaps the Skunk Ape sightings were merely elderly Bigfoots from the north who had moved to Florida when they retired, but apparently this is incorrect, as one site talks about a sighting of a "young and vigorous animal, probably an adolescent" and another refers to "Skunk Ape mating season"  (May, if you're curious; but you might want to refrain from wandering around the swamps of Florida in May, because two female European tourists tell the story of being chased by a "huge male Skunk Ape with an erection."  They got away, fortunately.)

"Not everyone who sees a Skunk Ape reports it," Dave Shealy says.  "They don't want people to poke fun at 'em, or to tell 'em they're crazy. That's not the exception; that's pretty much the rule...  There's never been a documented case of anyone ever being physically attacked by a Skunk Ape.  But also, there's a lot of people that go into the Everglades that never come out." 

Which really isn't all that reassuring.

In any case, the Florida Skunk Ape gives us yet another line in the ledger of Extraordinary Claims Requiring Extraordinary Evidence Of Which There Seems To Be None.  It's just as well, because it's the last week of April, so Skunk Ape mating season is almost upon us, and if there really was evidence that this thing exists I would feel duty-bound to go investigate, and the last thing I want is to be chased around in some godforsaken swamp by a Bigfoot with a boner.  So I think I'll give this one a pass.  

Friday, April 28, 2017

Playing on the heartstrings

I'm a pretty emotional guy, and one of the things that never fails to get me is music.  Some pieces of music grab me by the feels and swing me around, sometimes to the point of tears; others send chills up my spine.
I've always been fascinated by this capacity for music to induce emotion.  Such a response is nearly universal, although which music causes tears or that little frisson up the spine varies greatly from person to person.  Most of Mozart's music (with the exception of the Requiem and a couple of other pieces) really doesn't do much for me.  It's pleasant to listen to, but doesn't evoke much in me other than that.  I actively dislike Chopin, Brahms, and Mahler, and I know people for whom those are the absolute pinnacle of emotional depth in music.

[image courtesy of the Wikimedia Commons]

In a paper released just last week in Scientific Reports, neurophysiologists Kazuma Mori and Makoto Iwanaga of Osaka University looked into an explanation for how this phenomenon happens, if not exactly why it happens.  Their paper, "Two Types of Peak Emotional Responses to Music: The Psychophysiology of Chills and Tears," describes experiments in which they had test subjects listen to music while monitoring their reactions not only via subjective description but by such physiological criteria as skin conductivity (a common measure of stress).
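
Just to make the measurement side concrete, here's a minimal sketch in Python of comparing skin conductance around self-reported chills versus tears.  Everything in it -- the sampling rate, the analysis window, the simulated trace -- is my invention for illustration, not the paper's actual pipeline:

```python
import numpy as np

def mean_scl_change(scl, events, rate=32, window=5):
    """Mean change in skin conductance level (SCL) in the `window` seconds
    after each self-reported event, relative to the preceding baseline."""
    deltas = []
    for t in events:
        i = int(t * rate)
        baseline = scl[max(0, i - window * rate):i].mean()
        response = scl[i:i + window * rate].mean()
        deltas.append(response - baseline)
    return float(np.mean(deltas))

# Simulated five-minute SCL trace sampled at 32 Hz (in microsiemens).
rng = np.random.default_rng(0)
scl = 5 + np.cumsum(rng.normal(0, 0.005, 300 * 32))

chills_events = [60.0, 150.0]  # times (s) the listener reported chills
tears_events = [240.0]         # times (s) the listener reported tears

# The paper's prediction: physiological arousal (SCL rises) for chills,
# physiological calming (SCL flat or falling) for tears.
print("chills:", mean_scl_change(scl, chills_events))
print("tears: ", mean_scl_change(scl, tears_events))
```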

And what happened was pretty cool.  They found that (as I have done above) strongly evocative pieces of music tended to fall into two categories, ones that elicit tears and ones that elicit chills.  The authors write:
The present study investigated the psychophysiological responses of two types of peak emotions: chills and tears.  We used music as the stimuli because the chills response has been confirmed in music and emotion studies... The chills and tears responses were measured by self-report sensations during song listening.  We conducted an experiment measuring subjective emotions and autonomic nervous system activity.  The hypothesis was that tears would be different from chills in terms of both psychological and physiological responses.  With respect to psychophysiological responses, we predicted that chills would induce subjective pleasure, subjective arousal, and physiological arousal whereas tears would induce subjective pleasure, relaxation, and physiological calming.  In addition, we asked participants to rate song expression in terms of happiness, sadness, calm, and fear in order to understand the emotional property of chills-inducing songs and tear-inducing songs...  [The] results show that tears involve pleasure from sadness and that they are psychophysiologically calming; thus, psychophysiological responses permit the distinction between chills and tears.  Because tears may have a cathartic effect, the functional significance of chills and tears seems to be different.
Which supports the contention that my experience of bawling the first time I listened to Ralph Vaughan Williams's Fantasia on a Theme by Thomas Tallis served the purpose of emotional catharsis.  I know my mood was better after the last chords died out, except that I felt a little like a wrung-out dishrag; and despite the fact that I don't exactly like crying, I listen to these tear-evoking pieces of music over and over.  So there must be something there I'm seeking, and I don't think it's pure masochism.  The authors write:
The current results found that the mixed emotion of chills was simultaneous pleasure, happiness, and sadness.  This finding means that chills provide mainly a positive experience but the sadness factor is necessary even though a favourite song is the elicitor.  Given that music chills activate reward-related brain regions, such an emotional property could make chills a unique experience and separate chills from other mixed emotional experiences.  Furthermore, as the mixed emotion of tears was simultaneous pleasure and sadness, it was different from the mixed emotion of chills.  The tears response contributes to the understanding of the pleasure of sad music.  As people generally feel displeasure for sad things, this is a unique mixed emotional response with regard to music.  Although previous studies showed that sad music induced relatively weak pleasure, the current tears’ results showed that sad songs induced strong pleasure.  It is difficult to account for why people feel sad music as pleasurable; however, the current results suggested that the benefit of cathartic tears might have a key role in the pleasure generated by sad music.  Therefore, the two types of peak emotional responses may uniquely support knowledge of mixed emotion.
So that's pretty awesome, and it's nice to know that I'm not alone in my sometimes overwhelming response to music.  And now I think I'll go listen to Shostakovich's Symphony #5 and have a nice long cry.  I know I'll feel better afterwards.

Thursday, April 27, 2017

Going to the dogs

I am the proud owner of two dogs, both rescues, who are at this point basically members of the family whose contributions to the household consist of barking at the UPS guy, sleeping most of the day, getting hair all over everything, and making sure that we get our money's worth out of the carpet steamer we bought five years ago.

First, there's Lena the Wonder-Hound:


And her comical sidekick, Grendel:


Both of them are sweet and affectionate and spoiled absolutely rotten.  Lena's ancestry is pretty clear -- she's 100% hound, probably mostly Blue-tick Coonhound, Redbone, and Beagle -- but Grendel's a bit of a mystery.  Besides his square face and coloration, other significant features are: (1) a curly tail; (2) a thick undercoat; and (3) a tendency to snore.  This last has made us wonder if he has some Pug or Bulldog in his background somewhere, but that's only speculation.

This all comes up because of a recent delightful study in one of my favorite fields, cladistics.  The idea of cladistics is to create a tree of descent for groups of species based on most recent common ancestry, as discerned from overlap in DNA sequences.  And a group of researchers -- Heidi G. Parker, Dayna L. Dreger, Maud Rimbault, Brian W. Davis, Alexandra B. Mullen, Gretchen Carpintero-Ramirez, and Elaine A. Ostrander of the Comparative Genomics Branch of the National Human Genome Research Institute -- have done this for 161 breeds of dog.
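
To give a feel for how such a tree gets built, here's a toy sketch in Python.  The breed names and distances are invented, and the real study used genome-scale haplotype data and far more sophisticated methods than the simple average-linkage clustering shown here:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram
from scipy.spatial.distance import squareform

# Invented pairwise genetic distances between four hypothetical breeds
# (fraction of marker sites that differ); smaller = more closely related.
breeds = ["BreedA", "BreedB", "BreedC", "BreedD"]
dist = np.array([
    [0.00, 0.10, 0.45, 0.40],
    [0.10, 0.00, 0.42, 0.38],
    [0.45, 0.42, 0.00, 0.15],
    [0.40, 0.38, 0.15, 0.00],
])

# Average-linkage clustering (UPGMA) joins the most similar groups first,
# yielding a rough tree of most recent common ancestry.
tree = linkage(squareform(dist), method="average")
dendrogram(tree, labels=breeds, no_plot=True)  # set no_plot=False to draw it
print(tree)  # each row: the two clusters joined, merge distance, cluster size
```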

The authors write:
The cladogram of 161 breeds presented here represents the most diverse dataset of domestic dog breeds analyzed to date, displaying 23 well-supported clades of breeds representing breed types that existed before the advent of breed clubs and registries.  While the addition of more rare or niche breeds will produce a denser tree, the results here address many unanswered questions regarding the origins of breeds.  We show that many traits such as herding, coursing, and intimidating size, which are associated with specific canine occupations, have likely been developed more than once in different geographical locales during the history of modern dog.  These data also show that extensive haplotype sharing across clades is a likely indicator of recent admixture that took place in the time since the advent of breed registries, thus leading to the creation of most of the modern breeds.  However, the primary breed types were developed well before this time, indicating selection and segregation of dog populations in the absence of formal breed recognition.  Breed prototypes have been forming through selective pressures since ancient times depending on the job they were most required to perform.  A second round of hybridization and selection has been applied within the last 200 years to create the many unique combinations of traits that modern breeds display.  By combining genetic distance relationships with patterns of haplotype sharing, we can now elucidate the complex makeup of modern dog breeds and guide the search for genetic variants important to canine breed development, morphology, behavior, and disease.
Which is pretty cool.  What I found most interesting about the cladogram (which you can see for yourself if you go to the link provided above) is that breeds that are often clustered together, and known by the same common name -- such as "terrier" -- aren't necessarily closely related.  This shouldn't be a surprise, of course; all you have to do is look at the relationships between birds called "buntings" or "sparrows" or "tanagers" to realize that common names tell you diddly-squat about actual genetic distance.  But it was still surprising to find that (for example) Bull Terriers and Staffordshire Terriers are more closely related to Bulldogs and Mastiffs than they are to Scottish Terriers; that Corgis are actually related to Greyhounds; and that Schnauzers, Pugs, Pomeranians, and Schipperkes are all on the same clade.  The outgroup (most distantly related branch) of the entire tree is the peculiar Basenji, a Central African breed with a strange, yodel-like bark, a curly tail, and pointed ears, whose image has been recorded almost unchanged all the way back to the time of the ancient Egyptians.

Anyhow, it's an elegant bit of research, and sure to be of interest to any other dog owners in the studio audience.  Me, I'm wondering where Grendel fits into the cladogram.  Considering his peculiar set of traits, he might have a branch all his own, and give the Basenji a run for its money as the oddest breed out there.

Wednesday, April 26, 2017

In your right mind

Another peculiarity of the human brain is lateralization, which is the tendency of the brain to have a dominant side.  It's most clearly reflected in hand dominance; because of the cross-wiring of the brain, people who are right-handed have a tendency to be left brain dominant, and vice versa.  (There's more to it than that, as some people who are right-handed are, for example, left eye dominant, but handedness is the most familiar manifestation of brain lateralization.)

It bears mention at this juncture that the common folk wisdom that brain lateralization has an influence on your personality -- that, for instance, left brain dominant people are sequential, mathematical, and logical, and right brain dominant people are creative, artistic, and holistic -- is complete nonsense.  That myth has been around for a long while, and has been roundly debunked, but still persists for some reason.

I was first introduced to the concept of brain dominance when I was in eighth grade.  I was having some difficulty reading, and my English teacher, Mrs. Gates, told me she thought I was mixed-brain dominant -- that I didn't have a strongly lateralized brain -- and that this often leads to processing disorders like dyslexia.  (She was right, but they still don't know why that connection exists.)  It made sense.  When I was in kindergarten, I switched back and forth between writing with my right and left hand about five times until my teacher got fed up and told me to simmer down and pick one.  I picked my right hand, and have stuck with it ever since, but I still have a lot of lefty characteristics.  I tend to pick up a drinking glass with my left hand, and I'm strongly left eye dominant, for example.

Anyhow, Mrs. Gates identified my mixed-brain dominance and its effect on my reading, but she also told me that there was one thing that mixed-brain people can learn faster than anyone else.  Because of our nearly-equal control from both sides of the brain, we can do a cool thing, which Mrs. Gates taught me and I learned in fifteen seconds flat.  I can write, in cursive, forward with my right hand while I'm writing the same thing backwards with my left.  (Because it's me, they're both pretty illegible, but it's still kind of a fun party trick.)

[image courtesy of the Wikimedia Commons]

Fast forward to today.  Some recent research has begun to elucidate the evolutionary reasons behind lateralization.  It's been known for years that lots of animals are lateralized, so it stands to reason that it must confer some kind of evolutionary advantage, but what that might be was unclear... until now.

Research by a team led by Onur Güntürkün, of the Institute of Cognitive Neuroscience at Ruhr-University Bochum, in Germany, has looked at lateralization in animals from cockatoos to zebra fish to humans, and has described the possible evolutionary rationale for having a dominant side of the brain.

"What you do with your hands is a miracle of biological evolution," Güntürkün says.  "We are the master of our hands, and by funneling this training to one hemisphere of our brains, we can become more proficient at that kind of dexterity.  Natural selection likely provided an advantage that resulted in a proportion of the population -- about 10% -- favoring the opposite hand. The thing that connects the two is parallel processing, which enables us to do two things that use different parts of the brain at the same time."

Additionally, Güntürkün says, our perceptual systems have also evolved that kind of division of labor.  Both left and right brain have visual recognition centers, but in humans the one on the right side is more devoted to image recognition, and the one on the left to word and symbol recognition.  And this is apparently a very old evolutionary innovation, long predating our use of language; even pigeons have a split perceptual function between the two sides of the brain (and therefore between their eyes).  They tend to tilt their heads so their left eye is scanning the ground for food while their right one scans the sky for predators.

So what might seem to be a bad idea -- ceding more control to one side of the brain than the other, making one hand more nimble than the other -- turns out to have a distinct advantage.  And if you'll indulge me in a little bit of linguistics geekery, for good measure, even our word "dexterous" reflects this phenomenon.  "Dexter" is Latin for "right," and reflects the commonness of right-handers, who were considered to be more skillful.  (And when you find out that the Latin word for "left" is "sinister," you get a rather unfortunate lens into attitudes toward southpaws.)

Anyhow, there you have it; another interesting feature of our brain physiology explained, and one that has a lot of potential for increasing our understanding of neural development.  "Studying asymmetry can provide the most basic blueprints for how the brain is organized," Güntürkün says.  "It gives us an unprecedented window into the wiring of the early, developing brain that ultimately determines the fate of the adult brain.  Because asymmetry is not limited to human brains, a number of animal models have emerged that can help unravel both the genetic and epigenetic foundations for the phenomenon of lateralization."

Tuesday, April 25, 2017

Thanks for the memories

I've always been fascinated with memory.  From the "tip of the tongue" phenomenon, to the peculiar (and unexplained) phenomenon of déjà vu, to why some people have odd abilities (or inabilities) to remember certain types of information, to caprices of the brain such as its capacity for recalling a forgotten item once you stop thinking about it -- the way the brain handles storage and retrieval of memories is a curious and complex subject.

Two pieces of recent research have given us a window into how the brain organizes memories, and their connection to emotion.  In the first, a team at Dartmouth and Princeton Universities came up with a protocol to induce test subjects to forget certain things intentionally.  While this may seem like a counterproductive ability -- most of us struggle far harder to recall memories than to forget them deliberately -- consider the applicability of this research to debilitating conditions such as post-traumatic stress disorder.

In the study, test subjects were shown images of outdoor scenes as they studied two successive lists of words.  In one case, the test subjects were told to forget the first list once they received the second; in the other, they were instructed to try to remember both.

"Our hope was the scene images would bias the background, or contextual, thoughts that people had as they studied the words to include scene-related thoughts," said Jeremy Manning, an assistant professor of psychological and brain sciences at Dartmouth, who was lead author of the study.  "We used fMRI to track how much people were thinking of scene-related things at each moment during our experiment.  That allowed us to track, on a moment-by-moment basis, how those scene or context representations faded in and out of people's thoughts over time."

What was most interesting about the results is that when the test subjects were told to forget the first list, the brain apparently purged its memory of the specifics of the outdoor scene images as well.  When subjects were told to remember both lists, they retained the memories of the images associated with both.

"[M]emory studies are often concerned with how we remember rather than how we forget, and forgetting is typically viewed as a 'failure' in some sense, but sometimes forgetting can be beneficial, too," Manning said.  "For example, we might want to forget a traumatic event, such as soldiers with PTSD.  Or we might want to get old information 'out of our head,' so we can focus on learning new material.  Our study identified one mechanism that supports these processes."

What's even cooler is that because the study was done with subjects connected to an fMRI, the scientists were able to see what contextual forgetting looks like in terms of brain firing patterns.   "It's very difficult to specifically identify the neural representations of contextual information," Manning said.  "If you consider the context you experience something in, we're really referring to the enormously complex, seemingly random thoughts you had during that experience.  Those thoughts are presumably idiosyncratic to you as an individual, and they're also potentially unique to that specific moment.  So, tracking the neural representations of these things is extremely challenging because we only ever have one measurement of a particular context.  Therefore, you can't directly train a computer to recognize what context 'looks like' in the brain because context is a continually moving and evolving target.  In our study, we sidestepped this issue using a novel experimental manipulation -- we biased people to incorporate those scene images into the thoughts they had when they studied new words.  Since those scenes were common across people and over time, we were able to use fMRI to track the associated mental representations from moment to moment."
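
For a rough sense of what tracking those representations can look like computationally, here's a bare-bones sketch in Python with scikit-learn, run on simulated data.  The voxel counts, labels, and classifier choice are all my assumptions for illustration, not the study's actual pipeline:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_voxels = 200

# Simulated training data: brain volumes recorded while subjects viewed
# outdoor scenes (label 1) versus unrelated baseline periods (label 0).
scene_pattern = rng.normal(0, 1, n_voxels)
X_train = np.vstack([
    rng.normal(scene_pattern, 1, (50, n_voxels)),  # scene-viewing volumes
    rng.normal(0, 1, (50, n_voxels)),              # baseline volumes
])
y_train = np.array([1] * 50 + [0] * 50)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# During word study, score each new volume: the classifier's probability
# of "scene" serves as a moment-by-moment index of how strongly scene
# context is represented -- the quantity that faded after a "forget"
# instruction in the study.
X_study = rng.normal(scene_pattern * 0.5, 1, (10, n_voxels))
context_strength = clf.predict_proba(X_study)[:, 1]
print(np.round(context_strength, 2))
```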

In the second study, a team at UCLA looked at what happens when a memory is connected to an emotional state -- especially an unpleasant one.  What I find wryly amusing about this study is that the researchers chose as their source of unpleasant emotion the stress one feels in taking a difficult math class.

I chuckled grimly when I read this, because I had the experience of completely running into the wall, vis-à-vis mathematics, when I was in college.  I actually was a pretty good math student.  I breezed through high school math, barely opening a book or spending any time outside of class studying.  In fact, even my first two semesters of calculus in college, if not exactly a breeze, at least made good sense to me and resulted in solid A grades.

Then I took Calc 3.

I'm not entirely sure what happened, but when I hit three-dimensional representations of graphs, and double and triple integrals, and calculating the volume of the intersection of four different solid objects, my brain just couldn't handle it.  I got a C in Calc 3 largely because the professor didn't want to have to deal with me again.  After that, I sort of never recovered.  I had a good experience with Differential Equations (mostly because of a stupendous teacher), but the rest of my mathematical career was pretty much a flop.

And the worst part is that I still have stress dreams about math classes.  I'm back at college, and I realize that (1) I have a major exam in math that day, and (2) I have no idea how to do what I'll be tested on, and furthermore (3) I haven't attended class for weeks.  Sometimes the dream involves homework I'm supposed to turn in but don't have the first clue about how to do.

Keep in mind that this is 35 years after my last-ever math class.  And I'm still having anxiety dreams about it.


What the researchers at UCLA did was to track students who were in an advanced calculus class, keeping track of both their grades and their self-reported levels of stress surrounding the course.  Their final exam grades were recorded -- and then, two weeks after the final, they were given a retest over the same material.

The fascinating result is that stress was unrelated to students' scores on the actual final exam, but the students who reported the most stress did significantly worse on the retest.  The researchers call this "motivated forgetting" -- the brain ridding itself of memories that are associated with unpleasant emotions, perhaps in order to preserve the person's sense of being intelligent and competent.

"Students who found the course very stressful and difficult might have given in to the motivation to forget as a way to protect their identity as being good at math," said study lead author Gerardo Ramirez.  "We tend to forget unpleasant experiences and memories that threaten our self-image as a way to preserve our psychological well-being.  And 'math people' whose identity is threatened by their previous stressful course experience may actively work to forget what they learned."

So that's today's journey through the recesses of the human mind.  It's a fascinating and complex place, never failing to surprise us, and it's amazing that we are beginning to understand how it works.  As my dear friend, Professor Emeritus Rita Calvo, Cornell University teacher and researcher in Human Genetics, put it: "The twentieth century was the century of the gene.  The twenty-first will be the century of the brain.  With respect to neuroscience, we are right now about where genetics was in 1917 -- we know a lot of the descriptive features of the brain, some of the underlying biochemistry, and other than that, some rather sketchy details about this and that.  We don't yet have a coherent picture of how the brain works.

"But we're heading that direction.  It is only a matter of time till we have a working model of the mind.  How tremendously exciting!"

Monday, April 24, 2017

Reality blindness

I read an article on CNN yesterday that really pissed me off, something that seems to be happening more and more lately.

The article, entitled "Denying Climate Change As the Seas Around Them Rise" (by Ed Lavandera and Jason Morris), describes the effects of climate change in my home state of Louisiana, which include the loss of entire communities to rising seas and coastline erosion.  An example is the village of Isle de Jean Charles, mostly inhabited by members of the Biloxi-Chitimacha tribe, which has basically ceased to exist in the last ten years.

But there are people who will deny what is right in front of their faces, and they include one Leo Dotson of Cameron Parish.  Dotson, a fisherman and owner of a seafood company, "turned red in the face" when the reporters from CNN asked him about climate change.  Dotson said:
I work outside in the weather on a boat, and it's all pretty much been the same for me.  The climate is exactly the same as when I was a kid.  Summers hot, winters cold...  [Climate change] doesn't concern me...  What is science?  Science is an educated guess.  What if they guess wrong?  There's just as much chance for them to be wrong as there is for them to be right.  If [a scientist] was 500 years old, and he told me it's changed, I would probably believe him.  But in my lifetime, I didn't see any change.
Well, you know what, Mr. Dotson?  I'm kind of red in the face right now, myself.  Because your statements go way past ignorance.  Ignorance can be forgiven, and it can be cured.  What you've said falls into the category of what my dad -- also a fisherman, and also a native and life-long resident of Louisiana -- called "just plain stupid."

Science is not an educated guess, and there is not "just as much chance for them to be wrong as there is for them to be right."  Climate scientists are not "guessing" about climate change.  Because of the controversy, the claim has been tested every which way from Sunday, and every scrap of evidence we have -- sea level rise, Arctic and Antarctic ice melt, earlier migration times for birds, earlier flowering times for plants, more extreme weather events including droughts, heat waves, and storms -- supports the conclusion that the climate is shifting dramatically, and that we've only seen the beginning.


At this point, the more educated science deniers usually bring up the fact that there have been times when the scientific establishment got it wrong, and was only proven wrong, sometimes years later.  Here are a few examples:
  1. Darwin's theory of evolution, which overturned our understanding of how species change over time.
  2. Mendel's experiments in genetics, later bolstered by the discovery of the role of DNA and chromosomes in heredity.  Prior to Mendel's time, our understanding of heredity was goofy at best (consider the idea, still prevalent in fairy tales, of "royal blood" and the capacity for ruling being inheritable, which you'd think any number of monarchs who were stupid, incompetent, insane, or all three would have been sufficient to put to rest).
  3. Alfred Wegener's postulation of "continental drift" in 1912, which was originally ridiculed so much that poor Wegener was forced to retreat in disarray.  The fact that he was right wasn't demonstrated for another forty years, through the work of such luminaries in geology as Harry Hess, Tuzo Wilson, Fred Vine, Drummond Matthews, and others.
  4. The "germ theory of disease," proposed by Marcus von Plenciz in 1762, and which wasn't widely accepted until the work of Robert Koch and Louis Pasteur in the 1870s.
  5. Big Bang cosmology, which grew out of the work of astronomers Georges Lemaître and Edwin Hubble.
  6. Albert Einstein's discovery of relativity, and everything that came from it -- the speed of light as an ultimate universal speed limit, time dilation, and the relativity of simultaneity.
  7. The structure of the atom, a more-or-less correct model of which was first described by Niels Bohr, and later refined considerably by the development of quantum mechanics.
There.  Have I forgotten any major ones?  My point is that yes, prior to each of these, people (including scientists) believed some silly and/or wrong ideas about how the world works, and that there was considerable resistance in the scientific community to accepting what we now consider theories so solidly supported that they might as well be regarded as fact.  But you know why these stand out?

Because they're so infrequent.  If you count the start of the scientific view of the world as beginning some time during the Enlightenment -- say, 1750 or so -- that's 267 years in which a major model of the universe has been overturned and replaced by a new paradigm only seven times.  Mostly what science has done is to amass evidence supporting the theories we have -- genetics supporting evolution, the elucidation of DNA's structure by Franklin, Crick, and Watson supporting Mendel, the discovery of the 3K cosmic microwave background radiation by Arno Penzias and Robert Wilson supporting the Big Bang.

So don't blather at me about how "science gets it wrong as often as it gets it right."  That's bullshit.  If you honestly believe that, you better give up modern medicine and diagnostics, airplanes, the internal combustion engine, microwaves, the electricity production system, and the industrial processes that create damn near every product we use.

But you know what?  I don't think Dotson and other climate change deniers actually do believe that.  I doubt seriously whether Dotson would go in to his doctor for an x-ray, and when he gets the results say, "Oh, well.  It's equally likely that I have a broken arm or not, so what the hell?  Might as well not get a cast."  He doesn't honestly think that when he pulls the cord to start his boat motor, it's equally likely to start, not start, or explode.

No, he doesn't believe in climate change because it would require him to do something he doesn't want to do.  Maybe move.  Maybe change his job.  Maybe vote for someone other than the clods who currently are in charge of damn near every branch of government.  So because the result is unpleasant, it's easier for him to say, "ain't happening," and turn red in the face.

But the universe is under no obligation to conform to our desires.  Hell, if it was, I'd have a magic wand and a hoverboard.  It's just that I'm smart enough and mature enough to accept what's happening even if I don't like it, and people like Dotson -- and Lamar Smith, and Dana Rohrabacher, and James "Snowball" Inhofe, and Scott Pruitt, and Donald Trump -- apparently are not.

The problem is, there's not much we can do to fix this other than wait till Leo Dotson's house floats away.  Once people like him have convinced themselves of something, there's no changing it.

I just have to hope that our government officials aren't quite so intransigent.  It'd be nice to see them wake up to reality before the damage done to our planet is irrevocable.

Saturday, April 22, 2017

Poll avoidance

I'm lucky, being an outspoken atheist, that I live where I do.  The people in my area of upstate New York are generally pretty accepting of folks who are outside of the mainstream (although even we've got significant room for improvement).  The amount of harassment I've gotten over my lack of religion has, really, been pretty minimal, and mostly centered around my teaching of evolution in school and not my unbelief per se.

It's not like that everywhere.  In a lot of parts of the United States, religiosity in general, and Christianity in particular, are so ubiquitous that it's taken for granted.  In my home town of Lafayette, Louisiana, the question never was "do you go to church?", it was "what church do you go to?"  The couple of times I answered that with "I don't," I was met with a combination of bafflement and an immediate distancing, a cooling of the emotional temperature, a sense of "Oh -- you're not one of us."

So it's no wonder that so many atheists are "still in the closet."  The reactions of friends, family, and community are simply not worth it, even though the alternative is keeping a deeply important part of yourself hidden from the people in your life.  Of course, this leads to a more general problem -- the consistent undercounting of how many people actually are atheists, with the result that those of us who are open about it feel even more isolated and alone than we otherwise would.

[image courtesy of creator Jack Ryan and the Wikimedia Commons]

Current estimates from polls are that 3% of Americans self-identify as atheists, but there's reason to believe that this is a significant underestimate -- in other words, people are being untruthful to the pollsters about their own disbelief.  You might wonder why an anonymous poll conducted by a total stranger would still result in people lying about who they are, but it does.  Jesse Singal, over at The Science of Us, writes:
So if you’re an atheist and don’t live in one of America’s atheist-friendly enclaves, it might not be something you want to talk about — in fact you may have trained yourself to avoid those sorts of conversations altogether.  Now imagine a stranger calls you up out of the blue, says they’re from a polling organization, and asks about your religious beliefs.  Would you tell them you don’t have any?  There’s a lot of research suggesting you might not.  The so-called social-desirability bias, for example, is an idea that suggests that in polling contexts, people might not reveal things — racist beliefs are one of the more commonly studied examples — that might make them look bad in the eyes of others, even if others refers to only a single random person on the other end of the phone line.
As Singal points out, however, a new study by Will Gervais and Maxine B. Najle of the University of Kentucky might have come up with a way around that.  Gervais and Najle devised an interesting protocol for estimating the number of atheists without ever asking the question directly.  They gave one of two different questionnaires to 2,000 people.  Each had a list of statements that could be answered "true" or "false" -- all the respondents had to do was to tell the researcher how many of the statements were true, not which specific ones, thus (one would presume) removing a lot of the anxiety over admitting outright something that could be perceived negatively.  The first questionnaire was the control, and had statements like "I own a dog" and "I am a vegetarian."  The second had the same statements, plus one more: "I believe in God."  Since in any sufficiently large random sample, the same proportion of people should answer "true" to any given statement, any difference in the average count of true statements between the two groups has to be due to the added statement about belief -- which lets you estimate the fraction of believers (and thus of nonbelievers) without anyone having to confess anything.
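
The arithmetic behind this design -- often called an "unmatched count" or list experiment -- is simple enough to sketch in a few lines of Python.  The counts below are invented for illustration; they're not the study's data:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1000  # respondents per questionnaire

# Control group: count of "true" answers over five innocuous statements
# (e.g., "I own a dog"), each marked true with some probability.
control = rng.binomial(5, 0.5, n)

# Treatment group: the same five statements plus "I believe in God."
# Suppose 74% of respondents would privately mark that one true.
believes = rng.random(n) < 0.74
treatment = rng.binomial(5, 0.5, n) + believes

# No individual reveals which statements were true, but the difference in
# mean counts between groups estimates the share endorsing the extra item.
belief_rate = treatment.mean() - control.mean()
print(f"estimated belief in God: {belief_rate:.1%}")
print(f"estimated nonbelief:     {1 - belief_rate:.1%}")
```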

And there was a difference.  A significant one.  The authors write:
Widely-cited telephone polls (e.g., Gallup, Pew) suggest USA atheist prevalence of only 3-11%.  In contrast, our most credible indirect estimate is 26% (albeit with considerable estimate and method uncertainty).  Our data and model predict that atheist prevalence exceeds 11% with greater than .99 probability, and exceeds 20% with roughly .8 probability.  Prevalence estimates of 11% were even less credible than estimates of 40%, and all intermediate estimates were more credible.
So it looks like there are a lot more of us out there than anyone would have thought.  I, for one, find that simultaneously comforting and distressing.  Isn't it sad that we still live in a world where belonging to a stigmatized group -- being LGBT, being a minority, being atheist -- is still looked upon so negatively that there are that many people who feel like they need to hide?  I'm not in any way criticizing the decision to stay in the closet; were I still living in the town where I was raised, I might well have made the same choice, and I realize every day how lucky I am to live in a place where people (for the most part) accept who I am.

But perhaps this study will be a first step toward atheists feeling more empowered to speak up.  There's something to the "safety in numbers" principle.  It'd be nice if people would just be kind and non-judgmental regardless, even to people who are different, but when I look at the news I realize how idealistic that is.  Better, I suppose, to convince people of the truth that we're more numerous than you'd think -- and not willing to pretend any more to a belief system we don't share.

Friday, April 21, 2017

Run for your life

Back when I was in my thirties, I got into running in a big way.

I used to do four to five miles a day, pretty much no matter what the weather, all the more impressive because I live in upstate New York, where warm weather is in woefully short supply (this year, summer is scheduled for the second Thursday in July).  But unless we were knee-deep in snow, I was out there.

Then, in my forties, I began to develop some joint problems, which were (and still are) of unknown origin, and those only resolved a couple of years ago.  So I'm back at it, and in fact have my first semi-competitive 5K of 2017 three weeks from now.

What's funny is that while I'm running, mostly what I'm thinking about is, "merciful heavens, why do I do this to myself?"  My quads and calves ache, I'm breathing hard, and all I want is that blessed sight of the Finish Line.  But afterwards... all I can say is that the feeling is euphoric.  Despite being tired and sweaty and having spaghetti legs, my general feeling is "Woo hoo!  Gotta do that again soon!"

So what's going on here?  Am I some kind of masochist who gets his jollies out of being miserable?  Or am I like the guy who pounds his head on the wall because it feels so good when he stops?

If so, I'm not alone -- and neuroscientists have just taken the first steps toward figuring out why.

Me with a medal and some serious post-race euphoria

Apparently, part of what's going on is that vigorous aerobic exercise stimulates the growth of neurons in the brain.  It was long the conventional wisdom that humans couldn't do that; you had a certain number of neurons at adulthood, and afterwards the number would only go one way.  You could only affect the rate at which the neurons declined, based on such things as alcohol and drug use, concussions, and the number of times you listen to Ken Ham trying to defend why Noah's Ark is actually real science.

But according to Karen Postal, president of the American Academy of Clinical Neuropsychology, that may not be true -- and one thing that affects not only preserving the gray matter you have, but increasing it, is exercise.  "If you are exercising so that you sweat — about 30 to 40 minutes — new brain cells are being born," said Postal, who is a runner herself.  "And it just happens to be in that memory area...  That's it.  That's the only trigger that we know about."

Other researchers have gone one step further than that.  Emily E. Bernstein and Richard J. McNally of Harvard University recently published a study called "Acute Aerobic Exercise Helps Overcome Emotion Regulation Deficits," which shows that our ability to modulate our negative emotions -- especially grief, helplessness, and anxiety -- can be improved dramatically by the simple expedient of going for a half-hour's run.  The authors write:
Although colloquial wisdom and some studies suggest an association between regular aerobic exercise and emotional well-being, the nature of this link remains poorly understood.  We hypothesised that aerobic exercise may change the way people respond to their emotions.  Specifically, we tested whether individuals experiencing difficulties with emotion regulation would benefit from a previous session of exercise and show swifter recovery than their counterparts who did not exercise.  Participants completed measures of emotion response tendencies, mood, and anxiety, and were randomly assigned to either stretch or jog for 30 minutes.  All participants then underwent the same negative and positive mood inductions, and reported their emotional responses... Interactions revealed that aerobic exercise attenuated [negative] effects.  Moderate aerobic exercise may help attenuate negative emotions for participants initially experiencing regulatory difficulties.  
This is no surprise to me, nor, I suspect, to anyone else who runs.  The process creates space in your mind, space that can then act as a springboard to creativity.  It's like one of my favorite authors, Haruki Murakami, says in his paean to the sport, What I Talk About When I Talk About Running: "I just run.  I run in void.  Or maybe I should put it the other way: I run in order to acquire a void."

Or as Melissa Dahl said in her piece in The Science of Us called "Why Running Helps to Clear Your Mind," "[T]here’s another big mental benefit to gain from running, one that scientists haven’t quite yet managed to pin down to poke at and study: the wonderful way your mind drifts here and there as the miles go by.  Mindfulness, or being here now, is a wonderful thing, and there is a seemingly ever-growing stack of scientific evidence showing the good it can bring to your life.  And yet mindlessness — daydreaming, or getting lost in your own weird thoughts — is important, too."

Which is exactly it.  And with that, I think I'll wind up here.  Maybe go for a run.  And after that, who knows what I'll do with all those extra neurons?

Thursday, April 20, 2017

Beastly goings-on

Lately, it's seemed like the leaders of the conservative Christian Right have been going out of their way to make patently ridiculous statements.

As I commented a couple of weeks ago, we've had such pinnacles of clear thought as Pat Robertson babbling about how he hates being dominated by homosexuals, and Mary Colbert telling us that if we don't support Donald Trump, god will curse our grandchildren.  Even British Prime Minister Theresa May got in on the action, saying that Cadbury's decision to call this year's big event "The Great British Egg Hunt" is a deliberate slap in the face to Christians everywhere, because it didn't mention Easter, and we all know how central chocolate eggs are to the story of Jesus's resurrection.

Not to be outdone, today we have another luminary in the fundamentalist world, rabidly anti-gay Pastor Kevin Swanson, ranting on his radio show about the new live-action movie Beauty and the Beast.  But it's probably not about what you're thinking -- that the movie features a gay character.

No, that's small potatoes, and has been the subject of horrified diatribes from damn near every spokesperson for the Religious Right.  Swanson obviously disapproves of the gay character; but even more than that, he hates Beauty and the Beast...

... because it promotes inter-species mating.


Sadly, I'm not making this up.  Here's the direct quote:
Liberals [seem] to be okay with this inter-species breeding, and have been ever since Star Trek was on the air...  Christians, I don’t believe, can allow for this.  Humans are made in the image of God.  Humans are assigned a spouse which happens to be a member of the opposite sex.  Friends, God’s law forbids it…  Christians should not allow for this, man.  We cannot allow for humans to interbreed with other species. It’s just wrong, wrong, wrong.  It’s confusion, it’s unnatural...  We are in some of the most radical, most anti-biblical, the most immoral, the most unethical, the most wicked sexual environment that the world has ever known, right now.
Okay, can we just establish a few facts, here?
  1. Beauty and the Beast is fiction.
  2. So is Star Trek, although the way things are going down here on Earth, I'm ready for Zefram Cochrane to invent the warp drive so I can warp right the fuck out of here.
  3. Inter-species matings on Star Trek produced, to name three, Deanna Troi, Mr. Spock, and B'Elanna Torres.  I'd take any of the three over Kevin Swanson in a heartbeat.
  4. The character of the Beast in Beauty and the Beast is human.  In fact, that is sort of the whole point of the movie.  He's under a curse to look beastly, but the idea is that underneath, he's still human.
  5. Belle and the Beast don't actually have sex until the curse is broken and they're married, so even if we're accepting Swanson's message at face value, I'm not sure what there is to complain about.  There was beast/human dancing and beast/human singing and lots of beast/human talking in the movie, but no beast/human nookie. 
  6. As far as I can see, here in the real world things have not gotten a lot more wicked and immoral in the sex department lately.  People have always enjoyed Doing It, and what kind of Doing It they enjoy has always had substantial variation.  What we're moving towards -- not nearly fast enough, in my opinion -- is a place where no one can tell you how you should Do It, nor with whom, nor what your rights should be based around any such matters.
  7. In general, there's very little inter-species breeding in the natural world anyhow, because it doesn't produce offspring.  Actually, that's sort of the biological definition of "species."  A few closely-related species can manage -- horses and donkeys producing mules, for example -- but in general, it just doesn't work, and even in the case of mules, they're usually sterile.  But I wouldn't expect that kind of understanding of biology from a guy who thinks that Noah toddled off to Australia to pick up a pair of wombats while he was taking a break from building an enormous boat in the deserts of the Middle East by hand, then toddled back over to Australia to drop them off when the flood waters magically receded down a big drain in the ocean floor or something.
Of course, I always get a little suspicious when these ministerial types start railing against specific behaviors over and over.  The way things have been going, I wouldn't be surprised if Swanson's demented rant about bestiality in a Disney movie means he'll get arrested next month for having sex with an aardvark or something.

Anyhow, that's our latest salvo from the ultra-Christian wacko fringe.  I probably should simply stop commenting on these people, because they seem to be in some sort of bizarre contest to see which one can make the most completely idiotic statement.

On the other hand, the fact is that a significant fraction of Americans still listen to them.  So maybe it's worthwhile after all.  Although I doubt seriously whether the kind of people who are willing to boycott Beauty and the Beast because of Kevin Swanson are the same ones who'll make their way over here to Skeptophilia.  But you never know.

Wednesday, April 19, 2017

Alex Jones vs. the chickens

Every so often, there is justice in the world.

This time, the fabled chickens coming home to roost are casting their beady eyes on none other than Alex Jones, that purveyor of wacko fringe conspiracy theories about everything from the New World Order to "Pizzagate."  His wife, Kelly Jones, filed for divorce in 2015, and they are now in a custody battle over their three children.  Understandably, the fact that Alex Jones gives every evidence of being a raving maniac came up more than once.

"He’s not a stable person," Kelly Jones said in court.  "He says he wants to break Alec Baldwin’s neck.  He wants J Lo to get raped...  He broadcasts from home.  The children are there, watching him broadcast."

Which would certainly be enough for me, were I in her shoes.

Alex Jones's lawyer, Randall Wilhite, responded with an approach that strikes me as risky; he claims that Jones doesn't actually believe what he's saying.  "He's playing a character," Wilhite said. "He's a performance artist...  Using his on-air Infowars persona to evaluate him as a father would be like judging Jack Nicholson in a custody dispute based on his performance as the Joker in Batman."


Yes, well, no one is claiming that what the Joker says has any connection to reality, whereas there are lots of people who believe everything Alex Jones says, not least the President of the United States.  In fact, Donald Trump appeared on Infowars last year, and told Jones, "Your reputation is amazing.  I will not let you down."

That connection has only grown stronger since Trump won the election.  Two weeks ago, Jones said on air that Trump had invited him to Mar-a-Lago, but Jones had to respectfully decline "due to family obligations."

"I'm still in regular telephone contact with the president," Jones said.  "But I must apologize, because I can't always answer the phone when he calls."

Trump's not the only one who takes Jones seriously.  Just last week, Lucy Richards of Fort Lauderdale, Florida, was arrested after she missed her court date stemming from charges of making death threats to Leonard Pozner, whose six-year-old son Noah died in the Sandy Hook massacre.  Guess why Richards threatened Pozner?

She believed that the Sandy Hook killings were a government-staged "false flag," that no children were killed, and that the grieving parents were "crisis actors" who had been hired to play the parts of bereaved family members of the supposed murdered children.  She wanted Pozner to confess that he was a government plant, and 'fess up that he didn't actually have a son named Noah.

All of which she found out by listening to Infowars and other alt-right conspiracy sites.

Pozner himself said he'd like to be at Jones's trial.  "I wish I could be there in the courtroom to stare him down to remind him of how he’s throwing salt on a wound," Pozner said, "and so he can remember how he handed out salt for other people to throw on mine."

As for Jones, you'd think the threat of losing custody of his children would be sufficient to get him to reconsider his loony on-air persona, whether or not he actually believes what he's saying.  But no: just last Friday, Jones had as a guest alt-right spokesperson Mike Cernovich (himself the focus of some scrutiny because of some horrific statements he made to the effect that most cases of rape are false accusations).  On this show, Jones and Cernovich discussed why the Obamas were in French Polynesia, and came to the conclusion that it's not because it's a nice place for a vacation, it's because French Polynesia doesn't have an extradition treaty with the United States.  "Notice he’s staying out of the U.S. right as they move to try to overthrow Trump," Jones said.  About the Obamas' daughters, Sasha and Malia, Jones said, "The word is those are not even his kids."

"The word is."  Meaning "a goofy idea that Alex Jones just pulled out of his ass."

So apparently Jones doesn't think he's got anything to worry about regarding the upcoming custody case, even though if he wins it, he'll be effectively saying under oath "Your Honor, I am a big fat liar."  It's to be hoped that the judge won't buy this, and will slap him down hard, as he's richly deserved for some time now.  But the sad truth is that even if he does win -- in fact, even if he stood in the middle of Times Square and yelled, "Nothing I have ever said on air is the truth!  I lie every time I open my mouth!", it wouldn't diminish his popularity or trust amongst his listeners one bit.  Look at Trump's supporters; the man seems genetically incapable of uttering a true statement or living up to any of his campaign promises, but the diehards still consider him the next best thing to the Second Coming of Christ.  

Hell, they said Bill Clinton was slick.  I recall one comedian saying that Clinton could stand right in front of you and say, "I am not here," and everyone would look shocked and say, "Where'd he go?"  But Clinton was bush league compared to either Trump or Jones.  The fact that Trump has a significant fraction of American voters convinced he's the Anointed One of God, despite being the only person I've ever seen who embodies all Seven Deadly Sins at the same time, is evidence of how fact-proof people have become.

And as for Jones, I am certain that however the custody trial comes out, he won't lose a single listener, and he'll be right there to launch the next round of horrible rumors and conspiracy theories.  Even if the chickens come home to roost, Jones probably won't have any difficulty converting most of them to fricassée.

Tuesday, April 18, 2017

The disappearance of Bruno

UFO enthusiasts are currently in a tizzy over the disappearance last week of a university student from Rio Branco, Brazil, who left behind a bizarre video about the 16th-century philosopher, scientist, and theologian Giordano Bruno and a room whose walls are covered with esoteric symbols.

The student's name is Bruno Borges (I wondered if his first name was in honor of Giordano, or whether it was a coincidence; of course, in the minds of the UFO conspiracy theorists, nothing is a coincidence).  He apparently had a reputation as a bit of an odd duck even prior to his disappearance.  He was obsessed with aliens, and his fascination with the earlier Bruno came from the fact that the Italian philosopher/scientist was one of the first to speculate that other planets -- even planets around other stars -- might harbor life.  Borges hinted that Bruno's execution at the hands of the Inquisition was to keep him silent about the reality of aliens, when in reality it was a garden-variety heresy prosecution.  The church made eight accusations, claiming that Bruno was guilty of:
  • holding opinions contrary to the Catholic faith and speaking against it and its ministers
  • holding opinions contrary to the Catholic faith about the Trinity, divinity of Christ, and Incarnation
  • holding opinions contrary to the Catholic faith pertaining to Jesus as Christ
  • holding opinions contrary to the Catholic faith regarding the virginity of Mary, mother of Jesus
  • holding opinions contrary to the Catholic faith about both Transubstantiation and Mass
  • claiming the existence of a plurality of worlds and their eternity
  • believing in metempsychosis and in the transmigration of the human soul into brutes
  • dealing in magics and divination
Given the intolerance of the time, any one of these would have been sufficient to condemn him, but the Catholic Church is nothing if not thorough.  Bruno was sentenced to be burned at the stake, and supposedly upon hearing his fate made a rude gesture at the judges and said, "Maiori forsan cum timore sententiam in me fertis quam ego accipiam" ("Perhaps you pronounce this sentence against me with greater fear than I receive it"), which ranks right up there with Galileo's "Eppur si muove" as one of the most elegant "fuck you" statements ever delivered.

I suppose it's understandable that Borges thought Bruno was a pretty cool guy.  A lot of us science types do, although that admiration might be misplaced.  Hank Campbell writes over at The Federalist:
Bruno only agreed with Copernicus because he worshiped the Egyptian God Thoth and believed in Hermetism and its adoration of the sun as the center of the universe.  Both Hermes and Thoth were gods of…magic. 
The church and science did not agree with Bruno that pygmies came from a “second Adam” or that Native Americans had no souls, but they were also not going to kill him over it.  There is no evidence his “science” came up at any time.  He was imprisoned for a decade because the church wanted him to just recant his claims that Hermetism was the one true religion and then they could send him on his way.  When he spent a decade insisting it was fact, he was convicted of Arianism and occult practices, not advocating science.
So right off, we're on shaky ground, not that this was ever in doubt.  In any case, between Borges's devotion to Bruno and his fascination with aliens, he apparently went a little off the deep end.  He left behind over a dozen bound books, mostly written in code, and only a few of which have been deciphered.  Here's a sample passage from one of the ones that has been decrypted:
It is easy to accept what you have been taught since childhood and what is wrong.  It is difficult, as an adult, to understand that you were wrongly taught what you suspected was correct since you were a child.  In other words, if you fit into the system, your behaviour will be determined, making you at the mercy of beliefs already provided and well established in dogmas and rituals, with the masses.
Which is standard conspiracy theory fare.  He wouldn't tell his parents or his sister what he was up to, only that he was working on fourteen books that would "change mankind in a good way."  Besides the symbols painted on his walls, he also had a portrait of himself next to an alien:

Borges's apartment wall, showing the symbols, writing, and the portrait of him with a friend

Borges has now been missing for over a week, and his family is understandably frantic.  The UFO/conspiracy world is also freaking out, but for a different reason; they think that Borges knew too much (in this view of the world, people are always "finding out too much" and having to be dealt with), and either the people who don't want us to know about aliens, or else the aliens themselves, have kidnapped him.

But the whole thing sounds to me like the story of a delusional young man whose disappearance is a matter for the police, not for Fox Mulder and Dana Scully.  It's sad, but I'm guessing that aliens had nothing to do with it.  Of course, try telling that to the folks over at the r/conspiracy subreddit, where such a statement simply confirms that I'm one of "the two s's" -- sheeple (dupe) or shill (complicit).  I'll leave it to wiser heads than mine to determine which is the more likely in my case.

Monday, April 17, 2017

Tall tales of Don Juan

When I was in eleventh grade, I took a semester-long class called Introduction to Psychology.  The teacher was Dr. Loren Farmer -- I never found out if he actually had a Ph.D., or if people simply called him "Dr." Farmer because of his air of erudition.

The class was taught in an unorthodox fashion, to say the least.  Dr. Farmer was pretty counterculture, especially considering that this was Louisiana in the 1970s.  He stood on no ceremony at all; we were allowed to sit wherever we liked (my favorite perch was on a wide bookcase by the window), and class was more of a free-roaming discussion than the usual chalk-and-talk typical of high school back then.  Even his tests were odd; his final exam offered a choice of ten or so short-answer/essay questions, of which we were to answer seven, and I recall that one of them was "Draw and interpret three mandalas."  (I elected not to do that one.  My ability to sling arcane-sounding bullshit was and is highly developed, but my artistic ability pretty much stalled out in third grade, and I didn't think I could pull it off.)

Some time around the middle of the semester, he instructed us to go buy a copy of a book that would be assigned reading over the following few weeks.  The book was The Teachings of Don Juan: A Yaqui Way of Knowledge, by Carlos Castaneda.  I had never heard of it, but I dutifully purchased the book.


I was nothing short of astonished when it turned out to be about the use of hallucinogenic drugs.  Castaneda tells the story of his apprenticeship to Don Juan Matus, a Yaqui native from Mexico, wherein he was given peyote, Psilocybe mushrooms, and Datura (Jimson weed), inducing wild visions that Don Juan said weren't hallucinations; they were glimpses of an "alternate reality" that sorcerers could use to gain power and knowledge.  Castaneda starts out doubtful, but eventually goes all-in -- and in fact, wrote one sequel after another describing his journey deeper and deeper into the world of the brujo.

I was captivated by Castaneda's story.  I read the sequel to Teachings, A Separate Reality.  The third one, Journey to Ixtlan, was even better.  Then I got to the fourth one, Tales of Power, and I began to go, "Hmmm."  Something about the story seemed off to me, as if he'd gone from recounting his real experiences to simply making shit up.  I made it through book five, The Second Ring of Power, and the feeling intensified.  About two chapters into book six, The Eagle's Gift, I gave the whole thing up as a bad job.

But something about the stories continued to fascinate me.  The best parts -- especially his terrifying vision of a bridge to another world in the fog at night in A Separate Reality, and his witnessing a battle of power in Journey to Ixtlan -- have a mythic quality that is compelling.  But even apart from the supernatural aspects, which I predictably don't buy, the sense that the books were the product of a guy trying to pull a fast one on his readers left me simultaneously angry and disgusted.

I discovered that I'm not alone in that reaction.  Richard de Mille (son of Cecil), an anthropologist and writer, wrote a pair of analyses of Castaneda's books, Castaneda's Journey and The Don Juan Papers, which I just finished reading a few days ago -- thus my resurgence of interest in the subject.  De Mille pounced on something that had been in the back of my mind ever since reading Journey to Ixtlan: that it would be instructive to compare the timelines of the first three books, since Ixtlan overlaps the years covered by the first two, Teachings and A Separate Reality.

And what de Mille found is that the books are full of subtle internal contradictions that one would never discover without doing what he did, which is to lay out all of the carefully-dated supposed journal entries Castaneda gives us in the first three books.  Among the more glaring errors is that Castaneda is introduced for the first time to a major character, the brujo Don Genaro, twice -- over five years apart.   Also separated by years are events in which Castaneda saw (the word in italics is used by Castaneda to describe a mystical sort of vision in which everything looks different -- humans, for instance, look like bundles of fibers made of light) and in which Don Juan tells his apprentice "you still have never seen."

Worse still is the fact that Ixtlan recounts a dozen or so mind-blowing experiences that allegedly occurred during the same time period as Teachings -- and yet which Castaneda didn't think were important enough to include in his first account.  Add to that the point de Mille makes in The Don Juan Papers that not only do the Yaqui not use hallucinogens in their rituals, Don Juan himself never gives Castaneda a single Yaqui name -- not one -- for any plant, animal, place, or thing they see.  Then there's the difficulty pointed out by anthropologist Hans Sebald, of Arizona State University: Castaneda claims that he and Don Juan went blithely wandering around the Sonoran desert in midsummer, often with little in the way of food or water, yet he never once mentions any discomfort from temperatures that would have hovered around 110°F at midday.

The conclusion of de Mille and others is that Castaneda made the whole thing up from start to finish, and the books are a combination of scraps of esoteric lore he'd picked up in the library at UCLA and his own imagination.  There was no Don Juan, no Don Genaro, no glow-in-the-dark coyote that spoke to the author at the end of Ixtlan.  Distressing, then, that de Mille's rebuttals -- published in 1976 and 1980, respectively -- didn't stop Castaneda from amassing a huge, and devoted, following.  He founded a cult called "Tensegrity" which purported to teach acolytes the secrets of seeing Don Juan's alternate reality.  He surrounded himself with a group of women called "the nagual women" (unkinder observers called them the "five witches") who did his bidding -- Florinda Donner-Grau, Taisha Abelar, Patricia Partin, Amalia Marquez, and Kylie Lundahl -- all of whom vanished shortly after Castaneda died of liver cancer in 1998.  No trace of any of them has been found except for Partin, whose skeleton was discovered in Death Valley in 2006; it's thought that all five committed suicide after their leader died.

So what began as a hoax ended up as a dangerous cult.  Castaneda seems to have started the story as a way of pulling the wool over the eyes of his advisers in the anthropology department at UCLA (it worked, given that Journey to Ixtlan is essentially identical to his doctoral dissertation), but as so often happens, fame went to his head and he moved from telling tall tales about an alleged Yaqui shaman to using the people who bought into his philosophy as a way to get money, sex, and power.

And you can imagine how pissed off this makes actual Native Americans.  Castaneda hijacked and mangled their beliefs into something unrecognizable -- placing his books alongside Seven Arrows as yet another way that non-Natives have appropriated and misrepresented Native culture.  (If you've not heard of Seven Arrows, by Hyemeyohsts Storm, it's a mystical mishmash containing about 10% actual facts about the Cheyenne and 90% made-up gobbledygook.  Storm himself -- his actual name is Arthur -- claims to be half Cheyenne and to have grown up on the reservation, but the Cheyenne tribal authorities say they've never heard of him.)

What's saddest about all of this is that Castaneda could have simply written "fiction" after the title of his books, and they'd have lost nothing in impact.  It's not that fiction has nothing to teach us, gives us no inspiration, doesn't consider the profound.  In fact, I would argue that some of the most poignant lessons we learn come from the subtexts of the fiction we read.  (I have tried to weave that into my own writing, especially my novel Sephirot, which is about one man's Hero's Journey placed in the context of Jewish mystical lore.)

But instead Castaneda lied to his readers.  There's no kinder way to put it.  He told us that it was all real.  Instead of being remembered for an excellent work of inspirational fiction, he is relegated to the ignominious ranks of clever hoaxers.  (Or at least should be; de Mille says there are still lots of college classes in which Castaneda's books are required reading -- not as an example of an anthropological hoax, but as real field work in ethnology and belief.)

So however entertaining, and even inspiring, his books are, the whole thing leaves me with a bad taste in my mouth.  In short, truth matters.  And the fact is, Carlos Castaneda was nothing more than a sly and charismatic liar.

Saturday, April 15, 2017

If the spirit is willing

Turns out, you have to be careful what you label "non-fiction."

Warner Brothers Studios is currently embroiled in a lawsuit that falls under what lawyers technically call "being between a rock and a hard place."  It all started when they released their horror film The Conjuring in 2013, which is all about fun and entertaining things like demons and curses and exorcisms and parents attempting to kill their children.  The Conjuring was based on the book The Demonologist, by Gerald Brittle, which told the story of two demon hunters named Ed and Lorraine Warren, who go from town to town rousting out evil spirits, which is apparently lucrative work these days.

Brittle had an exclusive contract to tell the Warrens' story, and when he found out that Warner Brothers had made a movie based on it, he told them they'd violated that agreement and (in essence) made a movie based on his book without permission or compensation.  Warner Brothers fired back that Brittle's book is labeled "non-fiction" -- meaning that he was claiming the events were true, and therefore part of the "historical record."  As such, they're open for anyone to exploit, and such accounts would come under fair use law.

This is where it gets interesting.  Brittle's attorney, Patrick C. Henry, says that Brittle now knows that the Warrens' account is "a pack of lies."  Further, Henry says that it was impossible for Warner Brothers to make the movie they did without basing it almost entirely on Brittle's book.  "It is very hard to believe that a large conglomerate such as Warner Brothers, with their army of lawyers and who specializes in intellectual property rights deals, would not have found The Demonologist book or the deals related to it, or Brittle for that matter," Henry says.  "The only logical conclusion is that the studio knew about the Warren's agreement with Brittle but just assumed they would never get caught."

So Henry filed a lawsuit on Brittle's behalf, to the tune of $900 million.  To Warner Brothers' defense that you can't invoke intellectual property rights law over events that actually occurred, Henry had an interesting response.

If Warner Brothers is claiming that The Demonologist (and therefore the events in The Conjuring) are real, then Brittle will drop the lawsuit -- if Warner Brothers can offer up concrete proof of ghosts.

[image courtesy of the Wikimedia Commons]

This puts the film giant in a rather awkward position.  If they can prove ghosts are real, then the claim that the story in The Conjuring is non-fiction has at least some merit.  If they can't, it's fiction, and the studio is guilty of ripping off Brittle's work as the basis of their movie.

Sort of reminds me of the old method of determining whether someone is a witch.  They tie the accused hand and foot, and throw them in a pond.  If they drown, they're innocent.  If they survive, Satan was protecting them, and they're a witch, so you burn them at the stake.

Of course, the difference here is that Brittle is just one guy, and Warner Brothers is a multi-million-dollar corporation that can afford huge legal expenses without any particular problem.  Although Brittle's defense is clever, I'd be willing to put money on the lawsuit being settled out of court -- whether or not Warner Brothers can produce a ghost.

On the other hand, maybe they will find a spirit that's willing.  Then not only will Brittle have no choice but to drop his lawsuit, Warner Brothers will be in good shape to win the James Randi Million Dollar Challenge.  So stay tuned.

Friday, April 14, 2017

Bleach supplement

New from the A Little Bit of Knowledge Is Dangerous department, we have a guy in the UK who is selling a health supplement that contains "stabilized negative ions of oxygen."

The product is named, with no apparent irony intended, "Aerobic Oxygen" -- presumably to distinguish it from all of that anaerobic oxygen floating around out there.  The company that produces it, Vitalox, says that it is "the foundation of good health," and that sixty drops a day, "in any cold drink," plus a few more added to your toothpaste and mouthwash, will bring you to the peak of health.  "Cellulite," we're told, is what happens "when fat cells are starved of oxygen."  (Never mind that cellulite is simply fat in which the surrounding connective tissue has developed minor hernias, but is otherwise indistinguishable from regular old fat.  But why start trying to be scientifically accurate now?)

Oh, and consuming "Aerobic Oxygen" will reduce your likelihood of developing cancer, heart disease, and high blood pressure.

You might be wondering what's in this miracle drug.  I know I was.  The ingredients list reads as follows: "Contains purified ionised water, sodium chloride 1.6 micrograms per serving, Stabilised Oxygen molecules."  Which doesn't tell us much beyond what the sales pitch already said.  What sort of additive would provide "stabilized oxygen"?  It's not simply dissolved oxygen; that would diffuse out as soon as you opened the bottle, and in any case the solubility of oxygen in water is so low that you can't dissolve enough of it to make a difference to anyone but a fish.
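A quick back-of-envelope check makes the point (assuming the standard figures of roughly 8 milligrams of dissolved O2 per liter of water at room temperature, and about 0.05 milliliters per drop): sixty drops is about 3 milliliters, which even at full saturation would carry around 0.024 milligrams of oxygen.  A single resting breath, by comparison, brings in on the order of 100 milligrams of O2.  So even if your stomach could absorb dissolved oxygen -- and as we'll see below, it can't -- the entire daily dose would amount to a few ten-thousandths of one breath.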

[image courtesy of the Wikimedia Commons]

So a chemist named Dan Cornwell, of King's College London, decided to test "Aerobic Oxygen" to see what was really in there.  And what he found was that the "stabilized oxygen" in the solution comes from a significant quantity of either sodium chlorite or sodium hypochlorite.

For the benefit of any non-chemistry types: sodium chlorite is not the same as sodium chloride.  Sodium chloride is table salt.  Sodium chlorite and sodium hypochlorite are both highly alkaline, reactive compounds whose main industrial use is as bleach.  Cornwell found that not only is "Aerobic Oxygen" a bleach solution, it has the same pH as Drano.

"The two main conclusions I can draw is that the Vitalox solution has a pH of about 13, putting it in the same region as concentrated household bleach – which contains sodium hypochlorite and sodium hydroxide – or an oven cleaner," Cornwell said.  "And when it combined with the potassium iodide it produced iodine, which shows that there’s a strong oxidizing reaction.  I’m not 100% sure of the nature of the oxidizing agent, but since it has a basic pH and gave a positive result with the iodine test it’s reasonable to say it’s probably sodium chlorite or something similar."

David Colquhoun, professor of pharmacology at University College London, was even more unequivocal.  "You don’t absorb oxygen through your stomach," Colquhoun said.  "There’s not the slightest reason to think it works for anything...  A few drops in a glass of water probably won't actually kill you, but that's a slim marketing claim."

But don't worry if the "Aerobic Oxygen" doesn't quite live up to its claims; Vitalox also has a "Spirituality Page," wherein we find out that "Our spirit is real and we all have one, like it or not.  Recognise it or not.  It needs feeding too or it will get sick and may even die."  One of the features of the "Spirituality Page" is "A Radio Stream to Wet [sic] Your Appetite," which is an odds-on contender for the funniest misspelling I've seen in ages.

So the whole "Aerobic Oxygen" thing is not only bullshit, it's potentially dangerous bullshit.  The site boils down to "drink our expensive bleach solution, and even if it doesn't kill you, your spirit won't be sick."  The take-home message here is, don't be taken in by fancy-sounding sort-of-sciencey-or-something verbiage and fatuous promises.  Do your research, and find out what you're thinking of ingesting before you buy it.  And that goes double when someone tells you they're selling you "aerobic oxygen."

Thursday, April 13, 2017

Far beyond tone-deaf

Is it just me, or do the members of the Trump administration have a really poor sense of timing?

At a gathering to launch Black History Month, Trump caused some serious head-scratching with his comment that Frederick Douglass "is doing an amazing job," which is doubly impressive given that Douglass died 122 years ago.  He declared April "Sexual Assault Awareness Month," and on April 4 said that conservative pundit Bill O'Reilly "should not have settled" five cases in which he was accused of sexual harassment or inappropriate behavior.  (In fact, he said, "I don't think Bill did anything wrong" -- which, considering that he's said that you have to "treat women like shit" and that it's okay to "grab them by the pussy," might not be the most weighty endorsement O'Reilly could have hoped for.)

Then, the White House released an official statement on Holocaust Remembrance Day -- and never once mentioned the Jews.  This was followed up by Sean Spicer's bizarre comment two days ago that "even Hitler didn't sink to the level of using chemical weapons," ignoring the fact that chemical weapons were used to gas six million Jews.

Which statement he made in the middle of Passover.

[image courtesy of the Wikimedia Commons]

Look, this goes way beyond tone-deafness.  This is beginning to look like a deliberate campaign to minimize the suffering of anyone who doesn't directly contribute to the Republican party.  These aren't "gaffes"; a "gaffe" is John Kerry describing his nuanced approach to support for the Iraq War as "I actually did vote for the $87 billion before I voted against it."  Okay, that was stupid and inarticulate, and he was the recipient of well-deserved ridicule for saying it.

But this?  This goes way beyond stupid and inarticulate.  In fact, even "insensitive and insulting" don't begin to cover it.

Fortunately -- if there is a "fortunately" in this situation -- the backlash against the latest crazy comment was immediate and blistering.  Here is a sampling of responses to Spicer's Hitler comment from Twitter:
This doesn't really answer the question about why doing it from a plane is worse than building gas chambers in death camps, of course. 
Sean Spicer just called concentration camps "Holocaust centers" and said Hitler didn't use chemical weapons despite Zyklon B.  Happy Passover, guys. 
A list of things that come to mind when you think of Hitler: (1) mustache; (2) gassed people. 
Hey, Sean?  "Clarify" doesn't mean "make way worse." 
Being Press Secretary for Donald Trump is hard.  But it's not as hard as he makes it look.
Then, there's my favorite one:
PEPSI: Check out this PR disaster.
UNITED: That's amateur hour.  Watch this!
SEAN SPICER: Hold my beer.
The most trenchant comment of all, however, was from Spencer Ackerman, blogger, writer, and editor at The Guardian.  Ackerman wrote: "Hitler was not 'using the gas on his own people,' says Sean Spicer, writing German Jews out of history."

Which appears to be what this is about -- marginalizing people who don't fit in with the Trumpian version of Volksgemeinschaft, or worse, pretending that they simply don't exist.  You make a statement like that once, you can pass it off as shooting from the hip, bobbling an opportunity to make an inclusive, insightful statement, having a brain fart.  Maybe even twice.

But four times?  This is beginning to look deliberate.

I don't mean to sound like a conspiracy theorist, here.  But this administration is establishing an appalling pattern of cultural and racial insensitivity.  If you needed further evidence, Trump himself went on record as saying, regarding the threats against Jewish schools, Nazi graffiti defacing Jewish community centers, and desecration of Jewish graveyards -- all of which have increased drastically since Trump's win last November -- that it might be the Jews themselves perpetrating the attacks.  It's not always anti-Semites doing these things, Trump said.  "Sometimes it’s the reverse, to make people — or to make others — look bad."

So tell me again how all of the other things that have come out of the mouths of Trump and his spokespeople have been simple "gaffes."

I've been trying to hold my outrage in abeyance.  I understand that it's hard to think on your feet every time you're on the spot.  People in the public eye are scrutinized, they inevitably make mistakes, and their missteps are played and replayed and analyzed and reanalyzed.  But if this administration wants to regain its credibility, it needs to give more than lip service to stopping this kind of shit.  We need more than Trump's statement that "Number one, I am the least anti-Semitic person you've ever seen in your entire life.  Number two, the least racist."  These last few months have left me feeling a little dubious on that point.  I think a fitting place to end is with a quote from Rabbi Jonah Dov Pesner, director of the Religious Action Center of Reform Judaism, who said: "President Trump has been inexcusably silent as this trend of anti-Semitism has continued and arguably accelerated.  The president of the United States must always be a voice against hate and for the values of religious freedom and inclusion that are the nation’s highest ideals."