Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Monday, March 2, 2015

A case of the blues

This post is brought to you by the color blue.

But not to worry: this is not about the damned dress, about which I have heard enough in the past week to last me several lifetimes.  This is about a different viral story, and one that has even less scientific validity than the whole what-color-is-this-dress thing.

I refer to a claim I've seen multiple times in the last few days: that because ancient languages had no word for the color blue, their speakers were unable to see blue.  Or, more accurately, that "blue" didn't exist in their mental and linguistic framework, so they were unable to see the difference between blue and colors that were nearby on the color wheel (especially green).  This, they say, explains Homer's "wine-dark seas," a metaphor I've always thought was as strange as it was evocative.  Kevin Loria, author of the article in question, writes:
In 1858, a scholar named William Gladstone, who later became the Prime Minister of Great Britain, noticed that this wasn't the only strange color description. Though the poet spends page after page describing the intricate details of clothing, armor, weaponry, facial features, animals, and more, his references to color are strange. Iron and sheep are violet, honey is green. 
So Gladstone decided to count the color references in the book. And while black is mentioned almost 200 times and white around 100, other colors are rare. Red is mentioned fewer than 15 times, and yellow and green fewer than 10. Gladstone started looking at other ancient Greek texts, and noticed the same thing — there was never anything described as "blue." The word didn't even exist. 
It seemed the Greeks lived in a murky and muddy world, devoid of color, mostly black and white and metallic, with occasional flashes of red or yellow.
Loria goes on to tell us about the studies of a linguist named Lazarus Geiger, who found that there was also no word for blue in ancient Hebrew, Icelandic, Chinese, and Sanskrit.

The conclusion?  Our ability to see blue is a recent innovation -- and has to do with our having a linguistic category to put it in.  Without a linguistic category, we can't discriminate between blue and other colors.

There are two problems, of increasing severity, with this hypothesis.

The first is that this is a specific case of what is called the Strong Sapir-Whorf Hypothesis -- that our experience of the world depends on our having a linguistic framework for it, and without that framework, we are unable to conceptualize categories for things.

The Strong Sapir-Whorf Hypothesis has a difficulty -- which is that it doesn't square with what we know about either physiology or language evolution.  In the case of color discrimination, the fact is that in the absence of a physiological impairment (e.g. colorblindness), most people have similar neural responses to observing colored regions.  There are a small number of people, mostly female, who are tetrachromats -- they have four, instead of three, color-sensing pigments in the retinas of their eyes, and have a much better sense of color discrimination than we trichromats do.  But the physiology would argue that mostly we all experience color the same way.

The Strong Sapir-Whorf Hypothesis, apropos of color discrimination, fails on a second level, however; Brent Berlin and Paul Kay found, back in the 1960s, that languages have a very predictable order in which they add color words to their lexicon.  It goes like this:
  1. All languages contain terms for black and white (or "dark" and "light").
  2. If a language contains three terms, then it contains a term for red.
  3. If a language contains four terms, then it contains a term for either green or yellow (but not both).
  4. If a language contains five terms, then it contains terms for both green and yellow.
  5. If a language contains six terms, then it contains a term for blue.
  6. If a language contains seven terms, then it contains a term for brown.
  7. If a language contains eight or more terms, then it contains terms for purple, pink, orange, and/or gray.
This agrees with the way our eyes perceive color, in terms of the peaks of cone sensitivity.  It is surmised that the greater the necessity to distinguish colors -- for example, telling poisonous fruits from edible ones in a rain forest -- the greater the complexity of words for different shades and hues.  In the ancient world, understandably enough, you coined words for things that had survival value, and pretty much ignored everything else.
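The implicational ordering above is simple enough to express as a lookup.  Here's a minimal Python sketch; the function name and the handling of the four-term "green or yellow" stage are my own simplifications of Berlin and Kay's scheme, which real languages don't always follow exactly:

```python
def predicted_terms(n: int) -> set:
    """Predict the basic color vocabulary of a language with n basic color terms,
    following the Berlin-Kay implicational hierarchy (idealized)."""
    if n < 2:
        raise ValueError("all attested languages have at least two basic color terms")
    terms = {"black", "white"}                # stage I: dark/light
    if n >= 3:
        terms.add("red")                      # stage II
    if n == 4:
        terms.add("green or yellow")          # stage III: one of the two, not both
    if n >= 5:
        terms.update({"green", "yellow"})     # stage IV: both present
    if n >= 6:
        terms.add("blue")                     # stage V: blue arrives only here
    if n >= 7:
        terms.add("brown")                    # stage VI
    if n >= 8:
        terms.update({"purple", "pink", "orange", "gray"})  # stage VII
    return terms
```

The point relevant to Loria's claim is visible in the code: "blue" doesn't appear until a language has accumulated six basic terms, so a small color vocabulary without a word for blue is exactly what the hierarchy predicts -- no change in perception required.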

But what about Loria's claim that many ancient languages didn't have words for blue? This brings us to the second problem with the article -- which is that this is simply an untrue statement.

The ancient Greeks had the word κυανός, which means "dark blue" -- specifically the color of the mineral azurite, which was highly prized for jewelry and statuary.  It's the root of our word "cyan."  And the Greeks weren't the only ones; the Hebrews had the word t'chalet, as in Numbers 15:38:
Speak unto the children of Israel, and bid them that they make them fringes in the borders of their garments throughout their generations, and that they put upon the fringe of the borders a ribband of blue.
Even today, the tallit, or Jewish prayer shawl, is always decorated in blue.  (And it is no coincidence that the Israeli flag is blue and white.)

What about Old Icelandic?  They had the word blár, which meant, you guessed it, "blue."  It's no new innovation, either; it's used in the 10th century Hrana saga hrings (The Saga Cycle of Hrani), in which we have the line, "Sýndi Hrani, hversu hún hafði rifit af honum klæði, og svo var hann víða blár og marinn," meaning, "Hrani showed that she had torn off his clothes, and he was widely blue and bruised."  (What?  It's an Icelandic saga.  You thought it was going to be about bunnies and rainbows?)

And I don't know any Chinese or Sanskrit, but I'd bet they had words for blue, too.  One of the most prized gemstones in the ancient world was lapis lazuli -- and according to an article posted at the website of the Gemological Institute of America:
Historians believe the link between humans and lapis lazuli stretches back more than 6,500 years. The gem was treasured by the ancient civilizations of Mesopotamia, Egypt, China, Greece, and Rome. They valued it for its vivid, exquisite color, and prized it as much as they prized other blue gems like sapphire and turquoise.
Hard to imagine why our distant ancestors would have done this if they saw blue stones as, in Loria's words, "murky and muddy... mostly black and white and metallic, with occasional flashes of red or yellow."

[image courtesy of photographer Hannes Grobe and the Wikimedia Commons]

So the whole premise is false, and it's based on zero biological evidence.  But that hasn't stopped it from being widely circulated, because as we've seen more than once, a curious and entertaining claim gets passed about even if it's entirely baseless.

I'll end here.  I'm feeling rather blue after all of this debunking business, and not gray or metallic at all.  And I suspect I'd feel that way even if I didn't have a word for it.

Saturday, February 28, 2015

A recall-the-idiots clause

There should be some kind of provision for removing from office politicians who unequivocally demonstrate that they are morons.

Being in public office is highly demanding, requires thinking on one's feet, and necessitates having a working knowledge of a great many different areas.  So I'm not expecting perfection, here.  Everyone makes missteps, and our leaders are no exception.  They should not be excoriated just for uttering a gaffe here or there.

But sometimes, there are examples of idiocy so egregious that they really should prompt a recall of some kind.  We need to be led by the best minds we have -- and if politicians demonstrate that their IQs are lower than their shoe sizes, they should be shown the door.

Because I'm guessing that some of these people are too stupid to find the door unassisted.

I bring this up because of four -- count 'em, four -- examples of deeply ingrained stupidity in our elected officials just from the past three days.  WARNING: put a pillow on your desk, because I'm guessing there will be multiple headdesks to follow.

Let's start with Representative Barry Loudermilk of Georgia, who was asked at a press conference about the link between vaccines and autism.  He responded that he chose not to vaccinate his own children.  "We didn't immunize," he said.  "They're healthy."  Which is analogous to a guy saying, "Seatbelts are unnecessary.  I drive without a seatbelt, and I'm still alive."

Oh, and have I mentioned that Representative Loudermilk is on the House Subcommittee for Science and Technology?

Then we have Idaho Representative Vito Barbieri, who was in a hearing about a bill that involved the use of telemedicine -- using tiny remote devices to give doctors information, such as a little camera that could be swallowed in place of a standard colonoscopy.  Barbieri asked a doctor who was giving testimony in the hearing if the same technique could be used to give doctors information about the fetus during pregnancy.

"Can this same procedure then be done in a pregnancy?" Barbieri asked.  "Swallowing a camera and helping the doctor determine what the situation is?"

The doctor patiently explained that that wouldn't work, because a woman's reproductive system isn't connected to her digestive tract.

"Fascinating," Barbieri replied.  "That makes sense."

Is it just me that finds it appalling that we're allowing men who don't know that a woman's uterus isn't connected to her colon to make decisions regarding women's health?

Even worse is New York Assemblyman Thomas Abinanti, who wants a bill passed blocking the use of GMOs in vaccines.  Here's the language he wants passed:
PROHIBITION ON THE USE OF VACCINES CONTAINING GENETICALLY MODIFIED ORGANISMS.
1. NO PERSON SHALL BE VACCINATED WITH A VACCINE THAT CONTAINS GENETICALLY MODIFIED ORGANISMS.
2. “GENETICALLY MODIFIED ORGANISM” SHALL MEAN: (A) AN ORGANISM THAT HAS BEEN ALTERED AT THE MOLECULAR OR CELLULAR LEVEL BY MEANS THAT ARE NOT POSSIBLE UNDER NATURAL CONDITIONS OR PROCESSES, INCLUDING RECOMBINANT DNA AND RNA TECHNIQUES, CELL FUSION, MICROENCAPSULATION, MACROENCAPSULATION, GENE DELETION AND DOUBLING, INTRODUCTION OF A FOREIGN GENE, AND A PROCESS THAT CHANGES THE POSITIONS OF GENES, OTHER THAN A MEANS CONSISTING EXCLUSIVELY OF BREEDING, CONJUGATION, FERMENTATION, HYBRIDIZATION, IN VITRO FERTILIZATION, OR TISSUE CULTURE; AND (B) AN ORGANISM MADE THROUGH SEXUAL OR ASEXUAL REPRODUCTION, OR BOTH, INVOLVING AN ORGANISM DESCRIBED IN PARAGRAPH (A) OF THIS SUBDIVISION, IF POSSESSING ANY OF THE ALTERED MOLECULAR OR CELLULAR CHARACTERISTICS OF THE ORGANISM SO DESCRIBED.
There's just one tiny problem with all of this.  The processes he's describing are the ones used to inactivate the viruses and bacteria used in vaccines.  If the bill passed, it would require that vaccines contain unmodified pathogens -- i.e., the strains of the microorganisms that cause disease.

Can't you hear what the doctors would have to tell parents?  "Just to let you know, Mrs. Fernwinkle.  One of the side effects of this tetanus vaccine is that your son will get lockjaw and die, because I'm injecting him with the tetanus bacteria itself."

But no one is a better candidate for the "You Are Too Stupid To Govern" award than Senator James Inhofe of Oklahoma, who is the chair of the Committee on the Environment and Public Works, and who this week brought a snowball into the Senate and threw it on the floor.  "In case we have forgotten, because we keep hearing that 2014 has been the warmest year on record, I ask the chair, 'You know what this is?'  It's a snowball, from outside here.  So it's very, very cold out.  Very unseasonable."

Which makes me want to scream, "There is a difference between weather and climate, you illiterate moron!"  "It just snowed" is not an argument against climate change, just as "I have lots of money" is not an argument against world poverty.  And ironically, on the same day Inhofe did his idiotic demonstration, scientists at Berkeley National Laboratory in California announced that they had data directly correlating carbon dioxide concentrations in the atmosphere with the trapping of thermal energy -- something that has been demonstrated many times in the lab, but never under ordinary conditions out in the environment.  The data -- which has been collected over a period of ten years from two widely-separated sites -- agrees exactly with climate-change models that nitwits like Inhofe think have been manufactured by evil scientists for their own personal gain.


So I really think we need to have an option for recall.  An "I'm sorry, we have to hold a revote, because we accidentally elected a blithering idiot to public office" clause.

I mean, seriously: do we want these people in charge of making decisions about our future?

Now, I have to go.  I've got an ice pack and some aspirin waiting for me.  My forehead hurts.

Friday, February 27, 2015

Civil disobedience as a moral imperative

Let me just say at the outset that I'm a law-abiding sort.  With the exception of getting pulled over twice for driving too fast, I've never had a single unpleasant run-in with the cops.  And both times I got caught speeding, I was able to argue my way out of a ticket.

While I'd like to think that my history of clean living is because I have a respect for authority and the rule of law, some of it is due to the simple fact that I hate complications and conflict.  If I come up to a stop sign in broad daylight, and it's clear that no oncoming car on either side is within a quarter-mile of the intersection, I'd rather stop, look both ways, and then go rather than run the stop sign and risk having a third opportunity to explain my actions to a cop.

But my question of the day is: are there times when deliberately, knowingly breaking the law is the right thing to do?

I'm talking, of course, about civil disobedience.  And in my opinion, sometimes putting your own legal record, safety, or (perhaps) life at risk to make a higher point is not only the right thing, it comes close to a moral imperative.

A 2010 sit-in in Budapest protesting forced evictions of the poor [image courtesy of the Wikimedia Commons]

The whole idea of breaking the law to bring attention to a greater wrong has been much on my mind lately, for two entirely different causes, both of which will be immediately evident to regular readers of this blog.  The first one is the "opt-out" possibility for standardized testing, which is coming to a head in a lot of states, most recently New Mexico -- where state education officials are using combative language to make the point that exempting students from standardized tests is illegal, and districts that do not compel all children to sit for mandated exams risk losing their funding.  A number of districts are rebelling, some even providing pre-printed forms to parents to sign that exempt their children from the PARCC (Partnership for Assessment of Readiness for College and Careers) exams.

And none did it with such panache as the Las Cruces School District, where the forms were printed with the statements, "Federal and state laws require all students to participate in state accountability assessments," and "These laws do not offer an exemption or right of refusal to test."  One has to wonder how close they were to adding, "But this form allows parents to exempt their kids anyway," and "You can kiss the Las Cruces School District's rosy-red ass, policy wonks."

The other area in my life in which civil disobedience is making some demands is in our area's attempts to block the storage of LPG (liquefied petroleum gas) in unstable salt caverns beneath Seneca Lake.  Over 200 people, including my wife, have been arrested and charged with trespassing for blockading the gates of the facility, and I'm likely to be in the next round.  (Apparently they're not marching the protesters off in handcuffs, which I find kind of disappointing.  Such a missed opportunity for a photo-op.  But if someone can get a photograph of me being arrested, when it happens, I'll certainly find a way to post it here.)

Of course, what I'm talking about here is mild compared to the penalties you can incur elsewhere.  Protesting against repressive governments can get you jailed and/or tortured, being that that's what repressive governments do.  Deliberately breaking the law to make a point reaches its pinnacle of risk in places like Saudi Arabia, where last week a young man was sentenced to death by public beheading for tearing up a Qur'an, hitting it with a shoe, and uttering curses against the prophet Muhammad.  Apparently the man is an atheist -- or, as they call them in that part of the world, an "apostate" -- and he was demonstrating his contempt for religion in general, and Islam in particular, by his actions.

And Saudi law being what it is, in a few weeks he'll almost certainly find himself kneeling in the city square of his home town of Hafr al-Batin, and his head will be severed with a sword.

Which brings up the question of when a cause is important enough to risk your own life.  Or, to put it another way, when is something legal, and at the same time so ethically wrong, that putting yourself in harm's way is the right thing to do?

Not easy questions to answer.  Human morality being the shaky thing it sometimes is, it's easy to conceive of someone breaking the law for his/her own selfish ends, and then justifying it by calling it civil disobedience.  It's also true that one person's civil disobedience is another person's immorality -- as in the parents who are putting other children at risk of disease by their insistence on their right not to vaccinate their own kids.

These are difficult things to sort out.  The best choice is to do a lot of soul-searching before you embark on such a course of action, not only to be certain you understand the risk, but to make sure that you're not engaging in equivocation to rationalize away something that you really shouldn't have done in the first place.  As we discuss at length in my Critical Thinking classes, morality is a deeply personal thing, and unfortunately the words "moral," "ethical," and "legal" don't always line up the way we might hope.  I'll end with a quote from that exemplar of the willingness to put one's life at risk for a higher cause, Martin Luther King, Jr., who wrote, in Letter from the Birmingham Jail:
There comes a time when the cup of endurance runs over, and men are no longer willing to be plunged into the abyss of despair.  I hope, sirs, you can understand our legitimate and unavoidable impatience.  You express a great deal of anxiety over our willingness to break laws.  This is certainly a legitimate concern.  Since we so diligently urge people to obey the Supreme Court's decision of 1954 outlawing segregation in the public schools, at first glance it may seem rather paradoxical for us consciously to break laws.  One may well ask: "How can you advocate breaking some laws and obeying others?"  The answer lies in the fact that there are two types of laws: just and unjust.  I would be the first to advocate obeying just laws.  One has not only a legal but a moral responsibility to obey just laws.  Conversely, one has a moral responsibility to disobey unjust laws. I would agree with St. Augustine that "an unjust law is no law at all."

Thursday, February 26, 2015

The fire of the mind

In Umberto Eco's masterful medieval murder mystery The Name of the Rose, we meet a villain who is willing to kill, over and over, to stop his fellow monks...

... from reading a book.

The following is a spoiler, so you can skip the next few paragraphs (scroll down to where it says [end spoiler alert]) if you haven't read Eco's novel.  Which I hope you all will, because it's brilliant.  But the punchline makes a point that needs to be made now, seven centuries after the time in which the novel is set, as strongly as it did then.

Europe of the early 14th century was a grim place, and life was, in Thomas Hobbes's words, "solitary, poor, nasty, brutish, and short."  Religion had an iron grip over people's lives, and the learned men of the time taught that the fear of god was paramount.  This fear was translated downwards into fear of the "hierarchy of heaven," as represented here on Earth by the Pope, Cardinals, Bishops, and the monastic system.  And within that system, questioning and freedom of thought were considered heresy, punishable by death.

In this system we find two opposing characters: the brilliant and curious scholar Brother William of Baskerville, and the stern and unyielding Brother Jorge of Burgos.  William is called in to solve a series of murders that have occurred in an unnamed abbey in the mountains of Italy.  The murders revolve around the abbey's magnificent library, the secrets of which are only accessible to the librarian and his assistants.  And one by one, the monks connected with the library are being picked off by someone who is bound and determined to keep some of its knowledge out of the hands of the monks (or anyone else).

The knowledge in question turns out to be a book by Aristotle that was thought lost; the second volume of his Poetics, in which he describes the proper use of comedy, and argues that laughter is freeing, proper, and purifying for the soul.  When Brother William solves the mystery, and discovers that Brother Jorge is behind the murders and has hidden the book, he confronts the old man, and asks him why it was so important to keep such a seemingly innocent volume out of people's hands.  Brother Jorge responds:
(L)aughter is weakness, corruption, the foolishness of our flesh... (H)ere, the function of laughter is reversed, it is elevated to art, the doors of the world of the learned are opened to it, it becomes the object of philosophy, and of perfidious theology...  You saw yesterday how the simple can conceive and carry out the most lurid heresies, disavowing the laws of God and the laws of nature.  But the church can deal with the heresy of the simple, who condemn themselves on their own, destroyed by their own ignorance...  Laughter frees the villein from fear of the Devil, because in the feast of fools the Devil also appears poor and foolish, and therefore controllable.  But this book could teach that freeing oneself from the fear of the Devil is wisdom.  When he laughs, as the wine gurgles in his throat, the villein feels he is master, because he had overturned his position with respect to his lord; but this book could teach learned men the clever, and from that moment, illustrious artifices that could legitimize the reversal.  
To Brother Jorge, it is worth killing, and dying, for his desperate necessity to keep others from knowing the justification of laughter, mirth, and irreverence.  And in the end, he destroys the book and burns down the library to keep that knowledge from the world.

[end spoiler alert]

Which brings us to what has happened in the Middle East in the last few days.

The depredations of ISIS have been all over the news lately, but none have seemed more bizarre and pointless to the western world than two that have occurred recently.  The Islamic State's arm in Libya four days ago burned a pile of musical instruments, saying that such things are "un-Islamic."  Then, just two days ago, ISIS members in Iraq burned the hundred-year-old library in the city of Mosul, destroying 8,000 rare books that were a treasure-trove of cultural information and history.  "900 years ago, the books of the Arab philosopher Averroes were collected before his eyes...and burned," wrote activist and blogger Rayan al-Hadidi.  "One of his students started crying while witnessing the burning.  Averroes told him... the ideas have wings...but I cry today over our situation."

[image courtesy of photographer Alan Levine and the Wikimedia Commons]

Why, in a situation where ISIS members are fighting daily to maintain ground and to keep control of the people they've conquered, would they stop what they're doing to burn musical instruments and ancient manuscripts?  It seems pointless.  Wouldn't they have better things to do with their time and energy?

No.  What ISIS is doing has its own pervasive, evil logic.

It has to do with exactly the same thing that Venerable Jorge hated the idea of: reading, laughter, and music free people from fear.  If you are going to control people, you must control their thoughts.  The first thing you do, therefore, is to destroy any opportunity for them to experience something outside of that control.

Music lifts our emotions into heights that cannot be measured.  When we read, our spirits are free to think any thought, put ourselves in other people's minds, other places, other times.  Dancing does the same thing, which probably explains why Saudi Arabia's "morality police" arrested some young men four days ago for dancing at a birthday party.

Can't have people experiencing anything outside of the narrowly prescribed range of thoughts, feelings, and actions.  If people go outside that range, anything could happen.  And would.

And then, the sword-bearing horrors who are now running much of the Middle East would not be in control any more.  People would learn that there's more to life than fear and obedience, more than living in terror of a grim, humorless cadre of thugs who are so afraid themselves of intellectual and emotional freedom that they will stop at nothing to prevent it from spreading to others.

I live in hope that in our world of interconnectedness and free flow of information via the internet, such control cannot be maintained for long.  We have seen the difficulty the Saudis are having in keeping the holes in the dam from leaking; bloggers and activists who openly criticize the regime are growing in numbers.  Some, such as Raif Badawi, have paid a horrible price for exercising that freedom.

But the truth that ISIS doesn't want their victims to realize is that the spirit of free thought burns hotter than the flames of destruction.  Even if you set fire to books and musical instruments, you can't really control thoughts, even through threats and terror.  The human mind is stronger and more resilient than that.  So even though I weep for the treasures that were lost in the burning of the Mosul Library, I remain optimistic that the desperate and amoral men of ISIS will one day be trod underfoot and forgotten by all but historians, just as their brothers-in-spirit -- the Inquisition of the 14th century -- have been.

Wednesday, February 25, 2015

The myth of the moral high ground

I have a big sign on my classroom wall that says, "Don't believe everything you think."

It's an important rule-of-thumb to keep in mind.  Far too many people become completely convinced that whatever has popped into their brain must be the truth -- sometimes to the point that they don't question it.  Especially if the "truth" under consideration appeals to a conjecture that they've already fallen for.

It's our old friend confirmation bias again, isn't it?  But instead of using slim evidence to support the claim, here you don't need any evidence at all.  "That seems obvious" is sufficient.

Which brings me to two studies released in the past two weeks that blow a pair of neat holes into this assumption.

In the first, a study by IBM's consulting arm looked into whether it's true that millennials -- people who reached their majority after the year 2000 -- are actually the entitled, lazy twits that many think they are.  Because that's the general attitude by the rest of the adult world, isn't it?  The stereotype includes:
  • having been taught by an emphasis on "self-esteem" that there's no reason to push oneself, that "everyone should get a prize" just for showing up
  • being idealists who want to save the world without doing any actual work
  • being narcissistic to the point of unwillingness to work on a team
  • having a severe aversion to criticism, and an even stronger one to using criticism constructively
  • having no respect for authority
And the study has shown pretty conclusively that every one of these stereotypes is wrong.

Or, more accurately, they're no more right about millennials than they are about any other generation. According to an article on the study, reported in The Washington Post:
The survey... didn't find any support for the entitled, everybody-gets-a-trophy millennial mindset.  Reports of their doting parents calling bosses to complain about performance reviews may be out there, but, on the whole, IBM's survey shows a different picture.  Millennials list performance-based recognition and promotions as a priority at the same rate as baby boomers do, and they cite fairness, transparency and consistency as the top three attributes they want in a boss.  Someone who "recognizes my accomplishments," meanwhile, comes in at only sixth place... 
If there's any big takeaway about millennials from IBM's study, it's that they want pretty much the same thing most employees want: an ethical and fair boss, inspirational leadership and the opportunity to move ahead in their careers. Where there were differences, they tended to be relatively small.
And at the risk of sounding cocky -- because I'm as prone to this bias as anyone else is --  I have to say that I wasn't surprised by its findings.  I've worked with teenagers for 28 years, and despite the frequent "kids these days!" and "we never got away with that when I was in school!" grousing I hear from my colleagues, my general attitude has always been that kids are kids.  Despite the drastic differences in cultural context between today and when I started teaching, there have always been lazy kids and hard-working kids, motivated kids and unmotivated kids, entitled kids and ones who accepted responsibility for their own failings.  The stuff around us changes, but people?  They remain people, with all of their foibles, no matter what.

The second study hits near to the quick for me.  It revolves around a common perception of atheists as angry ranters who are mad at the whole world, and especially the religious segment of it.  I've been collared about this myself.  "Why can't you atheists be more tolerant?" I've been asked, more than once.  "You just don't seem to be able to live and let live."

But according to a paper soon to be released in The Journal of Psychology, the myth of the angry atheist is just that -- a myth.  The study's authors write:
Atheists are often portrayed in the media and elsewhere as angry individuals. Although atheists disagree with the pillar of many religions, namely the existence of a God, it may not necessarily be the case that they are angry individuals.  The prevalence and accuracy of angry-atheist perceptions were examined in 7 studies with 1,677 participants from multiple institutions and locations in the United States.  Studies 1–3 revealed that people believe atheists are angrier than believers, people in general, and other minority groups, both explicitly and implicitly.  Studies 4–7 then examined the accuracy of these beliefs.  Belief in God, state anger, and trait anger were assessed in multiple ways and contexts.  None of these studies supported the idea that atheists are particularly angry individuals.  Rather, these results support the idea that people believe atheists are angry individuals, but they do not appear to be angrier than other individuals in reality.
Of course, there's a logical basis to this stereotype; it's the militant ranters who get the most press.  And not only do the angry individuals get the greatest amount of publicity, but their most outrageous statements are the ones everyone hears about.  It's why, says Nicholas Hune-Brown, the public perception of Richard Dawkins is that he's the man who "seems determined to replace his legacy as a brilliant evolutionary biologist with one as 'guy who’s kind of a dick on Twitter.'"

Once again, we should focus on the outcome of the study -- that atheists are no more likely to be angry than members of other groups.  It isn't saying that there aren't angry atheists; it's saying that there are also angry Christians, Muslims, Jews, and so on.  The perception of atheists as more likely to be intolerant and ill-tempered is simply untrue.

[image courtesy of photographer/artist Emery Way and the Wikimedia Commons]

So back to my original point.  It behooves us all to keep in mind that what we assume to be true may, in fact, not be.  How many times do we all overgeneralize about people of other political parties, religions, genders, gender preferences, even appearance and modes of dress?  It's easy to fall into the trap of saying "All you people are alike," without realizing that what seems like an obvious statement of fact is actually bigotry.

It may be impossible to eradicate this kind of bias, but I'll exhort you to try, in your own mind, to move past it.  When you find yourself engaging in categorical thinking, stop in your tracks, and ask yourself where those beliefs came from, and whether they are justified.  And, most importantly, whether there is any hard evidence that what your brain is claiming is true.

And if the answer to either of the latter questions is "No," then take a moment to suspend your certainty.  Look at the people you'd been judging without needing to make a judgment.  Get off the moral high ground.  I think you'll find that empathy and tolerance make for a far better perspective from which to view the world.

Tuesday, February 24, 2015

The dark side

I love science, but sometimes scientists can be their own worst enemies.

The reason I say this is that scientists sometimes have a tendency to throw caution to the wind and engage in speculation, which then gets reported by the media as "scientific fact."  When said speculation turns out to be false, or is superseded by other models for which there is more evidence, laypeople get the wrong idea that scientists sit around all day making shit up, and when it turns out to be wrong, they just make more shit up, and on and on it goes.

So the media bear a large share of the blame for this, as usual.  But that said, it would be nice if there were some way for scientists to flag in their academic papers when they're engaging in tentative hypothesizing, and when they're elaborating on a well-established and rock-solid theoretical model.

Amongst the latter would be evolution and anthropogenic climate change.  Just had to throw that in there.

But as an example of the former, let's look at a recent paper by Michael Rampino, professor of biology at New York University, published in Monthly Notices of the Royal Astronomical Society, proposing that the periodic mass extinctions that have occurred on Earth might be caused by the interaction between the Solar System and a thin layer of dark matter along the galactic plane.  Rampino writes:
A cycle in the range of 26–30 Myr has been reported in mass extinctions, and terrestrial impact cratering may exhibit a similar cycle of 31 ± 5 Myr. These cycles have been attributed to the Sun's vertical oscillations through the Galactic disc, estimated to take from ∼30 to 42 Myr between Galactic plane crossings. Near the Galactic mid-plane, the Solar system's Oort Cloud comets could be perturbed by Galactic tidal forces, and possibly a thin dark matter (DM) disc, which might produce periodic comet showers and extinctions on the Earth. Passage of the Earth through especially dense clumps of DM, composed of Weakly Interacting Massive Particles (WIMPs) in the Galactic plane, could also lead to heating in the core of the planet through capture and subsequent annihilation of DM particles. This new source of periodic heating in the Earth's interior might explain a similar ∼30 Myr periodicity observed in terrestrial geologic activity, which may also be involved in extinctions. These results suggest that cycles of geological and biological evolution on the Earth may be partly controlled by the rhythms of Galactic dynamics.
The difficulty, of course, is that dark matter has yet to be detected directly, despite years of searching.  We can observe that there's something out there that, judging from its gravitational effects, seems to make up most of the universe's mass.  But what it's made of, and what its properties are, are completely unknown.  "WIMPs" -- the Weakly Interacting Massive Particles Rampino references in his paper -- are one candidate for the constituents of dark matter.  But they, too, have yet to be confirmed to exist, despite multiple experiments designed to detect them, including searches at the Large Hadron Collider.

So Rampino is proposing that a 31 ± 5 million year mass extinction cycle (5 million years representing about a 16% variability either way) links to a 30 to 42 million year galactic-plane-crossing cycle (which represents about a 17% variability either way) via a mechanism connected to a type of matter we've never seen and whose properties can only be guessed at.
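
To see just how loose that linkage is, here's a back-of-the-envelope sketch (my own arithmetic, not anything from Rampino's paper) of the slack in each cycle and where the two ranges even overlap:

```python
# Back-of-the-envelope check of the two cycles' slack and overlap.
# These numbers come straight from the quoted abstract; the arithmetic
# is my own illustration.

extinction_lo, extinction_hi = 31 - 5, 31 + 5      # 26 to 36 Myr
crossing_lo, crossing_hi = 30, 42                  # plane-crossing estimate, Myr

ext_var = 5 / 31                                   # fractional slack, ~16%
crossing_mid = (crossing_lo + crossing_hi) / 2     # 36 Myr midpoint
cross_var = (crossing_hi - crossing_mid) / crossing_mid  # ~17%

# The two cycles are only mutually consistent where the ranges overlap
overlap = (max(extinction_lo, crossing_lo), min(extinction_hi, crossing_hi))

print(f"extinction slack: {ext_var:.0%}, crossing slack: {cross_var:.0%}")
print(f"overlapping range: {overlap[0]} to {overlap[1]} Myr")
```

In other words, the claimed match only has to land somewhere in a 30-to-36-million-year window, with roughly a sixth of each cycle's length as wiggle room -- which is why "could," "might," and "may" are doing so much work in the abstract.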

Map of the "dark matter halo" surrounding a galaxy [image courtesy of the Wikimedia Commons]

Now, don't get me wrong.  Thinking outside the box is the way great discoveries are made.  For example, it was Einstein's decision to throw away the "problem of the constancy of the speed of light" that led to the discovery of the Theory of Relativity.  Einstein's contemporaries had spent decades trying furiously to explain away the fact that in a vacuum, light seemed to move at the same speed in all reference frames, something that couldn't happen according to classical mechanics.  All sorts of wild ideas were proposed -- for example, a universal "ether" that permeated the universe, and through which light moved -- and one by one they were knocked down.

Einstein, however, decided to take the "problem of the constancy of the speed of light" and turn it into the "law of the constancy of the speed of light," and see what mathematical predictions came out of that assumption.  And then run experiments to see if those predictions worked.  Lo: the Theory of Relativity, with its wild time dilation and Lorentz contraction weirdness.

All of which is a long-winded way of saying that there's nothing wrong with speculation.  I just wish there were some way for scientists to differentiate between proposing a speculative hypothesis and reporting on an experimentally-supported theory.

Maybe they should write speculative articles in "Comic Sans."  I dunno.

I say this because I'm seeing stories come up all over the place, just in the last couple of weeks, claiming that "dark matter killed the dinosaurs."  Which Rampino himself would admit is not justified at this time (note in the passage I quoted how many times he uses the words "could," "might," and "may").  And when someone else proposes a different mechanism to explain the periodicity of extinctions, it'll also get reported as fact, and laypeople will have further evidence that all scientists do is come up with wild tales all day long.

So I really should revise my initial statement.  It's not that scientists are their own worst enemies.  It's that popular media are the scientists' worst enemies.  That, and the fact that the public still doesn't really understand how science is done (look at the ongoing confusion about what the word "theory" means).

And given the fact that a significant proportion of the public still doesn't accept the findings of science that aren't speculative, the last thing we need is to sow more doubt in people's minds by misrepresenting the parts of science that are still only conjecture.

Monday, February 23, 2015

Backfires and improprieties

If there is one cognitive bias that makes me want to punch a wall, it's the backfire effect.

The backfire effect is a well-studied psychological tendency: when people with strong opinions on a subject are presented with a logical, rational, fact-supported argument against what they believe, they double down on their original opinion.  In other words: given evidence against what they believe, people will believe it even more strongly.

Take, for example, something I've discussed more than once in this blog: anthropogenic climate change.  The data and the jury are both in.  The world is, on average, warming up, and this is due to human activity, especially the burning of fossil fuels.  This warm-up has destabilized the climate, resulting in the worst drought California has seen in recorded history, two years of record high temperatures in Alaska, and two consecutive winters in which the northeastern and north central United States have been punished by a chunk of Arctic air, cut loose and sent southward by a meander in the jet stream, producing a series of snowstorms that buried Boston under eight feet of snow in the last four weeks.


[image courtesy of the Wikimedia Commons]

So there's no real doubt about any of this anymore.  But that doesn't mean there aren't doubters, because apparently those aren't the same thing.  And we got a nasty dose of the backfire effect, apropos of said doubters, just in the last couple of days, with the announcement that one of the most prominent climate change deniers, Dr. Wei-Hock (Willie) Soon, has received a huge percentage of his funding from the petroleum industry...

... without disclosing that information in his scientific papers.

In scientific circles, this is known rather euphemistically as an "impropriety."

And we're not talking about small amounts of money, either.  Soon received a total of $1.2 million from the fossil-fuels industry, including $409,000 from Atlanta-based Southern Company, which has invested heavily in coal-fired electrical plants -- and which sponsors an anti-climate-change lobby in Washington, D.C.  Then we have the $230,000 Soon got from none other than the Koch brothers.  Additionally, he has been heavily funded by Donors Trust, an Alexandria (Virginia)-based funds transfer outfit that takes anonymous donations and passes them on to (mostly conservative) causes.

Can you say "conflict of interest," children?

I knew you could.

Soon, who is employed by the Smithsonian Institution, is likely to find himself in completely merited hot water over this.  W. John Kress, interim undersecretary for science at the Smithsonian, said about Dr. Soon's actions, "I am aware of the situation with Willie Soon, and I’m very concerned about it.  We are checking into this ourselves."

So this seems like something that would be hard for the deniers to explain away.  For years they've argued that all you have to do is "follow the money" -- that the climatologists are biased to find evidence for climate change where there is none, because that's the way they get funding.  The knife should cut both ways, shouldn't it?

Apparently not.  The screaming denier-machine swung into action almost immediately, with a slimy little smear piece appearing in Breitbart.com that made it look like Dr. Soon was the victim.  Referring to the people who accept that human activity is causing the planet to warm up as the "lickspittle posse" -- a phrase that may win the award for throw-up-a-little-in-your-mouth metaphor of the year -- the author, James Delingpole, portrays Dr. Soon as a beleaguered champion of the truth, fighting against what amounts to the environmental mafia.  But after blathering on in this fashion for a while, he goes on to say something that's actually kind of interesting:
I spoke to Soon last night. He told me that of course he receives private funding for his research: he has to because it’s his only way of making ends meet, especially since the Alarmist establishment launched its vendetta against him when, from 2009 onwards, he became more outspoken in his critiques of global warming theory. 
Harvard-Smithsonian strove to make his life harder and harder, first by banning him from working on anything even remotely connected with issues like climate change or CO2, then by moving his office away from the astrophysics department to a remote area Soon calls Siberia.
Of course Dr. Soon is allowed to accept private funding.  What scientific ethical standards require, however, is that he disclose the source of those funds up front in his papers, which he has not done -- and now he's been caught.  But what's more interesting here is that Delingpole inadvertently points out one of the central problems with all of this.

Soon isn't a climatologist.  He's an astrophysicist.

Okay, it's vaguely connected, I suppose, because his claim all along has been that any warming is due to an increase in the radiant energy the Earth receives from the Sun.  But if you're trying to find the errors in the climate model, wouldn't you ask a climatologist to do it?

Oh, wait.  The climatologists are all in the pockets of Greenpeace.  Never mind that Soon is in the far, far deeper pockets of the Koch brothers.  That, apparently, is irrelevant.

But the overall effect is to make the deniers deny even harder, as the world warms further (cf. a Scientific American article reporting that not a single month since February 1985 has had an average temperature below the overall 20th-century average), and the climate continues to oscillate wildly, and we continue to do absolutely nothing about it.

Easier, I suppose, to accept the status quo than to change your habits.

Or your opinions.