Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Saturday, October 17, 2015

Dialing in belief

A recent study at UCLA has both atheists and the religious buzzing.

A paper called "Neuromodulation of Group Prejudice and Religious Belief" describes research at UCLA by Colin Holbrook, Keise Izuma, Choi Deblieck, Daniel M. T. Fessler, and Marco Iacoboni, and appeared in the journal Social Cognitive and Affective Neuroscience last week.  And what it seems to show is that down-regulating part of the brain can decrease both bigotry and religious belief.  Here's how Holbrook et al. describe their research:
People cleave to ideological convictions with greater intensity in the aftermath of threat.  The posterior medial frontal cortex (pMFC) plays a key role in both detecting discrepancies between desired and current conditions and adjusting subsequent behavior to resolve such conflicts.  Building on prior literature examining the role of the pMFC in shifts in relatively low-level decision processes, we demonstrate that the pMFC mediates adjustments in adherence to political and religious ideologies.  We presented participants with a reminder of death and a critique of their in-group ostensibly written by a member of an out-group, then experimentally decreased both avowed belief in God and out-group derogation by down-regulating pMFC activity via transcranial magnetic stimulation.  The results provide the first evidence that group prejudice and religious belief are susceptible to targeted neuromodulation, and point to a shared cognitive mechanism underlying concrete and abstract decision processes.  We discuss the implications of these findings for further research characterizing the cognitive and affective mechanisms at play.
My sense has always been that who we are -- our beliefs, personality, fears, desires -- is the result of the interplay between electrical and chemical processes in our brains.  Change those processes, and who we are changes; the idea that our selves are somehow static and independent, unchanging whatever happens to our physical bodies, is simply not borne out by the evidence.

[image courtesy of the Wikimedia Commons]

But this still strikes me as a weird result.  Measuring a complex phenomenon like the strength of a person's religious belief isn't going to be easy; we don't have a ReligioMeter that points to 99.8 when you attach it to Pope Francis and 0.2 when you attach it to Richard Dawkins.  Any measurement of the intensity of belief has to be determined by self-reporting, which can be influenced by any number of things -- up to and including the tone of voice in which the researcher asks the question.  Here's how Holbrook et al. did it:
[R]eligious belief was measured using a version of the Supernatural Belief Scale (Jong et al., 2013) modified to create two scales which mirror “positive” and “negative” aspects of Western religious belief, comparable to the “positive” and “negative” immigrant authors in the ethnocentrism measure.  The items were presented in random order and rated according to the same scale employed in the immigrant ratings.  The positive scale consisted of: (a) “There exists an all-powerful, all-knowing, loving God”; (b) “There exist good personal spiritual beings, whom we might call angels”; (c) “Some people will go to Heaven when they die”; (α = .90).  The negative scale consisted of: (a) “There exists an evil personal spiritual being, whom we might call the Devil”; (b) “There exist evil, personal spiritual beings, whom we might call demons”; (c) “Some people will go to Hell when they die” (α = .93).  An overall religiosity variable combining both scales was calculated by averaging all six items (α = .95).
Which seems like a pretty simplistic measure, if you're looking for a subtle result.  Add to this the fact that there were only 38 participants, and that the scale change for subjects treated with TMS showed a statistically significant reduction only in their positive religious beliefs, and you have to wonder what all the hype is about.  Might it be that TMS is simply affecting subjects' emotional state?
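For context, the α values quoted in that passage are Cronbach's alpha, a standard internal-consistency statistic for multi-item scales, and the overall religiosity score is simply the mean of the six ratings.  Here's a minimal sketch of how such a composite is scored -- the ratings below are invented for illustration, not the study's actual data:

```python
def cronbach_alpha(ratings):
    """Internal consistency of a multi-item scale.

    ratings: one list per participant, one number per scale item.
    Returns alpha = (k/(k-1)) * (1 - sum of item variances / variance of totals).
    """
    k = len(ratings[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = sum(var([row[i] for row in ratings]) for i in range(k))
    total_var = var([sum(row) for row in ratings])
    return (k / (k - 1)) * (1 - item_vars / total_var)


def composite_score(row):
    """Overall score for one participant: the mean of all item ratings."""
    return sum(row) / len(row)


# Invented 1-7 ratings from four participants on three scale items:
positive = [[7, 6, 7], [2, 1, 2], [5, 5, 4], [6, 7, 6]]
print(round(cronbach_alpha(positive), 2))
print([round(composite_score(r), 2) for r in positive])
```

A high alpha just means the items rise and fall together across participants -- it says nothing, of course, about whether six yes/no-flavored statements capture anything subtle about belief, which is exactly the worry here.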

Now, I'm not saying it isn't an interesting result.  Certainly the effect on prejudice (which was greater) is fascinating in and of itself.  But both religious and atheist media are giving the impression that "if you turn off part of the brain, you lose your religious convictions" -- each crowing about it for different reasons, and neither, apparently, having read anything more than the abstract of the paper itself.

If there's one thing that becomes clear when reading psychological research, it's that none of it is simple.  We're only at the very beginning of understanding how the brain works.  That there exists a neurological underpinning to religiosity seems very likely -- just as there's almost certainly a neurological underpinning to believing in conspiracy theories.  It's just that we don't know what it is yet.

And the idea that we can now turn such beliefs on or off with a switch is entirely premature.

Friday, October 16, 2015

Stellar anomaly

Given that my interests are pretty well known to my friends and family, whenever anything interesting happens on the Bigfoot, ghosts, or aliens scene, I'm sure to be sent the relevant links more than once.

This time it's aliens, with the discovery of an anomalous light-dimming pattern in a star with the euphonious and easy-to-remember name of KIC 8462852.   You probably know that light-dimming is one of the main ways that astrophysicists locate exoplanets -- if a telescope on Earth detects a periodic dimming of the light from a star, it is likely to mean that a planet is in transit across it, temporarily blocking its light.  From the period and the extent of the dimming, inferences can be made about the size of the planet and its distance from its home sun.

But this time scientists have found something odd, because whatever is causing the dimming of KIC 8462852 is not acting in a regular or predictable fashion.  And whatever it is seems to be large.  Even a Jupiter-sized planet blocks only about 1% of a star's light.  This star is undergoing irregular diminutions of its light... of up to 22%!
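The arithmetic behind those figures is straightforward: to first order, the fractional dimming during a transit is the ratio of the occulting object's disk area to the star's.  A quick back-of-the-envelope sketch (assuming a Sun-sized star and ignoring limb darkening):

```python
import math

R_SUN = 696_340.0   # solar radius, km
R_JUP = 69_911.0    # Jupiter radius, km

def transit_depth(r_object_km, r_star_km=R_SUN):
    """Fraction of starlight blocked: ratio of disk areas, (R_obj / R_star)^2."""
    return (r_object_km / r_star_km) ** 2

# Jupiter transiting a Sun-like star blocks about 1% of the light:
print(f"Jupiter transit depth: {transit_depth(R_JUP):.1%}")

# To block 22%, the occulting object would need a radius of
# sqrt(0.22), or roughly 47%, of the star's radius -- far too big
# for any planet:
r_needed = math.sqrt(0.22) * R_SUN
print(f"Radius needed for a 22% dip: {r_needed / R_SUN:.0%} of the stellar radius")
```

Which is exactly why "planet" drops off the list of candidate explanations so quickly: nothing planet-sized comes close to producing a 22% dip.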

[graph of light intensity over time, after Boyajian et al.]

The most mysterious thing about the phenomenon is its lack of periodicity.  At the moment, scientists simply don't know what this means.  And idle speculation, without a good model for what's going on, is not usually fruitful in science, so the astronomers and astrophysicists are being circumspect.  Here's what astronomer Phil Plait had to say:
The authors of the paper went to some trouble to eliminate obvious causes.  It’s not something in the telescope or the processing; the dips are real.  It’s not due to starspots (like sunspots, but on another star).  My first thought was some sort of planetary collision, like the impact that created the Moon out of the Earth billions of years ago; that would create a lot of debris and dust clouds.  These chunks and clouds orbiting the star would then cause a series of transits that could reproduce what’s seen.
Plait admits the downside of this idea, which is that dust and debris should emit infrared light as they're warmed by the star they surround, and we're not seeing that.  Others have suggested clouds of comets...  or an alien megastructure.

Seriously.  Years ago Freeman Dyson proposed that a sufficiently advanced civilization could disassemble planets to build a huge sphere around its star, thus capturing (and utilizing) virtually all of the star's emitted energy.  (Dyson spheres show up all the time in science fiction, most famously in the Star Trek: The Next Generation episode "Relics," and in Larry Niven's book Ringworld.)  A partially-constructed Dyson sphere, or one that had been damaged, might be expected to have the irregular light-dimming profile we're seeing in KIC 8462852.


But even the people who work at SETI (Search for Extraterrestrial Intelligence) are being cautious.  There are other possible explanations that have to be ruled out before we can say with any kind of confidence that we're looking at something other than a purely natural phenomenon.  Recall that the discovery of pulsars back in 1967 by Jocelyn Bell Burnell was at first thought to be evidence of an alien signaling device -- in fact, the first pulsars to be detected were nicknamed LGMs (Little Green Men).  Fairly quickly, of course, it was found that there was a completely natural explanation for the observation.

As Jason Wright, astronomer at Pennsylvania State University, put it, "We have to keep in mind Cochran's Command to Planet Hunters: Thou shalt not embarrass thyself and thy colleagues by claiming false planets."

But SETI, quite rightly, is already requesting radio telescope time to study the phenomenon.  If this is evidence of an intelligent alien civilization, there should be a way to support this with additional evidence.  Until then, it's premature to state with confidence that this is anything other than an unexplained stellar anomaly.

It hardly needs to be added that I would be beside myself if it turns out we are looking at extraterrestrial intelligence.  Finding evidence that we are not alone has been one of my dearest desires since I was a child, probably explaining why I have various posters in my classroom featuring aliens, including Fox Mulder's famous UFO poster from The X Files with the legend, "I Want to Believe."  But I, like Plait and Wright and Tabetha Boyajian, the astronomer who discovered the anomaly, want to move forward cautiously here.  There is a long list of weird observations that have at first been touted as evidence of aliens and other fringe-y claims, and have not borne up under additional study.  The best I can say at the moment is that this one looks hopeful -- and certainly deserving of intense further investigation.

Thursday, October 15, 2015

The science of beauty

I got a curious response to my post yesterday about finding out that my previously-held explanation for why people become conspiracy theorists was probably wrong.

Here's the email:
Dear Mr. Skepto, 
You sound pretty worried that you don't have an explanation for everything.  People aren't always explainable!  They do things because they do them.  That's it.  Some people believe weird stuff and some people like the explanations from science.  Just like some people like the Beatles and some people like Beethoven.  It's silly to wear yourself out trying to figure why. 
Do you worry about why your loved ones love you?  Maybe it's some chemical thing in their brain, right?  Do you tell your wife that's what love means?  Maybe it's a gene or something that's why I think flowers are pretty.  If so, the explanation is uglier than the flowers are.  I'd rather look at the flowers. 
All your scientific explanations do is turn all the good things in life into a chemistry class.  I think they're worth more than calling them brain chemicals.  I'll take religion over science any day.  At least it leaves us with our souls. 
Think about it. 
L. D.
Well, L. D., thanks for the response.  I find your views interesting -- mostly because they're just about as opposite to the way I see the world as they could be.

But you probably already knew that.

There is a reason why musical tastes exist.  We're nowhere near the point in brain research where we could discern the explanation; but an explanation does exist for why Shostakovich's Waltz #2 gives me goosebumps, while Chopin's waltzes do nothing for me whatsoever.  Nothing just "is because it is."

And I can't fathom how knowing the explanation devalues your appreciation of the thing itself.  Me, I would love to know what's happening in my brain when I hear a piece of music I enjoy.  We're beginning to get some perspective on this, starting with a 2011 study that found that the neurological response to hearing a piece of music we love is similar to the brain's response to sex.

Cool, yes?  I think that's awesome.  How would knowing that make me appreciate music less?

Or sex either?

I find flowers even more beautiful knowing that their shapes and colors evolved to attract pollinators, and understanding a bit about the chemistry of photosynthesis.

[image courtesy of the Wikimedia Commons]

Understanding light refraction doesn't make me shrug my shoulders at a rainbow.  And even love -- which L. D. evidently thinks lies entirely in the mystical realm -- is made no less by my knowledge that its underpinning has to do with brain chemistry.  It's like that old song with the verse:
Tell me why the stars do shine
And tell me why the ivy twines
And tell me why the sky is blue,
And I will say why I love you.
A more scientific type added a verse, to wit:
Nuclear fusion is why the stars do shine.
Thigmotropism is why the ivy twines.
Rayleigh scattering is why the sky's so blue,
And testicular hormones are why I love you.
Which I think is a good deal more realistic than attributing it all to souls and people "doing things because they do them."

In short: science itself is beautiful.  Understanding how the world works should do nothing but increase our sense of wonder.  If scientific inquiry isn't accompanied by a sense of "Wow, this is amazing!", you're doing it wrong.  I'll end with a quote from Nobel Prize winning physicist Richard Feynman, who in his 1988 book What Do You Care What Other People Think? had the following to say:
I have a friend who's an artist, and he sometimes takes a view which I don't agree with.  He'll hold up a flower and say, "Look how beautiful it is," and I'll agree.  But then he'll say, "I, as an artist, can see how beautiful a flower is.  But you, as a scientist, take it all apart and it becomes dull."  I think he's kind of nutty. …  There are all kinds of interesting questions that come from a knowledge of science, which only adds to the excitement and mystery and awe of a flower.  It only adds.  I don't understand how it subtracts.

Wednesday, October 14, 2015

Back to the drawing board

A while back, I was interviewed by Robert Chazz Chute on his online show The Cool People Podcasts, and I was asked an interesting question.

A number of interesting questions, actually, but one specific one stands out.  Chute asked me if my dedication to skepticism and evidence-based argument had ever shown me to be wrong about something I previously believed to be true.

I said, "Sure," but when pressed, the only examples I could think of were fairly low-key, such as when I found out that low-level laser light can stimulate wound healing, something that initially sounded like woo to me.

But that's not really the same as having a prior belief overturned.  So I came up empty-handed, which was a little awkward, although I did maintain that if even my deepest-held beliefs were shown to be false by hard evidence, I would have no choice but to revise my worldview.

I had a more interesting opportunity to walk the talk yesterday, when I came across a new scholarly study of conspiracy theorists,  a topic near and dear to my heart.  I have claimed more than once that I thought that the heart of conspiracy theories was a desire to find meaning in chaos -- that any pattern, even a horrible one, was better than there being no pattern at all.

Which conclusion turns out to be completely unsupported, according to Sebastian Dieguez, Pascal Wagner-Egger, and Nicole Gauvrit of the Department of Psychology at the University of Fribourg.  Their paper, entitled "Nothing Happens by Accident, or Does It?  A Low Prior for Randomness Does Not Explain Belief in Conspiracy Theories," found no correlation between belief in conspiracies and a belief that things can't happen at random.

Here's how Dieguez et al. explain their findings:
Belief in conspiracy theories has often been associated with a biased perception of randomness, akin to a nothing-happens-by-accident heuristic.  Indeed, a low prior for randomness (i.e., believing that randomness is a priori unlikely) could plausibly explain the tendency to believe that a planned deception lies behind many events, as well as the tendency to perceive meaningful information in scattered and irrelevant details; both of these tendencies are traits diagnostic of conspiracist ideation. In three studies, we investigated this hypothesis and failed to find the predicted association between low prior for randomness and conspiracist ideation, even when randomness was explicitly opposed to malevolent human intervention.  Conspiracy believers’ and nonbelievers’ perceptions of randomness were not only indistinguishable from each other but also accurate compared with the normative view arising from the algorithmic information framework.  Thus, the motto “nothing happens by accident,” taken at face value, does not explain belief in conspiracy theories.
I was pretty surprised by this, largely because I was so certain that I was on to the root cause of conspiracy theories.  But apparently, the Truth-Is-Out-There Cadre are no more likely to see meaning in noise than the rest of us.

So what, then, does unite the True Believers?  Because they have some pretty wacko ideas, and those have to come from somewhere, you know?  Just in the last few days, we have had:
This last one generated the greatest number of wackos coming out of the woodwork, and resulted in comments like the following:
What people fail to take into account is the molecular destabilization and rapid metamorphosis that occurred when the the pyramids power source failed.  This likely occurred in the time of the flood of Noah.  So the Noah’s pickup truck hypothesis is not that far fetched as it would seam [sic].

Look this up: Limestone, Concrete and Granite are the same material only in different metamorphic states.  I think when the pyramids went haywire things got very molecularly unstable for a period of time.  This theory explains all the crazy imprints in what should have been solid rock found all around the world. Particularly in granite.

In fact perhaps that is what actually weakened the crust enough to open “the fountains of the deep” (reference: hydroplate theory). 
For anyone who has been slacking the pyramids where [sic] something like a Tesla coil energy system and at one time likely housed a power source known as the Tetragrammaton.
So yeah.  What would make someone believe that, if not a desire to make sense of a world that is mostly composed of chaos?  I mean, that's honestly why science appeals to me; it brings at least some order to the randomness, gives us deep explanations of the perplexing, and provides a heuristic for winnowing fact from fiction.

And science has a pretty good track record for being right.  Unlike crazy talk about Tesla coils powering pyramids to cause "molecular destabilization."

[image courtesy of the Wikimedia Commons]

But if Dieguez et al.'s research bears up under scrutiny, the appeal of conspiracy theories must lie elsewhere.  Are they generated from fear, from the same primitive drive that makes us imagine monsters when we hear noises at night?  Is it a misfire of our application of the scientific method, where we try to apply the rules, but make mistakes in judging evidence or constructing arguments, and come to the wrong conclusions?

Or is it something else entirely?

That's another thing about science; you can't engage in scientific thought without being willing to say the dreadful words, "I don't know."  Once your hypothesis is shown to be unsupported, it's back to the drawing board you go.  But there's nothing so very bad about that, honestly.  As Neil deGrasse Tyson said, "Scientists are always at the drawing board.  If you're not at the drawing board, you're not doing science.  You're doing something else."

Tuesday, October 13, 2015

Woof

I was discussing the alleged phenomenon of hauntings with one of my students, and he said, "There's one thing I don't understand.  Some people believe that the souls of humans can survive after death, and become ghosts.  If humans can become ghosts, why can't other animals?"

Well, after pointing out the obvious problem that I'm not really the right person to state with authority what a soul, human or otherwise, could or could not do, I mentioned that there are many cases of supposed hauntings by animals.  The most famous of these is the haunting of Ballechin House in Scotland.

Ballechin House shortly before its demolition [image courtesy of the Wikimedia Commons]

Ballechin House was a beautiful manor house, built in 1806 near Grandtully, Perthshire, Scotland, on a site that had been owned by the Stuart (or Stewart or Steuart or Steward, they seemed to spell it a new way every time the mood took them) family since the 15th century.  The story goes that a scion of this family (sources seem to point to his being the son of the man who had the house built), one Major Robert Steuart, was a bit of a wacko who had more affection for his dogs than he did for his family.  That said, he provided quarters for his sister Isabella, who was a nun -- I'm not sure why she wasn't living with her fellow sisters in a convent, but some claim that it was because she'd had an illegitimate child and gotten herself, um... de-habited?  Anyhow, she lived with them for a time, finally dying and being buried on the property.  As for Major Steuart, he apparently took enough time away from his dogs to marry and have at least one child, John.

As the Major got older, he got more and more peculiar, and finally started claiming that after he died he was going to be reincarnated as a dog.  One runs into these ideas pretty frequently today, but back then, it must have been a sore shock to his nearest and dearest.  So this partly explains why when the Major did go to that Big Kennel In The Sky, his son John rounded up all of the Major's dogs and shot them.

I say "partly" because I fail to understand how, even if you believed that the Major was going to be reincarnated as a dog, killing dogs that were currently alive and therefore presumably none of whom were actually the Major would help.  But that's what he did.

And boy was he sorry.

Almost immediately thereafter, John Steuart and his family and servants began to experience spooky stuff.  They heard doggy noises -- panting, wagging of tails, sniffing, and the really nasty slurping sounds dogs make when they are conducting intimate personal hygiene.  (Okay, I'm assuming that they heard that last sound.  I certainly hear it enough from my own dogs.)  Steuart's wife several times felt herself being pushed by a wet doggy nose, and reported being in a room and suddenly being overpowered by a strong doggy smell.

Other apparitions began -- the sighting of a ghostly nun, all dressed in gray, in the garden; doors that would open and close by themselves; and the sound of limping footsteps (the Major apparently walked with a limp).  Steuart himself was not long to worry about them, because he was killed in an accident, supposedly the day after hearing a knocking sound on the wall.  (Maybe it was a coded message from the Major that meant, "The dogs and I can't wait to see you!")

[image courtesy of the Wikimedia Commons]

In the 1890s the hauntings were investigated on the urging of a certain Lord Bute -- I can't figure out whether by that time Bute was the owner of the house, or just a busybody.  Thirty-five psychics descended upon the house, which created such a cosmic convergence of woo-wooness that you just know something was gonna happen.  And it did.  A Ouija board spelled out "Ishbel" (recall that Major Steuart's sister who was a sister was named Isabella, and recall also that this entire family seemed to have difficulty with spelling their own names).  The psychics experienced various doggy phenomena; one of the psychics, who had brought her own dog along, reported that one evening her dog began to whimper, and she looked over, and there were two disembodied dog paws resting on the bedside table.  (I'd whimper, too.)

In the interest of honesty, it must be recorded that the house was let several times during this period, once to a Colonel Taylor who belonged to the Society for Psychical Research, which is known for its skeptical and scientific approach toward claims of the paranormal.  And Taylor's diary, sorry to say, records that he slept in the Major's bedroom on more than one occasion and experienced nothing out of the ordinary.

Be that as it may, Ballechin House acquired the reputation of being "the most haunted house in Scotland," and by the 1920s became impossible to rent.  It fell into increasing disrepair, and finally was torn down in 1963.  I think this is a little sad -- I'd have loved to visit it.  I might even have brought my dogs.  My hound Lena is highly alert, even if she has the IQ of a loaf of bread, and would certainly let us know if there were any other dogs present.  I see no reason why it would matter that the canine residents of the house were a bunch of dogs who, technically, were dead.  The "doggy smell" would be adequate motivation for her to bark her fool head off, as would the whole leaving-your-front-paws-on-the-nightstand thing.

So, the believers in Survival seem, for the most part, to believe that dogs have an eternal soul.  However, this opens up a troubling question.  Why stop there?  If dogs have an eternal soul, do cats?  (My own cat seems to be more of a case of demonic possession, frankly.)  How about bunnies?  Or weasels?  Or worms?  Or Japanese beetles?  (I'd be willing to believe that if there are gardens in hell, there'll be Japanese beetles there to eat the roses.)  I find this a worrisome slippery slope.  It may be a cheering thought that something of Woofy's nature will survive his demise, even if he terrorizes the guests by sticking his spectral wet nose into said guests' private regions, but I'm not sure I want to be stung by ghostly yellowjackets, or have to spray my plants for ghostly aphids.  The real kind are enough of a problem.

Monday, October 12, 2015

Remembrance of things past

In the movie Memento, the main character, a fellow named Leonard Shelby, has anterograde amnesia, a brain disorder that prevents the formation of new long-term memories.  He forms his knowledge of the world from labeled Polaroid snapshots of people, and (for really important things) information that he has tattooed on his own skin.  The problem is, because he has no ability to reference memories of those people and events, he doesn't know if what he has written on the photographs and on himself is true -- if he was wrong, or being lied to, when he wrote the information down.

It's a fantastic, but highly unsettling, film.  Our worlds are made of a skein of remembered events, and without that network of referents, we are completely adrift.

The problem is that even for those of us who do not suffer from anterograde amnesia, what we remember is far less reliable than we think it is.  Consider that even people who admit "I have a terrible memory" are still quite convinced that what they do remember of the past is accurate.

And they should not be.  None of us should be.

[image courtesy of the Wikimedia Commons]

The whole topic comes up because of a ridiculous claim sent to me by a long-time loyal reader of Skeptophilia.  In it, a rather hysterical sounding guy tells us that the scientists at CERN have gone back in time and altered the past so as to change the title of Interview with a Vampire to Interview with the Vampire.   As evidence, he shows us that the top search words beginning with "interview with" are "interview with a vampire."  Q.e.d., apparently.

As for why CERN physicists would bother to do such a thing, the narrator says, "Maybe it's just a test.  I dunno."  But whatever the reason, his two-minute video sure struck a chord with some people. Amongst the comments we find:
Its [sic] interview with a vampire!!!!!  i remember it distinctly and clearly..... when i checked google after watching your video i got goose bumps, felt strange..... if you search for cached files on google its listed as 'A' vampire on an old Amazon link, and there are some old videos posted with the title 'Interview with a vampire 2'..... in the words of Tom Baker "somethings [sic] going on contrary to the laws of time!, i must find out what!!"
and
this is totally fucked up whats [sic] going on I am really scared...  what really really scares me is how some of us remember it the other way cause that means some of use [sic] are being manipulated and they changed our thinking some how and we don't know... but the caches support our old claims cause if it was always the new alternate reality the old searches for the proper names would not be cached in the search engines... 
The possibility of simply misremembering is apparently much more far-fetched to these people.

And the problem is that all of us misremember.  A lot.  Lawrence Patihis, a psychological researcher at the University of California-Irvine, found that even people with "Highly Superior Autobiographical Memory" -- the rare individuals who can tell you what they wore, did, and ate for breakfast on October 23, 2004 -- still get it wrong some of the time.   For their ordinary memories (at least the ones that can be cross-checked against hard evidence), they have a three-percent failure rate; but when they are presented with information that deliberately screws around with their recollections, they end up with false/implanted memories 20% of the time.  Here's how science writer Erika Hayasaki describes the experiment:
Twenty people with such memory were shown slideshows featuring a man stealing a wallet from a woman while pretending to help her, and then a man breaking into a car with a credit card and stealing $1 bills and necklaces.  Later, they read two narratives about those slideshows containing misinformation.  When later asked about the events, the superior memory subjects indicated the erroneous facts as truth at about the same rate as people with normal memory. 
In another test, subjects were told there was news footage of the plane crash of United 93 in Pennsylvania on September 11, 2001, even though no actual footage exists.  When asked whether they remembered having seen the footage before, 20 percent of subjects with Highly Superior Autobiographical Memory indicated they had, compared to 29 percent of people with regular memory. 
“Even though this study is about people with superior memory, this study should really make people stop and think about their own memory,” Patihis said.  “Gone are the days when people thought that [only] maybe 20, 30 or 40 percent of people are vulnerable to memory distortions.”
The bottom line is that even the best of us have unreliable memories, which puts the rest of us slobs straight into "You're probably remembering almost everything incompletely and incorrectly" territory.  It's a frightening conclusion; what we remember seems so solid, so incontrovertible, that it's hard to imagine that what is in our memory centers is an amalgam of actual fact, stuff we were told by others, and stuff that we spun straight from whole cloth.  And without anything to compare it to, we're not much better off than Leonard Shelby with his photographs and tattoos.

So there's no need to accuse the scientists at CERN of going back in time to change one word in a book title.  First off, they have many better things to do, such as creating black holes to destroy the Earth and developing targeted weather death rays to send hurricanes to places that strangely enough always get hurricanes anyway.  It's not that we don't know that people forget things or get things wrong... it's just that we get uncomfortable when we realize that our memories are far worse than we like to admit.

Saturday, October 10, 2015

The price of micromanagement

It's pretty clear, from a number of different studies in various contexts, that micromanagement doesn't work.

Micromanaged employees have lower productivity, lower job satisfaction, and less willingness to work in teams than employees given more freedom.  A 2011 study at Concordia University's School of Business by researchers Marylène Gagné and Devasheesh Bhave made it clear that when workers have more autonomy, they are happier and work harder.

"Autonomy is especially likely to lead to better productivity when the work is complex or requires more creativity," Gagné said.  "In a very routine job, autonomy doesn’t have much impact on productivity, but it can still increase satisfaction, which leads to other positive outcomes.  When management makes decisions about how to organize work, they should always think about the effect on people’s autonomy."


Some companies have taken this to heart.  Consider the software company Atlassian, which several years ago initiated a new policy called "ShipIt" Days -- once a month, employees are given a day at work of total autonomy.  You can do whatever you want, as long as you're willing to share what you did with your coworkers and employers.  Here's how Atlassian's website describes "ShipIt" Days:
Atlassian’s ShipIt Days have influenced companies from Ennova to Yahoo! to encourage employees to step out of their day-to-day mindset, think creatively about anything that relates to their business and then deliver a solution.  ShipIt Days are inspired by the idea to “ship in a day.”  Participants are given 24 hours to develop a working prototype that “scratches an itch,” innovates around an area related to their personal or team operations, or demonstrates something awesome and inspiring.  The competition is fueled with Atlassian-sponsored pizza and beer and concludes with an edge-of-seat “show-and-tell” where employees vote for a winner.  Along with company-wide recognition and personal bragging rights, the winner takes home a trophy and limited-edition t-shirt. 
“With Atlassian ShipIt Day, employee creativity and team spirit soar,” said Carol Ganz, director of business development for Six Feet Up, a developer of open source web applications.  “ShipIt Days are like playtime which is ironic because this is probably the 24 hours when we work the hardest.”
Now, compare this to the teaching profession in the United States, where teachers are micromanaged from the first bell to the last.  Under the guise of "accountability" and "improving standards," teachers are given scripted "modules" to teach from.  We are not only prohibited from designing our own assessments -- something we were trained to do -- but also prohibited from grading our students' final exams, because of fears that we'll cheat.  There is a drive -- not yet implemented, but likely to be in the next couple of years -- that will prevent principals from observing and evaluating teachers in their own buildings, because apparently the upper administration doesn't even trust the principals.  Teacher observations would have to be done by administrators from other schools, thus tying up more of principals' time, and assuring that the people writing evaluations know little about the teachers or students they're watching.

It's like educational administrators read the Concordia study upside down and backwards, or something, and concluded that productivity rises when Big Brother Is Watching You.

It's something I found out years ago as a teacher; when I tried one-second-to-the-next micromanagement of the students in my classroom, it always backfired.  The best approach has always been freedom within structure.  In my Critical Thinking classes, the final project is a personal essay describing how your ideas and attitudes have changed over the course of the semester.  I tell students that the instructions are "follow your nose."  If something we studied in the class really made an impact, tell me about it.  Take your thoughts and go deep.  I want a coherent essay that is, in essence, a critical analysis of your own brain, not a list of answers to questions I've created.

"What's the grading rubric?" I often am asked.

My answer:  "I don't have one."

Some students get pretty panicky when they hear that.  We've trained them for years that life is a giant fill-in-the-blank test, and you'd better fill in the blanks with the right answers.  But nearly all of them rise to the challenge of this project -- some telling me afterwards that it was one of the hardest essays they ever wrote -- and I've seen phenomenal results, despite naysayers telling me when I first started doing this, "This is just giving kids license to bullshit."

I've shown some of the doubters examples of the papers that come out of this project.  They're not naysayers any more.

There are places in the world where teachers and students are given more autonomy.  Schools in Finland, for example, are often lauded as being exemplars of excellence.  Pasi Sahlberg, visiting fellow at the Harvard University Graduate School of Education, suggests why this is:
Teachers [in Finland] have time in school to do other things than teach.  And people trust each other.  A common takeaway was that Finnish teachers seem to have much more professional autonomy than teachers in the United States to help students to learn and feel well. 
We do know that teachers’ workplaces provide very different conditions for teaching in different countries. 
First, teachers in the US work longer hours (45 hours/week) than their peers in Finland (32 hours/week).  They also teach more weekly, 27 hours compared to 21 hours in Finland.   This means that American teachers, on average, have much less time to do anything beyond their teaching duties (whether alone or with colleagues) than teachers in most other OECD countries. 
In Finland, teachers often say that they are professionals akin to doctors, architects and lawyers.  This means, they explain, that teachers are expected to perform in their workplaces like pros: use professional judgment, creativity and autonomy individually and together with other teachers to find the best ways to help their students to learn.
In the absence of common teaching standards, Finnish teachers design their own school curricula steered by [a] flexible national framework.  Most importantly, while visiting schools, I have heard Finnish teachers say that due to [the] absence of high-stakes standardized tests, they can teach and assess their students in schools as they think is most appropriate. 
The keyword between teachers and authorities in Finland is trust.  Indeed, professional autonomy requires trust, and trust makes teacher autonomy alive.
Did you catch the gist of all of that?  Finnish teachers teach fewer hours per week, are given more freedom, don't give standardized tests... and they get better results.

I've often wondered why American educational leaders are so far behind the curve when it comes to taking psychological and management research and incorporating it into the structure of schools.  We've known about these results for years; but due to fears of failing schools and anxiety over what would happen if the leash were loosened, we've seen an increasingly oppressive top-down management style that is known to decrease productivity and job satisfaction.

I've said it before: I've taught for 29 years, and by and large have loved my job.  But if I were a college student today, no way in hell would I become a teacher.

It's to be hoped that we'll wise up one day.  But the delay comes at a cost; there are students now, in classes today, who are being harmed by our attitudes toward teachers and schools.  Maybe it's time we take a page from Atlassian's playbook.  Let go of the reins a little, and see what happens.  It could hardly be worse than our current module-driven, exam-laden, joyless world of the Common Core.