Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Friday, August 4, 2017

Refusing to play by the rules

As a fiction writer, I'm frequently asked where I get the ideas for my stories.  I sometimes respond, "Being dropped on your head as an infant will do that to you," but the truth is, I have no idea.  A few of them have a clear moment of origin (such as my novel Gears, the plot for which first came to me when I read a paper on the Antikythera Mechanism).

For most of them, however, the genesis is not so clear.  I've had stories that came from a single powerful image that begs explanation, such as my short story "The Hourglass," which resulted from a vivid mental image of two young men, ostensibly strangers to each other, having a peculiar conversation over pints of Guinness at a dimly-lit bar.  I then had to figure out what they were talking about, and why... and what it all meant.

A lot of my ideas pop into my head at unexpected moments, when my mind and/or body is otherwise occupied.  I've had plot lines (or solutions to plot problems) suddenly appear while showering, while on a run, while mowing the lawn, while trying to get to sleep (the last is especially annoying, because it necessitates my getting up and writing it down, lest I forget what I'd come up with).

In any case, most of the time, the origins of my own creative expression are as mystifying to me as they are to my readers.  So the most honest answer to the question "Where do you get your ideas?" is "I simply don't know."  But a recent bit of research has elucidated at least a piece of the origin of creativity.

Apparently, you become more creative when your rational thought processes are suppressed.

[image courtesy of the Wikimedia Commons]

The study, by Caroline Di Bernardi Luft, Ioanna Zioga, Michael J. Banissy, and Joydeep Bhattacharya of the University of London, which appeared last month in Nature's open-access journal Scientific Reports, is entitled "Relaxed Learning Constraints Through Cathodal tDCS on the Left Dorsolateral Prefrontal Cortex," and at first probably sounds like something that would only be of interest to serious neuroscience geeks.  Here's how the authors describe their own work:
We solve problems by applying previously learned rules.  The dorsolateral prefrontal cortex (DLPFC) plays a pivotal role in automating this process of rule induction.  Despite its usual efficiency, this process fails when we encounter new problems in which past experience leads to a mental rut.  Learned rules could therefore act as constraints which need to be removed in order to change the problem representation for producing the solution.  We investigated the possibility of suppressing the DLPFC by transcranial direct current stimulation (tDCS) to facilitate such representational change.  Participants solved matchstick arithmetic problems before and after receiving cathodal, anodal or sham tDCS to the left DLPFC.  Participants who received cathodal tDCS were more likely to solve the problems that require the maximal relaxation of previously learned constraints than the participants who received anodal or sham tDCS.  We conclude that cathodal tDCS over the left DLPFC might facilitate the relaxation of learned constraints, leading to a successful representational change.
In other words, if you suppress the part of the brain that understands and obeys the rules, you have more flexibility when it comes to seeing solutions that require lateral, or "outside-of-the-box," thinking.

As an example of the kind of lateral-thinking problem the researchers gave their subjects, try out the following.

You're shown a (false) equation made of matchsticks that looks like this:

III = III + III

How can you make this a true statement by moving only one matchstick?

It turns out that there are two ways to do it, but both involve the expedient of adjusting not the numbers, but the equals or plus sign.  You could do this:

III = III = III

Or you could take one of the matchsticks from any of the numerals and lay it across the equals sign to make an "is not equal to" sign -- one possibility of which is:

II ≠ III + III

Both, of course, require a bit of creative thinking.  As Luft put it, "[Problems like this one] are very hard because in mathematics it is not a valid operation at all – we normally don’t decompose the plus sign, you see that as an entire entity."
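If you want to convince yourself that both solutions really are true statements, here's a quick Python sketch (mine, not the researchers') that translates the matchstick equations into ordinary arithmetic:

# A sanity check of the matchstick solutions above, in plain arithmetic.

def sticks(numeral):
    """Count the matchsticks in a run of 'I's, e.g. 'III' -> 3."""
    return numeral.count("I")

# The original (false) equation: III = III + III
print(sticks("III") == sticks("III") + sticks("III"))   # False, since 3 != 6

# Solution 1: III = III = III (the plus sign turned into a second equals sign)
print(sticks("III") == sticks("III") == sticks("III"))  # True, since 3 == 3 == 3

# Solution 2: II ≠ III + III (a stick from the first numeral laid across the "=")
print(sticks("II") != sticks("III") + sticks("III"))    # True, since 2 != 6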

It turns out that we become better at seeing these kinds of solutions when we are given transcranial direct current stimulation (tDCS) that temporarily suppresses the activity of the aforementioned left dorsolateral prefrontal cortex.  Nick Davis, a professor of psychology at Manchester Metropolitan University who was not involved in the research, found the study by Luft et al. fascinating.  "Creativity is highly prized in most areas of our lives, from work to leisure to politics and war," Davis said.  "When the [left dorsolateral prefrontal cortex] was ‘cooled down’, the brain seems to have stopped applying old rules, and been more successful at finding new rules – this is the essence of creativity in problem-solving."

All of which makes me wonder if the most creative people have less activity in the left DLPFC to begin with, at least intermittently.  And also, if so-called "mindless" activities -- such as running, showering, or mowing the lawn -- naturally slow down the left DLPFC, allowing creative ideas to bubble up unimpeded.

I'd love to see that researched... maybe it's a direction that Luft and her team could go.

From there, of course, the next step would be to find a way to switch the rational, rules-obeying brain module off and on at will.  I, for one, would love that, especially now, because I'm at a point in my work-in-progress where I've kind of painted myself into a corner.  I know I'll find my way out eventually -- I always seem to -- but while you're stuck there, operating within what Luft et al. call "learned constraints," it's pretty damn frustrating.

Thursday, September 8, 2016

The political teeter-totter

During election seasons, you often find out far more than you wanted about your friends' political leanings, pet issues, biases, and blind spots.  We all have them, of course; but the natural tendency is to feel like everyone else is falling for fallacious thinking, whereas we are (in Kathryn Schulz's words) "looking out at the world through a completely crystal-clear window, seeing everything as it actually is."

The problem is, it's amazingly difficult to root out errors in thinking.  People are prone to the backfire effect -- being presented with facts supporting an opposing point of view often makes people double down and believe what they already did even more strongly.  But it goes deeper than that.  A paper by Tali Sharot, Cass Sunstein, Sebastian Bobadilla-Suarez, and Stephanie Lazzaro, released this week on the Social Science Research Network, showed that presenting people with the facts not only tends to send them veering back into their previous thinking; it increases polarization in general.

The research team used three hundred test subjects, first giving them questionnaires designed to determine their attitudes about anthropogenic climate change.  From their answers, they divided the subjects into three groups -- strong believers, moderate believers, and weak believers.  Each group was asked for an estimate of the increase in global average temperature by the year 2100.  Unsurprisingly, the strong believers had the highest estimate (6.3 degrees on average), the weak believers the lowest (3.6 degrees), and the moderate believers were in the middle (5.9 degrees).

Where it got interesting was when the researchers presented half of each group with data that was good news for the planet (global warming isn't going to be as bad as predicted) and the other half with bad news (global warming is going to be far worse than predicted).  Afterwards, the subjects' opinions were reassessed, and they were asked to revise their temperature estimates.  The strong believers presented with bad news revised their estimates upward; those presented with good news revised their estimates downward, but only a little (0.9 degrees on average).  The weak believers were highly responsive to the good news -- lowering their estimate by a degree on average -- but didn't respond to the bad news at all!

What this shows is rather frightening.  Presented with facts, both the believers and the doubters will change -- but always in such a way as to increase the overall polarization of the group.  This sort of backfire effect will result in a society where the degree of separation between two opposing factions will inevitably increase.
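To see in plain numbers why that gap widens, here's a bit of back-of-the-envelope Python using the average estimates and shifts reported above.  (The size of the strong believers' upward revision on bad news wasn't reported, so the +0.9 below is an assumed figure, purely for illustration.)

# Back-of-the-envelope arithmetic: asymmetric updating widens the gap.

strong_before, weak_before = 6.3, 3.6              # estimated warming by 2100, in degrees
print(round(strong_before - weak_before, 1))       # gap before: 2.7 degrees

# Good news: strong believers barely budge; weak believers drop a full degree.
strong_good, weak_good = strong_before - 0.9, weak_before - 1.0
print(round(strong_good - weak_good, 1))           # gap after good news: 2.8 degrees

# Bad news: strong believers revise upward (assumed +0.9); weak believers don't move.
strong_bad, weak_bad = strong_before + 0.9, weak_before
print(round(strong_bad - weak_bad, 1))             # gap after bad news: 3.6 degrees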

Sobering stuff.  But not as much as a different study, which shows how easily our political beliefs can be changed...

... without our realizing it.

According to a study in Frontiers in Human Neuroscience, all scientists had to do was stimulate one part of the brain -- the dorsolateral prefrontal cortex -- and it caused test subjects' views to tilt to the right.

The paper, entitled "Alteration of Belief by Non-invasive Brain Stimulation," describes research by Caroline Chawke and Ryota Kanai of the University of Sussex's School of Psychology in Brighton.  They begin with the sentence, "People generally have imperfect introspective access to the mechanisms underlying their political beliefs, yet can confidently communicate the reasoning that goes into their decision making process" -- which sums up in only a few words how little real faith we should have in the stuff our brain comes up with.

Previous research had suggested that the dorsolateral prefrontal cortex was involved in political decision-making (via its role in resolving cognitive conflict).  Specifically, it was observed that DLPFC activity was higher when people were challenged on their preconceived opinions with regard to political views.  So what Chawke and Kanai did was to stimulate that area of the brain while showing participants a campaign video from either the Labour (liberal) or Conservative party.  The expectation was that when the DLPFC was activated, it would push cognitive conflict resolution by moving both left- and right-leaning individuals toward more centrist beliefs.

That's not what happened.  The people shown a Labour video showed a movement toward the right -- but so did the people shown a Conservative video.  In other words, you stimulate the DLPFC, and everyone becomes more conservative.


Ready for the scariest part?  Let me give it to you in their own words:
It is also interesting to note that none of the participants in the current study reported any awareness of changes to their political beliefs... conclusively disagreeing with the possibility that political thoughts and values had been altered in any way.  Therefore, during the conscious deliberation of political statements, it appears as though implicit cognitive control processes may have biased subsequent belief formation in the absence of conscious awareness.  Although research has argued that rationalization and reappraisal must require some degree of conscious deliberation, the findings of the current study would provide reason to speculate an unconscious role of the DLPFC in changing political orientation.
The authors suggest that the DLPFC may have evolved as a structure whose activity is involved in perceptions of security, certainty, and social dominance -- all characteristics associated with conservative ideology.  But wherever it comes from, the most bizarre part of all of this is how little we seem to be choosing our political leanings based on anything logical -- or even conscious.

So, there you are.  More reason to distrust the whole political process, as if this year you needed another one.  Myself, I think I'm being forced to the opinion that, as Alexis de Tocqueville said in Book II of Democracy in America: "In the United States, the majority undertakes to supply a multitude of ready-made opinions for the use of individuals, who are thus relieved from the necessity of forming opinions of their own."  Little did he know how accurate that statement was -- not only about Americans, but about everyone.