Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Tuesday, March 11, 2025

Music and cognition

When educational budgets are cut -- which they are, every year -- the programs hit hardest are inevitably those for the arts, music, theater, and other electives.

This is ridiculous, and I say that as someone who spent thirty-two years teaching science, a so-called "core" subject.  And I don't mean to criticize the importance of having a good "core" education; we all need to be able to read and write, do mathematics, understand the history of humanity, and have a basic and broad grasp of scientific principles.

But that's not the be-all-end-all of education, or at least it shouldn't be.  I mean, consider not what gets you a job, what allows you to do mundane chores like balancing your checkbook, but what actually brings joy to your life.  What are your hobbies, things you spend your spare time doing, things you'd spend much more time doing if you had the leisure?  My guess is very few of us fill our free time doing chemistry experiments, even admitted science nerds like me.  No, we paint, sculpt, garden, play an instrument, sing in the choir, play or watch sports (or both), cook elaborate meals, write stories.  And while those do take a basic 3-Rs education -- I wouldn't be much of a fiction writer if I had a lousy vocabulary or didn't know how to write grammatically -- for many of us, our real fascinations were discovered in the classes that go under throwaway names like "electives" and "specials" and "optional courses."

So cutting these subjects is, for many students, taking away the one thing about school that makes it tolerable, and robbing them of the opportunity to find hidden talents and undiscovered passions that will bring them joy for a lifetime.

But a study has shown that it's more than that.  Research by Katherine Sledge Moore and Pinar Gupse Oguz of Arcadia University, and Jim Meyer of Elmhurst College, has found that music education correlates strongly with the development of fluid intelligence -- and that those gains translate across disciplines.

[Image licensed under the Creative Commons Jacob Garcia from Reus, Spain, The Cello Player, CC BY 2.0]

In "Superior Fluid Cognition in Trained Musicians," published in the journal Psychology of Music, the researchers found that the degree of experience a person has in playing music (or singing), the higher they score on a variety of metrics -- episodic memory, working memory, attention, executive function, and processing speed.

It's hardly surprising when you think about it.  As the researchers put it, fluid intelligence skills "are highlighted in musical training," which involves "quickly comprehending a complex symbolic system, multitasking, reasoning, and more."  I can say from personal experience that performing music -- not just playing it at home for your own entertainment -- takes those skills up an additional notch.  I was a performing musician for years, playing flute in a Celtic dance band called Crooked Sixpence.  Being up on stage requires that you think on your feet, and often make lightning-fast alterations to what you're doing.  As an example, most of what my band played were medleys of three or four tunes, and we almost never planned ahead how many times we were going to play any one of them (nor who'd be playing melody and who'd be playing harmony).  Our fiddler, who was more-or-less in charge of the band, just gave me a wiggle of the eyebrow if she wanted me to take a solo, and said "hep!" if we were switching tunes.  Sometimes the inevitable happened -- the fiddler and I both jumped to harmony at the same time, or something -- but almost always, one of us recognized it in under two seconds and slipped right back into playing melody.  Despite the complexity of what we did, the times we had a real crash-and-burn on stage were very few and far between.

So this study is spot-on.  And its conclusions are further evidence that we should be expanding arts and electives programs, not cutting them.

Not, honestly, that I expect it will have an effect.  Sorry to end on a pessimistic note, but the educational establishment has a long track record of completely ignoring research on developmental psychology in favor of "we've always done it this way."  The most egregious example is our determination to start foreign language instruction in seventh or eighth grade, when we've known for years that our brain's plasticity with respect to learning new languages peaks around age three or four, and declines steadily thereafter.

Or, as one of my students put it, "So we start teaching kids languages at the point they start to suck at it."

A close second is that researchers have been saying for years -- with piles of evidence to support them -- that children need recess or some other unstructured play time in order to improve overall behavior and attitudes about being in school.  Not only that, but recess time correlates with better scores on tests, so like music, it's an investment that pays off across the board.  Nevertheless, schools across the country have been gradually reducing unstructured leisure time, in some places to twenty minutes or less per week, in favor of devoting more time to preparing for standardized tests.

Now there's a way to make kids look forward to going to school in the morning.

I'd like to think that this research will influence educational establishments and (especially) budgetary decisions, but I'm not holding my breath.  Any change on that count is likely to be very slow to come.  But still, every piece of evidence counts.  And anything we can do to foster the development of fluid intelligence, positive attitudes, and confidence in children is movement in the right direction.

****************************************


Friday, February 17, 2023

Canine mathematics

I remember a while back reading an interesting paper that concluded that dogs have a concept of fairness and morality.

There have been a number of studies confirming this, most strikingly an investigation involving border collies.  Pairs of dogs were trained to do a task, then rewarded with doggie biscuits.  The thing was, Dog #1 was rewarded with one biscuit for correctly doing the task, and Dog #2 with two biscuits for doing the same task.

Within a few rounds, Dog #1 refused to cooperate.  "I'm not working for one biscuit when he gets two," seemed to be the logic.  So -- amazing as it seems -- at least some dogs understand fair play, and will forego getting a treat at all if another dog is getting more.

It also implies an understanding of quantity.  Now, "two is more than one" isn't exactly differential calculus, but it does suggest that dogs have at least a rudimentary numeracy.  The evolutionary advantage of a sense of quantity is obvious; if you can do a quick estimate of the number of predators chasing you, or the size of the herd of antelope you're chasing, you have a better sense of your own safety (and such decisions as when to flee, when to attack, when to hide, and so on).

Guinness, either pondering Fermat's Last Theorem or else trying to figure out how to open the kitchen door so he can swipe the cheese on the counter

But just how complex dogs' numerical ability is has proven rather difficult to study.  Which is why I found a paper I stumbled across in Biology Letters so fascinating.

Entitled, "Canine Sense of Quantity: Evidence for Numerical Ratio-Dependent Activation in Parietotemporal Cortex," by Lauren S. Aulet, Veronica C. Chiu, Ashley Prichard, Mark Spivak, Stella F. Lourenco, and Gregory S. Berns, of Emory University, this study showed that when dogs are confronted with stimuli differing only in quantity, they process that information in the same place in their brains that we use when doing numerical approximation.

The authors write:
The approximate number system (ANS), which supports the rapid estimation of quantity, emerges early in human development and is widespread across species.  Neural evidence from both human and non-human primates suggests the parietal cortex as a primary locus of numerical estimation, but it is unclear whether the numerical competencies observed across non-primate species are subserved by similar neural mechanisms.  Moreover, because studies with non-human animals typically involve extensive training, little is known about the spontaneous numerical capacities of non-human animals.  To address these questions, we examined the neural underpinnings of number perception using awake canine functional magnetic resonance imaging.  Dogs passively viewed dot arrays that varied in ratio and, critically, received no task-relevant training or exposure prior to testing.  We found evidence of ratio-dependent activation, which is a key feature of the ANS, in canine parietotemporal cortex in the majority of dogs tested.  This finding is suggestive of a neural mechanism for quantity perception that has been conserved across mammalian evolution.
The coolest thing about this study is that they controlled for stimulus area, which was the first thing I thought of when I read about the experimental protocol.  What I mean by this is that if you keep the size of the objects the same, a greater number of them has a greater overall area, so it might be that the dogs were estimating the area taken up by the dots and not the number.  But the researchers cleverly designed the arrays so that although the number of dots varied from screen to screen, the total area they covered was the same.
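If you want to see concretely how that kind of control works, here's a minimal sketch -- my own toy illustration, not the authors' stimulus code, with the field size, total area, and function name all invented for the purpose -- of generating dot arrays that differ in count but not in total dot area:

import math
import random

def make_dot_array(n_dots, total_area=10000.0, field=(400, 400), seed=None):
    """Return a list of (x, y, radius) dots whose summed area is total_area."""
    rng = random.Random(seed)
    area_per_dot = total_area / n_dots          # every dot gets an equal share of area
    radius = math.sqrt(area_per_dot / math.pi)  # area = pi * r^2, so r = sqrt(area / pi)
    dots = []
    while len(dots) < n_dots:
        x = rng.uniform(radius, field[0] - radius)
        y = rng.uniform(radius, field[1] - radius)
        # keep only placements that don't overlap an already-placed dot
        if all((x - ox) ** 2 + (y - oy) ** 2 > (2 * radius) ** 2 for ox, oy, _ in dots):
            dots.append((x, y, radius))
    return dots

low = make_dot_array(4, seed=1)    # four big dots
high = make_dot_array(8, seed=2)   # eight small dots
print(sum(math.pi * r ** 2 for _x, _y, r in low))    # ~10000 square pixels
print(sum(math.pi * r ** 2 for _x, _y, r in high))   # ~10000 square pixels

Since the four-dot array uses dots with twice the area of those in the eight-dot array, any brain response that tracks the count can't be chalked up to how much total "stuff" is on the screen.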

And, amazing as it sounds, the dogs not only registered the changing quantity of dots from one array to the next, they were apparently processing that information with the same part of their brains we use for analogous tasks.

"We went right to the source, observing the dogs' brains, to get a direct understanding of what their neurons were doing when the dogs viewed varying quantities of dots," said study lead author Lauren Aulet, in a press release in Science Daily.  "That allowed us to bypass the weaknesses of previous behavioral studies of dogs and some other species...  Part of the reason that we are able to do calculus and algebra is because we have this fundamental ability for numerosity that we share with other animals.  I'm interested in learning how we evolved that higher math ability and how these skills develop over time in individuals, starting with basic numerosity in infancy."

I wonder, though, how this would work with our dogs.  As I've mentioned before, Cleo (our Shiba Inu) has the IQ of a lima bean, and even has a hard time mastering concepts like the fact that regardless of how many times she lunges at her own tail, it's going to remain firmly attached to her butt.  Guinness is smarter (not that the bar was set that high), but I don't know how aware of quantity he is.  He's more of an opportunist who will take advantage of any situation that presents itself, be it a single CheezDoodle someone dropped on the floor or (as happened a while back) a half-pound of expensive French brie that was left unguarded for five minutes on the coffee table.

I doubt he worried about quantity in either case, frankly.

But the Aulet et al. study is fascinating, and clues us in that the origins of numeracy in our brains go back a long, long way.  The most recent common ancestor of humans and dogs lived on the order of eighty million years ago -- predating the extinction of the dinosaurs by fourteen million years -- so that numerical brain area must be at least that old, and is probably shared by most mammalian species.  It's a little humbling to think that a lot of the abilities we humans pride ourselves on are shared, at least on a basic level, with our near relatives.

But now y'all'll have to excuse me, because Cleo is barking like hell at something.  Maybe it's the evil UPS guy, whom she and Guinness both hate.  Maybe a squirrel farted somewhere in this time zone.  Maybe she's frustrated by the fact that she still can't quite catch her own tail.  

Or maybe she's stuck on one of her linear algebra homework problems.  You can see how that's a possibility.

****************************************


Wednesday, September 21, 2022

Memory offload

In James Burke's brilliant series The Day the Universe Changed, there's a line that never fails to shock me when I think about it, but which goes by so quickly you might miss it if you're not paying attention.  (This is typical of Burke -- I've heard his deservedly famous series Connections described as being like "watching a pinball game on fast-forward.")

The line comes up at the beginning of the last episode, "Worlds Without End," in which he's giving a quick summary of humankind's progression through technology.  He says, "In the fifteenth century, the invention of the printing press took our memories away."

Recording our knowledge in some kind of semi-permanent fashion does away with the need to keep anything important in memory.  I'm riffing on that concept in my current work-in-progress, The Scattering Winds, which is about a post-apocalyptic world in which some parts of society in what is now the United States have gone back to being non-literate.  All of the knowledge of the culture is entrusted to the mind of one person -- the Keeper of the Word -- whose sacred task it is to remember all lore, language, music, and history.

Then, because of a refugee from another place, the apprentice to the Keeper learns about written language and acquires the rudiments of reading, then goes in search of any books that might have survived the disasters and plagues that ended the world as we know it.  He realizes that this (re)discovery will end the vocation he's studied his whole life for, but the lure of lost knowledge is too powerful to resist even so.

He knows that in a very real sense, the rediscovery of written language will take his memory away.

The internet, of course, has only deepened the scope of the problem.  A few years ago, I had a student who had what seemed to me a weird approach to figuring things out.  When presented with a question he didn't know the answer to, his immediate response was to pull out his school-issued iPad and Google it.  Often, he didn't even give his brain a chance to wrestle with the question; if the answer wasn't immediately obvious, out came the electronics.

"What have you learned by doing that?" I recall asking him, trying to keep the frustration out of my voice.

"I got the right answer," he said.

"But the answer isn't the point!"  Okay, at that point my frustration was pretty clear.

I think the issue I had with this student comes from two sources.  One is the education system's unfortunate emphasis on Getting The Right Answer -- that if you have The Right Answer on your paper, it doesn't matter how you got it, or whether you really understand how to get there.  But the other is our increasing reliance on what amounts to external memory.  When we don't know something, the ease and accessibility of answers online makes us default to that, rather than taking the time to search our own memories for the answer.


The loss of our own facility for recall because of the external storage of information was the subject of a study in the journal Memory.  Called "Cognitive Offloading: How the Internet is Increasingly Taking Over Human Memory," the study, by cognitive psychologists Benjamin Storm, Sean Stone, and Aaron Benjamin, looked at how people approach the recall of information, and found that once someone has started relying on the internet, it becomes the go-to source, superseding one's own memory:
The results revealed that participants who previously used the Internet to gain information were significantly more likely to revert to Google for subsequent questions than those who relied on memory.  Participants also spent less time consulting their own memory before reaching for the Internet; they were not only more likely to do it again, they were likely to do it much more quickly.  Remarkably, 30% of participants who previously consulted the Internet failed to even attempt to answer a single simple question from memory.
This certainly mirrors my experience with my students.  Not all of them were as hooked to their electronics as the young man in my earlier anecdote, but it is more and more common for students to bypass thinking altogether and jump straight to Google.

"Memory is changing," lead author Storm said.  "Our research shows that as we use the Internet to support and extend our memory we become more reliant on it.  Whereas before we might have tried to recall something on our own, now we don't bother.  As more information becomes available via smartphones and other devices, we become progressively more reliant on it in our daily lives."

What concerns me is something the researchers say was outside the scope of their study: what effect this might have on our own cognitive processes.  It's one thing if the internet becomes our default, so long as our memories are still there, unaltered, should the Almighty Google not be available.  It's entirely another if our continual reliance on external "offloaded" memory ultimately weakens our own ability to process, store, and recall.  It's not as far-fetched as it sounds; there have been studies suggesting that mental activity can stave off or slow down dementia, so the "if you don't use it, you lose it" aphorism may apply just as much to our brains as it does to our muscles.

In any case, maybe it'd be a good idea for all of us to put away the electronics.  No one questions the benefits of weightlifting if you're trying to gain strength; maybe we should push ourselves into the mental weightlifting of processing and recalling without leaning on the crutch of the internet.  And as Kallian discovers in The Scattering Winds, the bounty that comes from the external storage of information -- be it online or in print -- carries a significant cost to our own reverence for knowledge and depth of understanding.

****************************************


Wednesday, March 17, 2021

Becoming the character

When I was about fourteen, I read Richard Adams's novel Watership Down.

I had never before experienced being so completely swallowed up by a book.  I couldn't put it down -- read, literally, all day long, including over breakfast and lunch.  (Couldn't get away with reading during dinner.  That was verboten in my family.)  It didn't bother me that it's a story about rabbits; in Adams's hands they are deeply real, compelling characters, while never losing their core rabbit-ness.  Their adventure is one of the most gripping, exciting stories I've ever read, and it's still in my top ten favorite books ever.

One of the primary reasons for this is the main character, Hazel.  Hazel is a true leader, bringing his intrepid band through one danger after another to get to a new and safe home, and he accomplishes this without being some kind of high-flung hero.  He's determined, smart, and loyal, but other than that quite ordinary; his main skill is in using all the talents of his friends to their utmost, leading through cooperation and respect rather than through fear.  (And if that point wasn't clear enough, when you meet his opposite, the terrifying General Woundwort, the contrast is obvious -- as is why Hazel and his friends ultimately win the day.)

[Image licensed under the Creative Commons CSIRO, CSIRO ScienceImage 1369 European rabbit, CC BY 3.0]

We love Hazel because we can be him, you know?  He's not an archetypical warrior whose feats are beyond the ability of just about all of us.  I loved (and still love) a lot of sword-and-sorcery fantasy, but it's never the Lords and Ladies of the Elves, the ones always featured on the book covers, whom I identify with.  It's the Samwise Gamgees that capture my heart every time.  Maybe King Aragorn is the hero of Lord of the Rings, but even he told Sam, "You bow to no one."

In a passage that is kind of a meta-representation of my own absorption in the story, about a third of the way through Watership Down, Hazel and his friends meet two other rabbits from their home warren, and find out that those two are the only other survivors left after the warren was destroyed by humans so the property could be developed for residences.  Adams's description of the characters listening to the horrific account of their escape -- and of their friends who were not so lucky -- parallels what we feel reading the larger story:

Hazel and his companions had suffered extremes of grief and horror during the telling of Holly's tale.  Pipkin had cried and trembled piteously at the death of Scabious, and Acorn and Speedwell had been seized with convulsive choking as Bluebell told of the poisonous gas that murdered underground...  [But] the very strength and vividness of their sympathy brought with it a true release.  Their feelings were not false or assumed.  While the story was being told, they heard it without any of the reserve or detachment that the kindest of civilized humans retains as he reads the newspaper.  To themselves, they seemed to struggle in the poisoned runs...  This was their way of honoring the dead.  The story over, the demands of their own hard, rough lives began to reassert themselves in their hearts, in their nerves, their blood and appetites.

The reason I thought of Watership Down, and this passage in particular, is a paper I read in the journal Social, Cognitive, and Affective Neuroscience a couple of days ago.  In "Becoming the King in the North: Identification with Fictional Characters is Associated with Greater Self/Other Neural Overlap," by Timothy Broom and Dylan Wagner (Ohio State University) and Robert Chavez (University of Oregon), participants were asked to evaluate how closely they identified with fictional characters -- in this case, from Game of Thrones -- and then the researchers looked at the volunteers' brain activity in the ventral medial prefrontal cortex (vMPFC), an area associated with our perception of self, when thinking about the various characters in the story.

When thinking about the characters the test subjects liked best, there was much stronger activity in the vMPFC, suggesting that the participants weren't only experiencing enjoyment or appreciation, they were -- like Hazel's friends -- becoming the character.  The authors write, "These results suggest that identification with fictional characters leads people to incorporate these characters into their self-concept: the greater the immersion into experiences of ‘becoming’ characters, the more accessing knowledge about characters resembles accessing knowledge about the self."

"For some people, fiction is a chance to take on new identities, to see worlds though others’ eyes and return from those experiences changed," study co-author Dylan Wagner said, in a press release from Ohio State University.  "What previous studies have found is that when people experience stories as if they were one of the characters, a connection is made with that character, and the character becomes intwined with the self.  In our study, we see evidence of that in their brains."

"People who are high in trait identification not only get absorbed into a story, they also are really absorbed into a particular character," co-author Timothy Broom explained.  "They report matching the thoughts of the character, they are thinking what the character is thinking, they are feeling what the character is feeling.  They are inhabiting the role of that character."

So there's a neurological underpinning to our absorption into a truly fine story -- or, more specifically, a character we care about deeply.  It's what I hope for when people read my own books; that they will not just appreciate the plot but form an emotional connection to the characters.  My contention is that however plot-driven a genre is, all stories are character stories.  The plot and scene-setting can be brilliant, but if we don't care about the characters, none of that matters.

It's fascinating that we can be so transported by fiction, and suggests that we've been storytellers for a very long time.  When reading or hearing a profoundly moving story, we are able to drop the veneer of what Adams describes as our "reserve and detachment... [while reading] the newspaper."  We get swallowed up, and our brain activity reflects the fact that on some level, we're actually there experiencing what the character experiences.

Even if that character is "just a rabbit."

***************************************

I've always been in awe of cryptographers.  I love puzzles, but code decipherment has seemed to me to be a little like magic.  I've read about such feats as the breaking of the "Enigma" code during World War II by a team led by British computer scientist Alan Turing, and the stunning decipherment of Linear B -- a writing system for which (at first) we knew neither the sound-to-symbol correspondence nor even the language it represented -- by Alice Kober and Michael Ventris.

My reaction each time has been, "I am not nearly smart enough to figure something like this out."

Possibly because it's so unfathomable to me, I've been fascinated with tales of codebreaking ever since I can remember.  This is why I was thrilled to read Simon Singh's The Code Book: The Science of Secrecy from Ancient Egypt to Quantum Cryptography, which describes some of the most amazing examples of people's attempts to design codes that were uncrackable -- and the ones who were able to crack them.

If you're at all interested in the science of covert communications, or just like to read about fascinating achievements by incredibly talented people, you definitely need to read The Code Book.  Even after I finished it, I still know I'm not smart enough to decipher complex codes, but it sure is fun to read about how others have accomplished it.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]



Saturday, October 3, 2020

The illusion of understanding

I've written before about the Dunning-Kruger effect, the cognitive bias that explains why nearly everyone you ask will claim to be an above-average driver.  We all have the sense of being competent -- and as studies of Dunning-Kruger have shown, we generally think we're more competent than we really are.

I just ran into a paper from a long while back that I'd never seen before, and that seems to put an even finer lens on this whole phenomenon.  It explains, I think, why people settle for simplistic explanations for phenomena -- and promptly cease to question their understanding at all.  So even though this is hardly a new study, it was new to me, and (I hope) will be new to my readers.

Called "The Misunderstood Limits of Folk Science: An Illusion of Explanatory Depth," the paper was written by Leonid Rozenblit and Frank Keil of Yale University and appeared in the journal Cognitive Science.  Its results illustrate, I believe, why trying to disabuse people of poor understanding of science can be such an intensely frustrating occupation.

The idea of the paper is a simple one -- to test the degree to which people trust and rely on what the authors call "lay theories:"
Intuitive or lay theories are thought to influence almost every facet of everyday cognition.  People appeal to explanatory relations to guide their inferences in categorization, diagnosis, induction, and many other cognitive tasks, and across such diverse areas as biology, physical mechanics, and psychology.  Individuals will, for example, discount high correlations that do not conform to an intuitive causal model but overemphasize weak correlations that do.  Theories seem to tell us what features to emphasize in learning new concepts as well as highlighting the relevant dimensions of similarity... 
The incompleteness of everyday theories should not surprise most scientists.  We frequently discover that a theory that seems crystal clear and complete in our head suddenly develops gaping holes and inconsistencies when we try to set it down on paper.  
Folk theories, we claim, are even more fragmentary and skeletal, but laypeople, unlike some scientists, usually remain unaware of the incompleteness of their theories.  Laypeople rarely have to offer full explanations for most of the phenomena that they think they understand.  Unlike many teachers, writers, and other professional “explainers,” laypeople rarely have cause to doubt their naïve intuitions.  They believe that they can explain the world they live in fairly well.
Rozenblit and Keil proceeded to test this phenomenon, and they did so in a clever way.  They were able to demonstrate this illusory sense that we know what's going on around us by (for example) asking volunteers to rate their understanding of how common everyday objects work -- things like zippers, piano keys, speedometers, flush toilets, cylinder locks, and helicopters.  They were then (1) asked to write out explanations of how the objects worked; (2) given explanations of how they actually do work; and (3) asked to re-rate their understanding.

Just about everyone ranked their understanding as lower after they saw the correct explanation.

You read that right.  People, across the board, think they understand things better before they actually learn about them.  On one level, that makes sense; all of us are prone to thinking things are simpler than they actually are, and can relate to being surprised at how complicated some common objects turn out to be.  (Ever seen the inside of a wind-up clock, for example?)  But what is amazing about this is how confident we are in our shallow, incomplete knowledge -- until someone sets out to knock that perception askew.

It was such a robust result that Rozenblit and Keil decided to push it a little, and see if they could make the illusion of explanatory depth go away.  They tried it with a less-educated test group (the initial test group had been Yale students.)  Nope -- even people with less education still think they understand everything just fine.  They tried it with younger subjects.  Still no change.  They even told the test subjects ahead of time that they were going to be asked to explain how the objects worked -- thinking, perhaps, that people might be ashamed to admit to some smart-guy Yale researchers that they didn't know how their own zippers worked, and were bullshitting to save face.

The drop was less when such explicit instructions were given, but it was still there.  As Rozenblit and Keil write, "Offering an explicit warning about future testing reduced the drop from initial to subsequent ratings.  Importantly, the drop was still significant—the illusion held."

So does the drop in self-rating occur with purely factual knowledge?  They tested this by doing the same protocol, but instead of asking people for explanations of mechanisms, they asked them to do a task that required nothing but pure recall, such as naming the capitals of various countries.  Here, the drop in self-rating still occurred, but it was far smaller than with explanatory or process-based knowledge.  We are, it seems, much more likely to admit we don't know facts than to admit we don't understand processes.

The conclusion that Rozenblit and Keil reach is a troubling one:
Since it is impossible in most cases to fully grasp the causal chains that are responsible for, and exhaustively explain, the world around us, we have to learn to use much sparser representations of causal relations that are good enough to give us the necessary insights: insights that go beyond associative similarity but which at the same time are not overwhelming in terms of cognitive load.  It may therefore be quite adaptive to have the illusion that we know more than we do so that we settle for what is enough.  The illusion might be an essential governor on our drive to search for explanatory underpinnings; it terminates potentially inexhaustible searches for ever-deeper understanding by satiating the drive for more knowledge once some skeletal level of causal comprehension is reached.
Put simply, when we get to "I understand this well enough," we stop thinking.  And for most of us, that point is reached far, far too soon.

And while it really isn't that critical to understand how zippers work as long as it doesn't stop you from zipping up your pants, the illusion of explanatory depth in other areas can come back to bite us pretty hard when we start making decisions on how to vote.  If most of us truly understand far less than we think we do about such issues as the safety of GMOs and vaccines, the processes involved in climate and climate change, the scientific and ethical issues surrounding embryonic stem cells, and even issues like air and water pollution, how can we possibly make informed decisions regarding the regulations governing them?

All the more reason, I think, that we should be putting more time, money, effort, and support into education.  While education doesn't make the illusion of explanatory depth go away, at least the educated are starting from a higher baseline.  We still might overestimate our own understanding, but I'd bet that the understanding itself is higher -- and that's bound to lead us to make better decisions.

I'll end with a quote by author and blogger John Green that I think is particularly apt, here:


*******************************

To the layperson, there's something odd about physicists' search for (amongst many other things) a Grand Unified Theory, one that unites the four fundamental forces into one elegant model.

Why do they think that there is such a theory?  Strange as it sounds, a lot of them say it's because having one force of the four (gravitation) not accounted for by the model, and requiring its own separate equations to explain, is "messy."  Or "inelegant."  Or -- most tellingly -- "ugly."

So, put simply: why do physicists have the tendency to think that for a theory to be true, it has to be elegant and beautiful?  Couldn't the universe just be chaotic and weird, with different facets of it obeying their own unrelated laws, with no unifying explanation to account for it all?

This is the question that physicist Sabine Hossenfelder addresses in her wonderful book Lost in Math: How Beauty Leads Physicists Astray.  She makes a bold statement: that this search for beauty and elegance in the mathematical models has diverted theoretical physics into untestable, unverifiable cul-de-sacs, blinding researchers to the reality -- the experimental evidence.

Whatever you think about whether the universe should obey aesthetically pleasing rules, or whether you're okay with weirdness and messiness, Hossenfelder's book will challenge your perception of how science is done.  It's a fascinating, fun, and enlightening read for anyone interested in learning about the arcane reaches of physics.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]


Wednesday, September 30, 2020

The emotional thermostat

For a variety of reasons I've decided to go back to work part time.

Retirement is great, but because the universe has a perverse sense of humor, we found out the week after I retired that our house needed forty thousand dollars of foundation work in order to stop it from sliding down the hill into our creek.  The project involved putting piers under the foundation, meaning holes had to be drilled through and around the foundation down to the bedrock.  The good news, if there's any good news in a scenario like this one, is that we caught it before any serious structural damage was done to the house.

Not that you'd be able to tell from what the interior of the formerly-finished basement looked like once they were done.  The foundation work required stripping the whole thing down to the joists, studs, and cement floor, removing all the pre-existing carpet, tile, walls, and ceilings.  All of which now has to be re-installed.

L to R: Me and my son, mid-demolition.  You can probably see the family resemblance between us.

So needless to say, we ended up with a hefty home improvement loan to pay off.  This was for me the main impetus to getting another job.  So this week I started work helping out seniors with yard work and house work, and providing companion care for people who aren't able to get out much.

I've always been inordinately worried about money.  During the time I was a single dad, I was literally down to nickels at the end of every pay period, and constantly looking for ways to economize.  Putting aside extra against eventualities like unexpected car repairs was pretty much impossible.  I've had a "poverty mentality" ever since, and even though now we're doing okay financially, I still have the constant expectation that the bottom is going to drop out.

It's all part and parcel of how my depression and anxiety seem to operate.  Reality checks (like looking at our bank statement and seeing that we do, in fact, have enough to pay our mortgage this month) don't make a dent in the panicked emotional state I seem to live in most of the time.  The way I've described it is that it's like I have two brains, an emotional one and a rational one, and they are not on speaking terms with each other.

Turns out this is a remarkably accurate description of what's going on.  In a study published this week in The Journal of Neuroscience by Mary Kate P. Joyce, Miguel Ángel García-Cabezas, Yohan J. John, and Helen Barbas of Boston University, entitled "Serial Prefrontal Pathways Are Positioned to Balance Cognition and Emotion in Primates," we find out that there is a part of the brain called "Area 32" (not to be confused with Area 51) which links two other regions, the dorsolateral prefrontal cortex (DLPFC) and the subgenual cortex.  The subgenual cortex is connected with emotional expression; the DLPFC is essentially like a thermostat, speaking through Area 32 to the subgenual cortex and maintaining emotional equilibrium.

A neurological analog to the "reality check" I mentioned earlier.

But when the DLPFC is quiet for some reason, no longer relaying inhibitory signals through Area 32, the subgenual cortex kind of stages a coup, leading to a runaway emotional reaction. 

Identifying the regions of the brain involved in certain abnormal responses is the first step toward targeted therapy.  I can speak from experience that this is really needed in the case of depression.  I've been through the wringer trying to find an antidepressant that works and doesn't give me horrible side effects -- it took three years to finally get one that seems to blunt the edge of the worst of it.  The strangest part of all this is how unpredictable it is; one person can have excellent results from an antidepressant that is either useless or actually detrimental for someone else, and honestly, no one knows why this is.

Research like this provides some hope that we may be narrowing in on what's going on.  Which is incredible news to people like me who've suffered from depression and anxiety for decades.

But now I need to get this posted and get going.  I've got to get out to my client's garden and finish cutting back her perennials.  All in a day's work.

*******************************

To the layperson, there's something odd about physicists' search for (amongst many other things) a Grand Unified Theory, one that unites the four fundamental forces into one elegant model.

Why do they think that there is such a theory?  Strange as it sounds, a lot of them say it's because having one force of the four (gravitation) not accounted for by the model, and requiring its own separate equations to explain, is "messy."  Or "inelegant."  Or -- most tellingly -- "ugly."

So, put simply: why do physicists have the tendency to think that for a theory to be true, it has to be elegant and beautiful?  Couldn't the universe just be chaotic and weird, with different facets of it obeying their own unrelated laws, with no unifying explanation to account for it all?

This is the question that physicist Sabine Hossenfelder addresses in her wonderful book Lost in Math: How Beauty Leads Physicists Astray.  She makes a bold statement: that this search for beauty and elegance in the mathematical models has diverted theoretical physics into untestable, unverifiable cul-de-sacs, blinding researchers to the reality -- the experimental evidence.

Whatever you think about whether the universe should obey aesthetically pleasing rules, or whether you're okay with weirdness and messiness, Hossenfelder's book will challenge your perception of how science is done.  It's a fascinating, fun, and enlightening read for anyone interested in learning about the arcane reaches of physics.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]


Wednesday, June 3, 2020

Keeping count

I'm fortunate to have been raised in a bilingual home.  My mother's first language was French, and my dad (son of a father of French and a mother of Scottish descent) was also fluent, although much more comfortable in English.  The result is that my parents spoke French in front of me (and also with our older relatives) when they didn't want me to understand, which was a hell of an incentive to learn listening comprehension, although -- as I found out later -- a bit of a problem when you're actually called upon to speak it yourself.

My Uncle Sidney, my mother's brother, didn't help matters much, because he was extremely fluent in the art of the French swear word.  He taught me a good many really creative expressions when I was still quite young, but I found out pretty quickly that when Uncle Sidney said, "Go ask your mother what ____ means," it was better to remain in ignorance than to incite my prim and prudish mom's ire.

Eventually, despite the impediments, I learned to speak French fairly well.  I distinctly recall, though, how baffled I was when I first learned the French counting system.

Even living in a Francophone household, it struck me as weird right from the get-go.  One through ten, no problem.  Like English, eleven and twelve have their own special names (onze and douze).  But... so do thirteen through sixteen.  Then seventeen, eighteen, and nineteen translate, respectively, to ten-seven, ten-eight, and ten-nine.

Things don't really go off the rails until you hit seventy.  Sixty is soixante; seventy is soixante-dix (sixty-ten).  Then we reach eighty -- quatre-vingt -- literally, "four-twenty."

For what it's worth, ninety-seven is quatre-vingt dix-sept -- four-twenty ten-seven.

I read French pretty well, but when I hit a number with more than two digits, I still have to stop and do some mental arithmetic to figure it out.

Turns out I'm not alone.  A study by Iro Xenidou-Dervou of Vrije Universiteit Amsterdam et al. found that even when you control for other factors, the language a child speaks (so the counting system (s)he learns) has an effect on the facility with which the child learns arithmetic.  The more it corresponds to a simple, regular base-10 system, the better the child is at learning math.

On the extremely logical side, we have Chinese.  In Mandarin, ninety-two is jiǔ shí èr -- "nine-ten-two."  We've already looked at French (where it would be "four-twenty-twelve").  But for my money, the winner in the what-the-fuck-were-you-thinking department would be Danish, where ninety-two is tooghalvfems, where halvfems (ninety) is an abbreviation of the Old Norse halvfemsindstyve, or "four and a half times twenty."
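Just to make the mental arithmetic explicit, here's a little sketch -- my own, purely illustrative; it produces the additive pieces behind the French name rather than the real spelling, and it ignores the hyphen, "et," and plural-"vingts" rules -- of what a French speaker has to assemble for any number up to ninety-nine:

def french_decomposition(n):
    """Return the additive pieces behind the French name for 0 <= n <= 99."""
    if n < 17:                        # zero through seize have their own names
        return [n]
    if n < 20:                        # dix-sept, dix-huit, dix-neuf
        return [10, n - 10]
    if n < 70:                        # vingt through soixante-neuf: tens plus units
        tens, units = divmod(n, 10)
        return [tens * 10] + ([units] if units else [])
    if n < 80:                        # soixante-dix through soixante-dix-neuf: 60 + (10..19)
        return [60] + french_decomposition(n - 60)
    # quatre-vingts through quatre-vingt-dix-neuf: 4 x 20 + (0..19)
    return [20, 20, 20, 20] + (french_decomposition(n - 80) if n > 80 else [])

print(french_decomposition(92))   # [20, 20, 20, 20, 12] -- "four-twenty-twelve"
print(french_decomposition(97))   # [20, 20, 20, 20, 10, 7] -- "four-twenty ten-seven"

The Mandarin version of ninety-two, by contrast, decomposes to nothing more exotic than [90, 2].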

And don't even get me started about Roman numerals. [Image licensed under the Creative Commons Дмитрий Окольников, Roman numerals!, CC BY-SA 4.0]

"The fact that they were the same in every other aspect, apart from the condition where two digits showed up, shows you that it's the language that is making the difference," said study lead author Xenidou-Dervou, in an interview with BBC.  "The effects are small, and yet this is numeracy at its most basic, just estimating a number on a line.  As adults, we're doing very complicated tasks in our daily lives, and so even small difficulties caused by the number naming system could potentially be an additive hurdle to everyday mathematical skills."

All of this brings back a subject that's fascinated me since my days as a graduate student in linguistics: the Sapir-Whorf hypothesis.  This is the idea that the language we grow up with profoundly influences our brain wiring -- so not only does our cognitive development influence our language learning, our language learning influences our cognitive development.  I found out about a particularly cool example of this when I was reading the brilliant book The Last Speakers, by K. David Harrison, which was an attempt to chronicle some of the world's most endangered languages.  When Harrison was traveling with a tribe in Siberia, he was intrigued to find out that they had no words for "right," "left," "behind," and "in front of."  Everything was described in terms of the cardinal directions.  So right now, my computer isn't in front of me; it's south of me.  (They also have direction-related words meaning "upstream" and "downstream.")

Anyhow, Harrison was trying to talk to one of the tribal elders about why that was, and all he got from him for a time was frank bafflement.  It was as if he couldn't even quite understand what Harrison was driving at.  Then, all of a sudden, he burst into gales of laughter.  "Really?" he said to Harrison.  "That's how you see the world?  So that tree is in one place, but if you turn around, it's now in a different place?  Everything in the world is relative to the position of your body, so when you move, the entire universe shifts around you?  What an arrogant people you must be!"

I know that these days, Sapir-Whorf is kind of out of vogue with linguists, but studies like the one by Xenidou-Dervou et al. make me realize how deeply woven together our cognition and our language are.  We create our world with our words, and the words we learn shape the reality we are able to see.

Including whether we say "ninety-two" or -- in old-system Welsh -- dau ar ddeg a phedwar ugain.

Literally, "two on ten and four twenties."

************************************

This week's Skeptophilia book recommendation is a fun one -- George Zaidan's Ingredients: The Strange Chemistry of What We Put In Us and On Us.  Springboarding off the loony recommendations that have been rampant in the last few years -- fad diets, alarmist warnings about everything from vaccines to sunscreen, the pros and cons of processed food, substances that seem to be good for us one week and bad for us the next -- Zaidan goes through the reality behind the hype, taking apart the claims in a way that is both factually accurate and laugh-out-loud funny.

And high time.  Bogus health claims, fueled by such sites as Natural News, are potentially dangerous.  Zaidan's book holds a lens up to the chemicals we ingest, inhale, and put on our skin -- and will help you sort the fact from the fiction.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]




Monday, February 3, 2020

Mathematical stumbles

In the first part of my teaching career, I taught mainly physics and math, before switching to biology (which I then taught for the rest of my 32 years).  During my time as a physics and math teacher, I was fascinated by the number of students who didn't seem to be able to think numerically.  Some of them were quite skilled at equation manipulation, and so got good grades on quizzes.  The trouble started when they punched something into their calculator wrong, and got an answer that was wildly off -- and then didn't recognize that anything was amiss.

Probably the most extreme example of this was a girl in my physics class.  While we were studying electrostatics, there was a problem set up that was intended to lead you in the end to a value for the mass of an electron.  Well, she entered the numbers wrong, or divided when she was supposed to multiply, or some other simplistic careless error -- and got an answer of 86 kilograms.

She called me over, because when she checked her answer against the accepted value, it wasn't the same.  (Really not the same.  The mass of an electron is about 9 x 10^-31 kilograms -- a decimal point, followed by thirty zeroes, ending with a nine.)

"I must have done something wrong," she said.

I laughed and said, "Yeah, that's kind of heavy for an electron."

She gave me a baffled look and said, "It is?"

I thought she was kidding, but it became obvious quickly that she wasn't.  She knew 86 kilograms wasn't the number in the reference tables, but she honestly had no idea how far off she was.

"86 kilograms is almost two hundred pounds," I said.

She went, "Oh."

I saw this kind of thing over and over, and the problem became worse when you threw scientific notation into the mix, which I suspect was part of the problem with my student.  It was all too common for students to believe that whatever came out of the calculator must be right -- many of them seemed to have no ability to give an order-of-magnitude check of their answers to see if they even made sense given the parameters of the problem they were trying to solve.
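The sad irony is that the check I wanted them to do in their heads takes about four lines to write down.  Here's a minimal sketch -- my own, and the two-orders-of-magnitude threshold is an arbitrary choice purely for illustration -- of the kind of sanity check I mean:

import math

def plausible(computed, reference, max_orders_off=2):
    """Return True if computed is within max_orders_off powers of ten of reference."""
    orders_off = abs(math.log10(abs(computed)) - math.log10(abs(reference)))
    return orders_off <= max_orders_off

electron_mass = 9.11e-31                     # kg, the accepted value
print(plausible(8.9e-31, electron_mass))     # True  -- off a bit, but in the right universe
print(plausible(86.0, electron_mass))        # False -- the two-hundred-pound electron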

[Image is in the Public Domain]

It's easy for those of us who are mathematically adept to be feeling a little smug right now.  But what is interesting is that if you change the context of the question, all of us start having similar troubles -- even expert mathematicians.

A group of psychologists at the Université de Genève set up two different sorts of (very simple) math problems, one of which requires you to think in sets, the other in linear axes.  Here's an example of each:
  • Set thinking:  Jim has fourteen pieces of fruit in his shopping basket, a combination of apples and pears.  John has two fewer pears than Jim, but the same number of apples.  How many total pieces of fruit does John have?
  • Axes thinking:  When Jane stands on a tall ladder, she can reach a spot fourteen feet high on the side of a house.  Jane is the same height as her twin sister Jill.  If Jill stood on the same ladder, but on a step two feet lower down, how high could she reach?
Both of these problems have the same parameters.  There are pieces of information missing (in the first, the number of apples and the number of pears Jim has; in the second, Jane's height and the height of the step she's standing on).  In each case, though, the missing information is unnecessary for solving the problem, and in each the solution method is the same simple subtraction -- 14 - 2 = 12.

What is extraordinary is that when asked to solve the problems, with an option to answer "no solution because there is insufficient information," people solved the axes problems correctly 82% of the time, and the sets problems only 47% of the time!

Even more surprising were the results when the same problems were given to expert mathematicians.  They got 95% of the axis problems correct -- but only 76% of the sets problems!

I found these results astonishing -- almost a quarter of the mathematicians thought that the information in the "apples and pears" problem above, and others like it, was insufficient to answer the question.

"We see that the way a mathematical problem is formulated has a real impact on performance, including that of experts, and it follows that we can't reason in a totally abstract manner," said Emmanuel Sander, one of the researchers in the study.

"One out of four times, the experts thought there was no solution to the problem even though it was of primary school level," said Hippolyte Gros, another of the authors of the paper, which appeared in the journal Psychonomic Bulletin and Review.  "And we even showed that the participants who found the solution to the set problems were still influenced by their set-based outlook, because they were slower to solve these problems than the axis problems...  We have to detach ourselves from our non-mathematical intuition by working with students in non-intuitive contexts."

What this shows is that the inability to think numerically -- what researchers term innumeracy -- isn't as simple as just a stumbling block in quantitative understanding.  Presumably expert mathematicians aren't innumerate (one would hope not, anyway), but there's still something going awry with their cognitive processing in the realm of sets that does not cause problems with their thinking about linear axes.  So it's not a mental math issue -- the mental math necessary for both problems is identical -- it's that somehow, the brain doesn't categorize the two different contexts as having an underlying similarity.

Which I find fascinating.  I'd love to have the same experiment run while the participants are hooked to an fMRI machine, and see if the regions of the brain activated in sets problems are different from the parts in axes problems.  I'd bet cold hard cash they are.

However, it still probably wouldn't answer what was amiss with the student who had the 86 kilogram electron.

*********************************

This week's Skeptophilia book of the week is both intriguing and sobering: Eric Cline's 1177 B.C.: The Year Civilization Collapsed.

The year in the title is the peak of a period of instability and warfare that effectively ended the Bronze Age.  In the end, eight of the major civilizations that had pretty much run Eastern Europe, North Africa, and the Middle East -- the Canaanites, Cypriots, Assyrians, Egyptians, Babylonians, Minoans, Mycenaeans, and Hittites -- all collapsed more or less simultaneously.

Cline attributes this to a perfect storm of bad conditions, including famine, drought, plague, conflict within the ruling clans and between nations and their neighbors, and a determination by the people in charge to keep doing things the way they'd always done them despite the changing circumstances.  The result: a period of chaos and strife that destroyed all eight civilizations.  The survivors, in the decades following, rebuilt new nation-states from the ruins of the previous ones, but the old order was gone forever.

It's impossible not to compare the events Cline describes with what is going on in the modern world -- making me think more than once while reading this book that it was half history, half cautionary tale.  There is no reason to believe that sort of collapse couldn't happen again.

After all, the ruling class of all eight ancient civilizations also thought they were invulnerable.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]





Wednesday, December 11, 2019

The smell of time passing

We once owned a very peculiar border collie named Doolin.  Although from what I've heard, saying "very peculiar" in the same breath as "border collie" is kind of redundant.  The breed has a reputation for being extremely intelligent, hyperactive, job-oriented, and more than a little neurotic, and Doolin fit the bill in all respects.

As far as the "intelligent" part, she's the dog who learned to open the slide bolts on our fence by watching us do it only two or three times.  I wouldn't have believed it unless I'd seen it with my own eyes.  She also took her job very seriously, and by "job" I mean "life."  She had a passion for catching frisbees, but I always got the impression that it wasn't because it was fun.  It was because the Russian judge had only given her a 9.4 on the previous catch and she was determined to improve her score.

There were ways in which her intelligence was almost eerie at times.  I was away from home one time and called Carol to say hi, and apparently Doolin looked at her with question marks in her eyes.  Carol said, "Doolin, it's Daddy!"  Doolin responded by becoming extremely excited and running around the house looking in all of the likely spots -- my office, the recliner, the workshop -- as well as some somewhat less likely places like under the bed.  When the search was unsuccessful, apparently she seemed extremely worried for the rest of the evening.

Not that this was all that different from her usual expression.


One thing that always puzzled us, though, was her ability to sense when we were about to get home.  Doolin would routinely go to the door and stand there on guard before Carol's car pulled into the driveway.  She did the same thing, I heard, when I was about to arrive.  In each case, there was no obvious cue that she could have relied on; we live on a fairly well-traveled stretch of rural highway and even if she heard our cars in the distance, I can't imagine they sound that different from any of the other hundreds of cars that pass by daily.  And my arrival time, especially, varied considerably from day to day, because of after-school commitments.  How, then, did she figure out we were about to get home -- or was it just dart-thrower's bias again, and we were noticing the times she got it right and ignoring all the times she didn't?

According to Alexandra Horowitz, a professor of psychology at Barnard College, there's actually something to this observation.  There are hundreds of anecdotal accounts of the same kind of behavior, enough that (although there hasn't been much in the way of a systematic study) there's almost certainly a reason behind it other than chance.  Horowitz considered the well-documented ability of dogs to follow a scent trail in the right direction by sensing where the signal was weakest -- presumably the oldest part of the trail -- and heading toward where it was stronger.  The difference in intensity is minuscule, especially given that to go the right direction the dog can't directly compare the scent right here to the scent half a kilometer away, but has to compare the scent here to the scent a couple of meters away.

What Horowitz wondered is whether dogs are using scent intensity as a kind of clock -- the diminishment of a person's scent signal after they leave the house gives the dog a way of knowing how much time has elapsed.  This makes more sense than any of the other explanations I've heard, which include (no lie) that dogs are psychic and are telepathically sensing your approach.  Biological clocks of all kinds are only now being investigated and understood, including how they are entrained -- how the internal state is aligned to external cues.  (The most obvious examples of entrainment are the alignment of our sleep cycle to light/dark fluctuations, and seasonal behaviors in other animals like hibernation and migration in response to cues like decreasing day length.)
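To make the idea concrete, here's a minimal sketch of how a "scent clock" could work in principle.  The exponential decay and the four-hour half-life are assumptions of mine purely for illustration; the real chemistry of a fading scent is surely messier, and Horowitz's hypothesis doesn't hinge on any particular decay law:

import math

HALF_LIFE_HOURS = 4.0                        # assumed scent half-life (illustrative only)
DECAY_RATE = math.log(2) / HALF_LIFE_HOURS   # lambda in I(t) = I0 * exp(-lambda * t)

def hours_since_departure(current_intensity, initial_intensity):
    """Infer elapsed time from how much the scent has faded."""
    return math.log(initial_intensity / current_intensity) / DECAY_RATE

# If the owner's scent has faded to a quarter of its just-walked-out-the-door level,
# the scent clock reads two half-lives -- about eight hours.
print(hours_since_departure(0.25, 1.0))   # 8.0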

So it's possible that dogs are entraining this bit of their behavior using their phenomenally sensitive noses.  It'll be interesting to see what Horowitz does with her hypothesis; it's certainly worth testing.  Now, I need to wrap this up because Guinness's biological clock just went off and told him it was time to play ball.  Of course, that happens about fifty times a day, so there may not be anything particularly surprising there.

***********************

This week's Skeptophilia book of the week is brand new: Brian Clegg's wonderful Dark Matter and Dark Energy: The Hidden 95% of the Universe.  In this book, Clegg outlines "the biggest puzzle science has ever faced" -- the evidence for the substances that provide the majority of the gravitational force holding the nearby universe together, while simultaneously making the universe as a whole fly apart -- and which have (thus far) completely resisted all attempts to ascertain their nature.

Clegg also gives us some of the cutting-edge explanations physicists are now proposing, and the experiments that are being done to test them.  The science is sure to change quickly -- every week we seem to hear about new data providing information on the dark 95% of what's around us -- but if you want the most recently-crafted lens on the subject, this is it.

[Note: if you purchase this book from the image/link below, part of the proceeds goes to support Skeptophilia!]