Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Thursday, August 4, 2022

What's bred in the bone

A friend of mine was chatting with me about irritating situations at work, and she mentioned that the previous week she'd really lost her cool with a supervisor who is apparently notorious for being a bit of an asshole.  I mentioned that I tend to put up with such nonsense and later wish I'd spoken up for myself -- that it has to be pretty bad before I'll blow up (at a supervisor or anyone else).

She laughed and said, "Of course I have a quick temper.  My family's Italian.  It's in our genes."

She was joking, of course, no more serious than my father was when he quipped that our family was "French enough to like to drink and Scottish enough not to know when to stop."  But it's a common enough view, isn't it?  We get our personality traits from some nebulous genetic heritage, despite the fact that a great many of us are pretty thorough mixtures of ancestry, and that all humans regardless of race or ethnicity are well over 99.9% similar anyhow.  As geneticist Kenneth Kidd put it, "Race is not biologically definable.  We are far too similar."

Ha.  Take that, racists.

[Image is in the Public Domain]

The whole thing gets complicated, however, because race and ethnicity certainly have a cultural reality, and that can affect how your personality develops as you grow up.  If you're raised in a family where arguments are regularly settled through shouting and waving your arms around (apparently true in my friend's case), then you learn that as a standard of behavior.  (Or, sometimes, you decide, "That was a miserable way to live, I'm never going to treat people that way," and swing to the opposite extreme.)  All of this is just meant to highlight that teasing apart the genetic components of behavior (and there certainly are some) from the learned ones is no simple task.

All of this just gained an additional complication with a study last week in the journal Social Cognition that looked at another factor contributing to our behavior -- how our notions about our genetic makeup influence how we think we should be acting.

The study, by Ryan Wheat and Matthew Vess (of Texas A & M) and Patricia Holte (of Wake Forest University), was simple enough.  They took a group of test subjects, gave them a (bogus) saliva test, and split the group in two.  The subjects were then given the "results," purportedly describing what the sample said about their genetic makeup for a variety of characteristics.  The salient part, though, was that half were told that their genetic sample showed they had an unusually high propensity for risk-taking, and the other half were told their genes said they tended to avoid risk.

Afterward, they were given a personality test, only one part of which actually mattered: the questions that evaluated them for risk-tolerance.  Across the board, the people who were told their genes predisposed them to taking risks scored higher on the risk-tolerance questions than did the people who were told their genes made them risk-averse.

So not only does how we were raised complicate any attempt to understand the genetic component of human behavior; our subconscious also conforms to our perception of how people with our genetic makeup are supposed to behave.

So even if there is no Italian gene for quick temper, maybe my friend's short fuse comes from her belief that there is.

Coupled, of course, with having been raised in a shouty family.  The "nurture" side of "nature vs. nurture" is not inconsequential.  Small wonder the question of whether behavior is learned or innate has been argued over for a century and still hasn't been decisively settled.

In any case, I better wrap this up.  I think I'm going to go get another cup of coffee.  It's a little early for a glass of red wine, and you know us people with French blood.  It's either one or the other.

****************************************


Saturday, August 31, 2019

Sex, choice, and genes

Sometimes a piece of research makes me simultaneously think, "Okay, that's pretty interesting," and "Oh, no, this is not going to end well."

That was my reaction to the latest study of the genetics of sexuality and sexual orientation, which appeared in Science this week.  The paper, entitled "Large-Scale GWAS Reveals Insights Into the Genetic Architecture of Same-Sex Sexual Behavior," was the work of a huge team headed by Andrea Ganna of the Center for Genomic Medicine at Massachusetts General Hospital, and looked at correlations between the genomes of almost 500,000 individuals and their self-reported same-sex sexual behavior.

Before we launch off into how this is being spun, let's look at what Ganna et al. actually wrote:
In the discovery samples (UK Biobank and 23andMe), five autosomal loci were significantly associated with same-sex sexual behavior.  Follow-up of these loci suggested links to biological pathways that involve sex hormone regulation and olfaction.  Three of the loci were significant in a meta-analysis of smaller, independent replication samples.  Although only a few loci passed the stringent statistical corrections for genome-wide multiple testing and were replicated in other samples, our analyses show that many loci underlie same-sex sexual behavior in both sexes.  In aggregate, all tested genetic variants accounted for 8 to 25% of variation in male and female same-sex sexual behavior, and the genetic influences were positively but imperfectly correlated between the sexes [genetic correlation coefficient (rg)= 0.63; 95% confidence intervals, 0.48 to 0.78]...  Additional analyses suggested that sexual behavior, attraction, identity, and fantasies are influenced by a similar set of genetic variants (rg > 0.83); however, the genetic effects that differentiate heterosexual from same-sex sexual behavior are not the same as those that differ among nonheterosexuals with lower versus higher proportions of same-sex partners, which suggests that there is no single continuum from opposite-sex to same-sex preference.
To put it succinctly, and without all the scientific verbiage: sexuality, sexual orientation, and gender are complex, and the differences we see amongst humans are not attributable to a single cause.

Which you'd expect, I'd think.  The old binary divisions of male vs. female and heterosexual vs. homosexual are so clearly wrong it's a wonder anyone still thinks they're correct.  Transsexual and anatomically intersex individuals are hardly rare; and I know for a fact bisexuality exists, because I've been equally attracted to women and men since I was aware of sexual attraction at all.

[Image licensed under the Creative Commons Benson Kua, Rainbow flag breeze, CC BY-SA 2.0]

But this doesn't square with how some people want the world to work, so the moment this paper was published, it began to be twisted out of all recognition.

First, there was the "we wish the world was simple" approach, as exemplified by Science News, which for the record I'm about fed up with because for fuck's sake, they should know better.   Their headline regarding the study was "There's No Evidence That a 'Gay Gene' Exists," which is one of those technically-true-but-still-misleading taglines the media seems to be increasingly fond of.

No, there is no single "gay gene."  But reread the passage from the original paper I quoted above; the gist is that there is a host of factors, genetic and otherwise, that correlate with sexual orientation.  Here's a more accurate phrasing of the paper's conclusion, from Melinda Mills, writing about the study in the "Perspectives" column of Science: "The genetic correlation identified in the GWAS of whether a person had ever engaged in sex with someone of the same sex and the more complex measure of proportion of same-sex partners was 0.73 for men but only 0.52 for women.  This means that genetic variation has a higher influence on same-sex sexual behavior in men than in women and also demonstrates the complexity of women's sexuality."

Even the lower 0.52 correlation for women is pretty damn significant, considering that correlation coefficients run on a scale from -1 to +1, where 0 means "no correlation at all" and ±1 means "perfectly correlated."
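If you want a concrete feel for what a correlation of that size means, here's a minimal sketch in Python -- purely illustrative, and nothing to do with the actual GWAS machinery -- that simulates two standardized traits with a target correlation and reports both the observed coefficient and the proportion of variance the two traits share (r squared):

import numpy as np

rng = np.random.default_rng(42)

def simulate_correlated_traits(r, n=100_000):
    # Build two standardized traits so that corr(x, y) is approximately r
    x = rng.standard_normal(n)
    noise = rng.standard_normal(n)
    y = r * x + (1 - r**2) ** 0.5 * noise
    return x, y

for r_target in (0.52, 0.73):
    x, y = simulate_correlated_traits(r_target)
    r_obs = np.corrcoef(x, y)[0, 1]
    print(f"target r = {r_target:.2f}   observed r = {r_obs:.2f}   shared variance = {r_obs**2:.0%}")

Even at r = 0.52, the two traits share roughly a quarter of their variance -- a long way from noise.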

But that didn't stop the next level of misinterpretation from happening, predictably from the anti-LGBTQ evangelicals and other crazy right-wingers, who would prefer it if people like me didn't exist.  All they did was read the headline from Science News (or one of the large number of media outlets that characterized the research the same way) and start writing op-ed pieces crowing, "See?  No gay gene!  We told you homosexuality was a choice.  Now science proves we were right all along."  Add to that the alarmists who went entirely the other direction and suggested that the Ganna et al. research could be used to identify non-heterosexuals for the purposes of persecution, or even eugenics, and you've got a morass of hyperemotional responses that miss the main conclusions of the study entirely.

So can I recommend that all of you read the fucking research?  For the Right-Wing NutJobs, let me just say that if you have to lie about what a study actually says to support your viewpoint, your position must be pretty tenuous from the get-go.  And while I sympathize with the alarmists' fears, it's hard to see how the Ganna et al. research could be used for any sort of nefarious purposes, when the best genetic correlates to homosexuality numbered around a half-dozen, not all of them showed up in every LGBTQ person studied, and even aggregated only predicted correctly around half the time.

So the whole thing got me kind of stirred up, as measured by the number of times I felt obliged to use the f-bomb to express my frustration.  Which you'd have predicted, given my (1) bisexuality, (2) background in genetics, and (3) hatred of popular media mischaracterizing science.

In any case, the take-home message here is threefold:
  1. The universe is a complex place.  Deal with it.
  2. Wherever human sexuality comes from, it isn't a choice.  If that offends your sensibilities or conflicts with your worldview, you might want to re-examine your sensibilities and worldview, because as far as I can tell reality doesn't give a rat's ass about what you'd like to believe.
  3. Don't trust headlines.  Always go back to the original research before forming an opinion.  Yes, reading scientific papers is challenging for non-scientists, but that's the only way you'll know your understanding is on solid ground.
So that's the latest highly equivocal piece of the nature-nurture puzzle, the outcome of which you'd probably have expected from knowing the history of the question.  As much as I'd like it if these matters were simple, I'm much happier knowing the truth.  I'll end with a quote from the inimitable Carl Sagan: "For me, it is far better to grasp the universe as it really is than to persist in delusion, however satisfying and reassuring."

********************************

This week's Skeptophilia book recommendation is about a subject near and dear to my heart: the possibility of intelligent extraterrestrial life.  In The Three-Body Problem, Chinese science fiction writer Cixin Liu takes an interesting angle on this question: if intelligent life were discovered in the universe -- maybe if it even gave us a visit -- how would humans react?

Liu examines the impact of finding we're not alone in the cosmos from political, social, and religious perspectives, and doesn't engage in any pollyanna-ish assumptions that we'll all be hunky-dory and ascend to the next plane of existence.  What he does think might happen, though, makes for fascinating reading, and leaves you pondering our place in the universe for days after you turn over the last page.

[Note: if you purchase this book from the image/link below, part of the proceeds goes to support Skeptophilia!]





Thursday, May 23, 2019

Dog days

The nature/nurture debate -- whether our behavior and personalities are controlled by our genetics or our environment -- has been going on for some time.  Most scientists believe it's both; the argument is over how much, and which parts, of our personalities are due to each factor.

The search for answers has led to some rather startling results.  Separated-twin studies -- involving locating sets of identical twins who were separated at birth and raised in different homes -- resulted in some correspondences that were astonishing.  The weirdest one, done back in 1979, found a pair of identical twin brothers who were separated at age four weeks and didn't even know of each other's existence -- and they were both named Jim, drove the same type of car, both had tension headaches and were chronic nail-biters, were both firefighters, and chain-smoked the same brand of cigarettes.

Now, even the most die-hard proponent of the personality-is-inborn explanation wouldn't claim that the kind of car you drive is genetic.  Some of these similarities are clearly due to the Law of Large Numbers -- in a big enough sample size, you'll find strange coincidences that don't really mean anything profound.  Add to that Dart-Thrower's Bias -- our tendency to notice and remember outliers -- and the Two Jims aren't really that hard to explain.  (And, of course, out of the thousands of pairs of twins studied, the media is going to point out the one that is the oddest.)
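If you want to see just how cheaply the Law of Large Numbers hands out "uncanny" matches, here's a toy simulation in Python.  The numbers -- five thousand twin pairs, fifty unrelated yes/no traits, a ten percent chance of matching on any one of them by accident -- are invented purely for illustration:

import numpy as np

rng = np.random.default_rng(1)

n_pairs = 5_000    # hypothetical number of twin pairs surveyed
n_traits = 50      # unrelated traits compared per pair (car model, cigarette brand, ...)
p_match = 0.10     # assumed chance any single trait matches by pure coincidence

# For each pair, count how many of the fifty traits happen to line up
matches = rng.binomial(n_traits, p_match, size=n_pairs)

print("average matches per pair:        ", round(float(matches.mean()), 1))
print("most matches in any single pair: ", int(matches.max()))
print("pairs with 12 or more matches:   ", int((matches >= 12).sum()))

The average pair matches on about five traits, but somewhere in a sample that size there's nearly always at least one pair stacking up a dozen or more -- and that's the pair the newspapers write about.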

But still.  Consider some of the other similarities.  While cigarette brand choice is certainly not genetic, addictive behaviors have a heritable component.  So does anxiety, which could account for the headaches and the nail-biting.  And to be a firefighter, you have to have physical strength, courage, and an ability to take risks -- all features of personality that could well have an origin in our DNA.

Last week, a paper appeared in Scientific Reports that supports a strange conjecture -- that dog ownership is partly genetic.  The research, which came out of Uppsala University (Sweden), looked at the concordance rates of dog ownership between identical twins (who share 100% of their DNA) and fraternal twins (who share, on average, half of their DNA).  And the data were clear: the concordance between identical twins is far higher, supporting a large degree of heritability for dog ownership.
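For anyone curious how twin researchers get from concordance rates to a heritability number, the classic shortcut is Falconer's formula: heritability is roughly twice the difference between the identical-twin and fraternal-twin correlations.  (That's the textbook approximation, not necessarily the exact model Fall et al. fit, and the numbers below are placeholders rather than the study's results.)  A minimal sketch in Python:

def falconer_estimate(r_mz, r_dz):
    """Classical twin-study decomposition: h2 = 2 * (r_MZ - r_DZ)."""
    h2 = 2 * (r_mz - r_dz)   # additive genetic component ("nature")
    c2 = r_mz - h2           # shared-environment component (family upbringing)
    e2 = 1 - r_mz            # unique environment, plus measurement error
    return h2, c2, e2

# Placeholder correlations, for illustration only -- not the values from Fall et al.
h2, c2, e2 = falconer_estimate(r_mz=0.55, r_dz=0.30)
print(f"genes ~{h2:.0%}, shared environment ~{c2:.0%}, unique environment ~{e2:.0%}")

The logic is simple: both kinds of twins usually share a household, so anything identical twins agree on more often than fraternal twins do is presumed to reflect their extra shared genes.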

"We were surprised to see that a person's genetic make-up appears to be a significant influence in whether they own a dog," said Tove Fall, professor of medical epidemiology and lead author of the study.  "As such, these findings have major implications in several different fields related to understanding dog-human interaction throughout history and in modern times.  Although dogs and other pets are common household members across the globe, little is known how they impact our daily life and health.  Perhaps some people have a higher innate propensity to care for a pet than others."


"The study has major implications for understanding the deep and enigmatic history of dog domestication" said Keith Dobney, Chair of Human Palaeoecology in the Department of Archaeology, Classics and Egyptology at the University of Liverpool, who co-authored the study.  "Decades of archaeological research have helped us construct a better picture of where and when dogs entered into the human world, but modern and ancient genetic data are now allowing us to directly explore why and how."

My sense of why I have dogs is that I've somehow become convinced that my house just isn't filthy enough.  The entryway from our back yard was once off-white linoleum, but years of tracking by various dogs we've owned has left it way more off than white.  And I'm assuming the indoor-outdoor carpeting in our basement has been tan all along, but who knows?

Of course, we own dogs for more than just the random carpet stains and pieces of dismembered squirrel.  They're sweet, cuddly, love to play, and have a boundless enthusiasm for enjoying life that I can only aspire to.  Every time we lose one -- inevitable, given their shorter life span -- it leaves me grieving for months.  And while I've sometimes contemplated not replacing them when they're gone, within the year I'm already perusing PetFinder looking for another rescue puppy to give a home.

Unsurprisingly, given the Fall et al. study, my parents were also major dog lovers.  My dad had a little terrier named Max whom he adopted while working for the post office as a letter carrier.  He had a walking route, and Max would meet him every morning and follow him the entire way.  My dad started bringing snacks for him, and eventually Max's owners said, "You know, we don't have time for him anyway, would you like to take Max home?"  My dad agreed -- and his canine pal spent the rest of his life following him around, even after Max had gone completely blind from cataracts.  In fact, Max would walk behind my dad, keeping track of where he was by sound and smell, and when my dad stopped Max would keep going till he bumped into my dad's leg, then stand there, nose pressed against him, until he started walking again.

Old habits die hard.

Anyhow, this gives us a new perspective on dog ownership, and the strange relationship between our genes and our behavior.  But I need to wind this up, because Guinness wants to play ball, and track more mud around the basement.  You know how it goes.

***********************************

Back in 1989, the United States dodged a serious bullet.

One hundred wild monkeys were imported for experimental purposes, and housed in a laboratory facility in Reston, Virginia, outside of Washington DC.  Soon afterwards, the monkeys started showing some odd and frightening symptoms.  They'd spike a fever, become listless and glassy-eyed, and at the end would "bleed out" -- capillaries would start rupturing all over their body, and they'd bleed from every orifice including the pores of the skin.

Precautions were taken, but at first the researchers weren't overly concerned.  Most viruses have a feature called host specificity, which means that they tend to be infectious only in one species of host.  (This is why you don't need to worry about catching canine distemper, and your dog doesn't need to worry about catching your cold.)

It wasn't until someone noticed the parallels with a (then) obscure viral outbreak in 1976 in Zaire (now the Democratic Republic of the Congo) that the researchers realized things might be much more serious.  To see why, let me just say that the 1976 epidemic, which completely wiped out three villages, occurred on...

... the Ebola River.

Of course, you know that the feared introduction of this deadly virus into the United States didn't happen.  But to find out why -- and to find out just how lucky we were -- you should read Richard Preston's book The Hot Zone.  It's a brilliantly-written book detailing the closest we've come in recent years to a pandemic, and that from a virus that carries with it a 95% mortality rate.  (One comment: the first two chapters of this book require a bit of a strong stomach.  While Preston doesn't go out of his way to be graphic, the horrifying nature of this disease makes some nauseating descriptions inevitable.)

[Note:  If you purchase this book through the image/link below, part of the proceeds will go to supporting Skeptophilia!]





Thursday, November 7, 2013

Math, nature, nurture, and effort

The Atlantic ran a story last week by Miles Kimball and Noah Smith called "The Myth of 'I'm Bad at Math.'"  In it, we get the hopeful message that people who have claimed all along that they are "bad at math" may not be, that ability at mathematics comes from hard work, not genetics.

(Photograph courtesy of AdamK and the Wikimedia Commons)

They cite a number of sources (and their own experience with educating students) in supporting their assertion.  The most interesting evidence comes from a study at Columbia University by Lisa Blackwell, Kali Trzesniewski, and Carol Dweck, which showed that students who agreed with the statement "You can greatly change how intelligent you are" achieved higher grades than those who agreed with the statement "You have a certain amount of intelligence, and you can't really do much to change it."  Further, convincing students who agreed with the second statement that intelligence was actually under their control had the effect of raising their grades -- and their self-confidence.

On one level, this is hardly surprising.  No one seriously believes that intelligence, or even a more limited slice of it -- like mathematical ability -- is entirely inborn.  We all know examples of people who seem to have a great deal of talent but who are lazy and never develop it.  The authors cite Japanese culture as one in which hard work is valued above innate talent, and imply that this is one of the reasons Japanese children score, on average, better than American children on math assessments.  Kimball and Smith state, in their closing paragraph,
Math education, we believe, is just the most glaring area of a slow and worrying shift. We see our country moving away from a culture of hard work toward a culture of belief in genetic determinism. In the debate between “nature vs. nurture,” a critical third element—personal perseverance and effort—seems to have been sidelined. We want to bring it back, and we think that math is the best place to start.
And while I agree with their general conclusion -- that everyone could probably do with putting out a great deal more effort -- I can't help but think that Kimball and Smith are overstating their case.

I have a 27-year-long baseline of watching students attempt to master technical concepts, and there is a real difference in the native ability students bring to bear on the topics they are trying to learn.  I still remember one young lady, in one of my AP Biology classes years ago, who spent many frustrated hours attempting to master statistical genetics, and who failed fairly catastrophically.  Her habit of hard work, and an excellent ability with verbal information, led to success in most of the other areas we studied -- areas in which a capacity for remembering names of things, and the connections between them, matters more than a quantitative sense.  But in statistical genetics, where you have to be able to understand how numbers work on a very fundamental level, that combination of hard work and verbal ability didn't help.

I recall her saying to me one day, after an hour-long fruitless attempt to understand how the Bateson-Punnett method of mapping genes works, "I guess I just have a genetics-proof brain."

In no activity during the year in my introductory biology class do I notice this dichotomy between the math brains and the math-proof brains more than in the one we did last week.  It's a common lab, and I bet many of you did it when you were in high school.  Cubes of raw potato (or some other absorbent material) of different sizes are soaked in iodine solution (or some other dye), and after a given time, they're cut in half to see how far the dye has diffused into the cubes.  After a series of calculations, you arrive at the far-reaching (and rather counter-intuitive) conclusion that small cubes have a much larger ratio of surface area to volume than big ones do, and as a result, diffusion is way less efficient for big cubes.  This is one of the reasons that the cells of a whale, a human, and a mouse are all about the same size (really freakin' small) -- any larger, and transport would be hindered by their low surface-area-to-volume ratio.
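If you'd like to see the numbers behind that counter-intuitive conclusion, here's a minimal sketch in Python -- nothing fancy, just the same cube arithmetic the kids do by hand:

def surface_to_volume(side):
    """Surface-area-to-volume ratio of a cube with the given side length."""
    surface_area = 6 * side ** 2
    volume = side ** 3
    return surface_area / volume   # algebraically, this is just 6 / side

for side in (1, 2, 5, 10):         # think centimeters, like the potato cubes
    print(f"side = {side:>2}   SA:V = {surface_to_volume(side):.1f}")

Double the side and the ratio is cut in half; make the cube ten times bigger and every unit of volume has only a tenth as much surface to absorb through -- which is exactly why the dye barely makes it into the big cubes, and why cells can't afford to get much bigger.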

The calculations aren't hard, but I see many kids losing the forest for the trees.  Quickly.  Which kids get lost seems to have little to do with effort level, and almost nothing to do with verbal ability.  I can typically divide the class into two sections -- the group that will get the concept quickly and easily (usually with a delighted, "Oh!  Wow!  That's cool!"), and the group that, even after slogging their way through the calculations, still won't see the point -- sometimes, not even after I explain it to them.  Which kids land in which group seems to have nothing to do with their grades on prior tasks -- or with the effort they exert.

It's ironic that nearly simultaneously with the article in The Atlantic, a paper was published in PNAS (The Proceedings of the National Academy of Sciences) by Ariel Starr, Melissa Libertus, and Elizabeth Brannon, of Duke University.  Entitled "Number Sense in Infancy Predicts Mathematical Ability in Childhood," the study by Starr et al. tells us something fascinating -- that a "preverbal number sense" in infants, who have never manipulated numbers before, predicts their score on standardized math assessments three years later.

Here's how Rachel Nuwer of Science Now describes the experiment:
The researchers showed the babies opposing images of two sets of dots that flashed before them on a screen. One side of the screen always contained 10 dots, which were arranged in various patterns. The other side alternated between 10 and 20 dots, also arranged in various patterns. The team tracked the infants’ gaze—a common method for judging infant cognition—to see which set of dots they preferred to watch. Babies prefer to look at new things to old things, so the pattern of dots that flashed between arrays of 10 and 20 should appear more interesting to infants because the dots were changing not just in position, but in number. Both screens changed dot position simultaneously, so in theory, the flashing pattern changes were equally distracting. If an infant indicated that she picked up on the difference in dot numbers by preferentially staring at the 10- and 20-dot side of the screen, the researchers concluded that her intuitive number sense was at work.
Three years later, the children who achieved the best scores on preschool math assessments were, to a great degree, the ones who had shown innate mathematical sense as infants.

Now, I don't want to imply that hard work isn't important; there's a lot to be gained by effort, and I suspect that even my long-ago student with the "genetics-proof brain" would have gotten it had she persisted.  But Kimball and Smith's assertion, that hard work can trump innate ability, may simply be factually incorrect.  The bottom line may be that everyone can, in principle, learn differential calculus, but the hard-wiring of our brains is probably different enough that for some of us, the effort and time required would look like the limit of an exponential function as t approaches infinity.