Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Friday, August 8, 2025

The four-alarm fire

I present to you three recent articles with a linked theme.

The first is about a study at the Walter and Eliza Hall Institute of Medical Research in Australia, and describes an mRNA vaccine that appears to be capable of stopping malaria in its tracks.  The impact of malaria is astonishing, a fact that often escapes the notice of those of us who live in temperate parts of the world where it doesn't occur.  I still remember my shock when one of my biology professors asked what species of animal has caused more human deaths than any other -- in fact, more than all the other animals combined.

Turns out, of course, it's the mosquito.  Between malaria, yellow fever, dengue, and a host of other less-common diseases like chikungunya, eastern equine encephalitis, and West Nile virus, mosquitoes (actually several species, but lumped together for the sake of simplicity) have by far outstripped all other animals in their negative impact on humans.  And of the diseases they carry, malaria is the worst, infecting an estimated three hundred million people per year and causing six hundred thousand annual fatalities.

The new vaccine is, like the COVID-19 vaccine, built around a piece of messenger RNA.  In this case, it targets a gene in the malaria parasite that is essential to the pathogen's reproduction within the mosquito.

[Image licensed under the Creative Commons Supyyyy, Double-stranded RNA, CC BY-SA 4.0]

In preclinical trials, the vaccine caused a 99.7% drop in transmission rates.  The potential impact of a therapy with this efficacy is astronomical, especially given that post-infection medical treatment for malaria is of limited benefit -- and has to be administered for the remainder of the patient's life.  A vaccine that could stop malaria transmission almost completely would have as great a positive effect on life in equatorial regions of the world as the smallpox and polio vaccines did globally in the twentieth century.

The second is a series of studies having to do with the use of mRNA vaccines to target cancer.  The difficulty with conventional chemotherapy is that it's hard to find chemicals that kill tumor cells without damaging your own tissues; as I'm sure many of you know all too well, chemotherapy drugs often come along with miserable and long-lasting side effects.  The advantage of mRNA cancer treatments is that the strand of mRNA can be designed to target tumor-specific antigens, turning the treatment into what amounts to a "smart bomb" that destroys cancerous tissue without harming the rest of the body.  The approach has been demonstrated to be useful against a variety of types of cancer, including the deadly and extremely hard-to-treat pancreatic cancer.  There has even been dramatic work raising the possibility of a universal cancer vaccine -- something about which University of Florida researcher Duane Mitchell said, "What we found is by using a vaccine designed not to target cancer specifically but rather to stimulate a strong immunologic response, we could elicit a very strong anticancer reaction.  And so this has significant potential to be broadly used across cancer patients — even possibly leading us to an off-the-shelf cancer vaccine."

The third is that the Secretary of Health and Human Services, Robert F. Kennedy Jr., just announced that he's canceling five hundred million dollars in funding for the development of mRNA vaccines.

Let me be blunt, here.

This action will kill people.

Not that RFK cares.  His dangerous lies were directly responsible for the vaccine avoidance that caused a devastating outbreak of measles in Samoa that killed eighty people, mostly children -- an action for which he has yet to take responsibility.  (This, of course, is hardly surprising; "It's someone else's fault" should be the new motto of the GOP.)

RFK has built his entire stance on lies.  He called the COVID-19 vaccine "the deadliest vaccine ever made," despite the CDC finding that vaccination saved more than two hundred thousand lives during the peak of the pandemic.  He has claimed without any scientific basis that all mRNA vaccines are dangerous, and in fact has talked about it in such a way as to lead people to believe that mRNA itself is a dangerous chemical, despite the fact that anyone who passed high school biology should recognize how ridiculous this is.  (I actually saw someone post, apparently seriously, that they would "never allow mRNA in their body," to which I responded, "good luck with that.")

I know there's some stiff competition, but I think RFK would top the list of the Most Dangerous Trump Appointees.  His fear-based, anti-science policies are going to directly result in deaths -- if we're lucky, it'll only be in the thousands, but if we have another pandemic, it could well be in the millions.  The scariest part is that I have no idea what we can do about it.  Besides not taking responsibility, the other thing the Republicans seem to be awfully good at is not bowing to pressure from knowledgeable experts.  In fact, being countered makes them double down and hang on even harder.

And can I point out here that almost half of the research funding RFK cut could be offset by canceling the plans for Trump's fucking Versailles-wannabe golden ballroom?

This is a four-alarm fire, and it seems like barely anyone is paying attention.  Certainly no one who can do anything about it.  This goes way beyond whether any of us will be able to get flu and COVID boosters this fall; this is about basic medical research that can save countless lives.  But ignorance and anti-science dogmatism are winning at the moment.

I just hope that we won't have to wait until a deadly global pandemic for people to wake up and start objecting -- and getting this ignorant, dramatically unqualified ideologue out of a position he never should have been appointed to in the first place.

****************************************


Saturday, October 5, 2024

The treadmill

I've mentioned before how my difficulties with math short-circuited my goal of becoming a researcher in physics, but the truth is, there's more to the story than that.

Even after I realized that I didn't have the mathematical ability -- nor, honestly, enough interest and focus to overcome my challenges -- I still had every intention of pursuing a career in science.  I spent some time in the graduate school of oceanography at the University of Washington, and from there switched to biology, but I found neither to be a good fit.  It wasn't a lack of interest in the disciplines; biology, in fact, is still a deep and abiding fascination to this day, and I ultimately spent over three decades teaching the subject to high schoolers.  What bothered me was the publish-or-perish atmosphere that permeated all of research science.  I still recall my shock when one of our professors said, "Scientists spend 25% of their time doing the research they're interested in, and 75% of their time trying to beat everyone else in the field to grant money so they don't starve to death."

It's hard to pinpoint an exact moment that brought me to the realization that the career I'd always dreamed of wasn't for me -- but this was certainly one of the times I said, "Okay, now, just hang on a moment."

I'm not alone in having issues with this.  The brilliant theoretical physicist Sabine Hossenfelder did a video on her YouTube channel called "My Dream Died, and Now I'm Here" that's a blistering indictment of the entire edifice of research science.  Hossenfelder has the following to say about how science is currently done:

It was a rude awakening to realize that this institute [where she had her first job in physics research] wasn't about knowledge discovery, it was about money-making.  And the more I saw of academia, the more I realized it wasn't just this particular institute and this particular professor.  It was generally the case.  The moment you put people into big institutions, the goal shifts from knowledge discovery to money-making.  Here's how this works:

If a researcher gets a scholarship or research grant, the institution gets part of that money.  It's called the "overhead."  Technically, that's meant to pay for offices and equipment and administration.  But academic institutions pay part of their staff from this overhead, so they need to keep that overhead coming.  Small scholarships don't make much money, but big research grants can be tens of millions of dollars.  And the overhead can be anything between fifteen and fifty percent.  This is why research institutions exert loads of pressure on researchers to bring in grant money.  And partly, they do this by keeping the researchers on temporary contracts so that they need grants to get paid themselves...  And the overhead isn't even the real problem.  The real problem is that the easiest way to grow in academia is to pay other people to produce papers on which you, as the grant holder, can put your name.  That's how academia works.  Grants pay students and postdocs to produce research papers for the grant holder.  And those papers are what the supervisor then uses to apply for more grants.  The result is a paper-production machine in which students and postdocs are burnt through to bring in money for the institution...

I began to understand what you need to do to get a grant or to get hired.  You have to work on topics that are mainstream enough but not too mainstream.  You want them to be a little bit edgy, but not too edgy.  It needs to be something that fits into the existing machinery.  And since most grants are three years, or five years at most, it also needs to be something that can be wrapped up quickly...

The more I saw of the foundations of physics, the more I became convinced that the research there wasn't based upon sound scientific principles...  [Most researchers today] are only interested in writing more papers...  To get grants.  To get postdocs.  To write more papers.  To get more grants.  And round and round it goes.

The topic comes up today because of two separate studies, both released in the last two weeks, that illustrate a hard truth the scientific establishment as a whole has yet to acknowledge: there's a real human cost to putting talented, creative, bright people on the kind of treadmill Hossenfelder describes.

[Image licensed under the Creative Commons Doenertier82, Phodopus sungorus - Hamsterkraftwerk, CC BY-SA 3.0]

The first study, from a group in Sweden, found that simply pursuing a Ph.D. takes a tremendous toll on mental health, and instead of there being a "light at the end of the tunnel," the toll worsens as the end of the work approaches.  By the fifth year of doctoral study, the likelihood of a student using mental-health medications rises by forty percent.  It's no surprise why; once the Ph.D. is achieved, there's the looming stress of finding a postdoc position, and then after that the savage competition for the few stable, tenure-track research positions out there in academia.  "You need to generate data as quickly as possible, and the feeling of competition for funding and jobs can be very strong, even early in your PhD.," said Rituja Bisen, a fifth-year Ph.D. student in neurobiology at the University of Würzburg.  "Afterward, many of us have to move long distances, even out of the country, to find a worthwhile position.  And even then, there's no guarantee.  It doesn’t matter how good a lab is; if it’s coming out of a toxic work culture, it isn’t worth it in the long run."

The other study, out of Poland (but involving worldwide data), is perhaps even more damning; over fifty percent of researchers leave science entirely within ten years of publishing their first academic paper.

You spend huge amounts of money on graduate school, work your ass off to get a Ph.D. and then a position as a researcher, and after all that -- you find that (1) the stress isn't worth it, (2) you're barely making enough money to get by, and (3) the competition for grants is only going to get worse over time.  It's not surprising that people decide to leave research for other career options.

But how heartbreaking is it that we're doing this to the best and brightest minds on the planet?

And the problem is even more drastic for women and minorities; for them, the number still left publishing after ten years is more like thirty percent of the ones who started.

How far would we have advanced in our understanding of how the universe works if the system itself wasn't strangling the scientists?

Back when modern science got its start, in the seventeenth and eighteenth centuries, it was the province of the rich; only people who were already independently wealthy had the wherewithal to (1) get a college education, and afterward (2) spend their time messing about in laboratories.  There were exceptions -- Michael Faraday comes to mind -- but by and large, scientific inquiry was confined to the gentry.

Now, we have the appearance of a more open, egalitarian model, but at its basis, the whole enterprise still depends on institutions competing for money, and the people actually doing the research (i.e. the scientists) being worked to the bone to keep the whole superstructure running.

It's a horrible problem, and one I don't see changing until our attitudes shift -- until we start prioritizing the advancement of knowledge over academia-for-profit.  Or, perhaps, until our governments recognize how absolutely critical science is, and fund that over the current goals of fostering corporate capitalism to benefit the extremely wealthy and developing newer and better ways to kill those we perceive as our enemies.

I've heard a lot of talk about how prescient Star Trek was -- we now have something very like their communicators and supercomputers, and aren't far away from tricorders.  But we won't actually get there until we develop one other thing, and I'm not talking about warp drives or holodecks.

I'm talking about valuing science, and scientists, as being the pinnacle of what we as a species can achieve, and creating a system to provide the resources to support them instead of doing everything humanly possible to drive them away.

****************************************


Tuesday, September 3, 2024

The problem with research

If there's one phrase that torques the absolute hell out of me -- and just about every actual scientist out there -- it's, "Well, I did my research."

Oh, you did, did you?  What lab did you do your research in?  Or was it field work?  Let's see your data!  Which peer-reviewed journal published your research?  How many times has it been cited in other scientific journals?

Part of the problem, of course, is that -- like a lot of words in the English language; "theory" and "proof" are two examples that come to mind -- the word "research" is used one way by actual researchers and a different way by most other people.  We were taught the alternate definition of "research" in grade school, when we were assigned "research papers," which meant "go out and look up stuff other people have found out on the topic, and summarize it in your own words."  There's value in doing this; it's a good starting place for understanding a subject, and it's honestly where we all began with scholarship.

The problem is -- and it exists even at the grade-school level of inquiry -- this kind of "research" is only as good as the sources you choose.  When I was a teacher, one of the hardest things to get students to understand was that all sources are not created equal.  A paper in Science, or even the layperson's version of it in Scientific American or Discover, is head-and-shoulders above the meanderings of Some Random Guy in his blog.  (And yes, I'm well aware that this pronouncement is being made by Some Random Guy in his blog.)

That doesn't mean those less-reputable sources are necessarily wrong, of course.  It's more that they can't be relied upon.  While papers in Science (and other comparable journals) are occasionally retracted for errors or inaccuracies, there is a vetting process that makes their likelihood of being correct vastly higher.  After all, any oddball with a computer can create a website, and post whatever they want on it, be it brilliant posts about cutting-edge science or the looniest of wingnuttery.

The confusion between the two definitions of the word research has the effect of increasing people's confidence in the kind we were all doing in middle school, and giving that low-level snooping about an undeserved gloss of reputability.  This was the upshot of a paper in Nature (peer-reviewed science, that), by Kevin Aslett of the University of Central Florida et al., entitled, "Online Searches to Evaluate Misinformation Can Increase Its Perceived Veracity."  Their results are kind of terrifying, if not unexpected given the "post-truth society" we've somehow slid into.  The authors write:

Although conventional wisdom suggests that searching online when evaluating misinformation would reduce belief in it... across five experiments, we present consistent evidence that online search to evaluate the truthfulness of false news articles actually increases the probability of believing them...  We find that the search effect is concentrated among individuals for whom search engines return lower-quality information.  Our results indicate that those who search online to evaluate misinformation risk falling into data voids, or informational spaces in which there is corroborating evidence from low-quality sources. 

The tendency appears to be that when someone is "doing their research" on a controversial subject, what they do is an online search, pursued until they find two or three hits on sources that corroborate what they already believed, and that strengthens their conviction that they were right in the first place.  The study found that very little attention was usually given to the quality of those sources, or where those sources got the information themselves.  If it makes the "researcher" nod sagely and say, "Yeah, that's what I thought," it doesn't matter if the information came from NASA -- or from QAnon.

The problem is, a lot of those bogus sources can look convincing. 

Other times, of course, all you have to be able to do is add two-digit numbers to realize that they're full of shit.

People see data in some online source, and rarely consider (1) who collected the data and why, (2) how it was analyzed, (3) what information wasn't included in the analysis, and (4) whether it was verified, and if so how and by whom.  I first ran into the old joke about "73.4% of all statistics are made up on the spot" years ago, and it's still funny, even if our laughs are rather wry these days.  Sites like Natural News, Food Babe, Before It's News, Breitbart.com, Mercola.com, InfoWars, One America News, and even a few with scholarly-sounding names -- The Society for Scientific Exploration, Evolution News, and The American College of Pediatricians, to pick three examples -- are clearinghouses for fringe-y and discredited ideas, often backed up by data that's either cherry-picked and misrepresented, or from sources even further down the ladder of sketchy credibility.

Given how much bullshit is out there,  a lot of it well-hidden behind facts, figures, and fancy writing, it can be a challenge for laypeople (and I very much count myself amongst their numbers) to discern truth from fiction.  It's also an uphill struggle to fight against the very natural human tendency of confirmation bias; we all would love it if our cherished notions of how the world works were one hundred percent correct.  But if we want to make smart decisions, we all need to stop saying "I did my research" when all that "research" involved was a twenty-minute Google search to find the website of some random crank who confirmed what we already believed.

Remember, as the brilliant journalist Kathryn Schulz points out, that one of the most mind-expanding and liberating things we can say is, "I don't know.  Maybe I'm wrong."  And to start from that open-minded perspective and find out what the facts really are -- from the actual researchers.

****************************************


Thursday, September 16, 2021

Bias amplification

Last week I had a frustrating exchange with an acquaintance over the safety of the COVID-19 vaccine.

He'd posted on social media a meme with the gist that there'd been so much waffling and we're-not-sure-ing by the medical establishment that you couldn't trust anything they said.  I guess he'd seen me post something just a few minutes earlier and knew I was online, because shortly afterward he DMd me.

"I've been waiting for you to jump in with your two cents' worth," he said.

I guess I was in a pissy mood -- and to be honest, anti-vaxx stuff does that to me anyhow.  I know about a dozen people who've contracted COVID, two of whom died of it (both members of my graduating class in high school), and in my opinion any potential side-effects from the vaccine are insignificant compared to ending your life on a ventilator.

"Why bother?" I snapped at him.  "Nothing I say to you is going to make the slightest bit of difference.  It's a waste of time arguing."

He started in on how "he'd done his research" and "just wasn't convinced it was safe" and "the medical establishment gets rich off keeping people sick."  I snarled, "Thanks for making my point" and exited the conversation.


It's kind of maddening to be told "I've done my research" by someone who not only has never set foot in a scientific laboratory, but hasn't even bothered to read peer-reviewed papers on the topic.  Sorry, scrolling through Google, YouTube, and Reddit -- and watching Fox News -- is not research.

Unlike a lot of anti-science stances, this one is costing lives.  Every single day I see news stories about people who have become grievously ill with COVID, and about relatives telling tearful stories, after a patient's death, of how much he or she regretted not getting the vaccine.  Today's installment comes from a man in Tennessee who has been in the hospital for three weeks and is still on oxygen: "They told us not to worry, that it was just a bad cold.  They lied."

The problem is -- like my acquaintance's stubbornly self-confident "I've done my research" comment -- fighting this is a Sisyphean task.  If you think I'm exaggerating, check out the paper that came out this week in the Journal of the European Economic Association, describing some (actual, peer-reviewed) research showing that not only do we tend to gloss over evidence contradicting our preferred beliefs, but when we then share those beliefs with others, our certainty that we're right increases whether or not the people we're talking to agree with us.

The phenomenon, which has been called bias amplification, is like confirmation bias on steroids.  "This experiment supports a lot of popular suspicions about why biased beliefs might be getting worse in the age of the internet," said Ryan Oprea, who co-authored the study.  "We now get a lot of information from social media and we don't know much about the quality of the information we're getting.  As a result, we're often forced to decide for ourselves how accurate various opinions and sources of information are and how much stock to put in them.  Our results suggest that people resolve this quandary by assigning credibility to sources that are telling us what we'd like to hear and this can make biases due to motivated reasoning a lot worse over time."

I don't even begin to know how to combat this.  The problem is, most laypeople (and I very much include myself in this) lack the expertise to comprehend a lot of peer-reviewed research on immunology, which is usually filled with technical jargon and abstruse details of biochemistry.  And every step you take away from the actual research -- from university or research-lab press releases, to summaries in popular science magazines, to blurbs in ordinary media, to Some Guy's blog -- introduces more opinions, oversimplifications, and outright misinformation.

And I'm completely aware that Skeptophilia is also Some Guy's blog.  I will say in my own defense, however, that I do try to base what I write on the actual research, not on Tucker Carlson quoting Nicki Minaj's tweets about how her boyfriend got the COVID vaccine and afterward his balls swelled up.  (No, I am not making this up.)

So that's today's rather discouraging scientific study.  It's sad that so many of us have to become gravely ill, or watch someone we love die in agony, before we'll admit that we might have been wrong.  I'll just end with what the research -- from the scientists themselves -- has to say: the COVID vaccines are safe and effective, and the vast majority of people who have had severe COVID are unvaccinated.  The "breakthrough cases" of vaccinated people testing positive almost never result in hospitalization, and when they do, it's because of comorbidities.

But don't take my word for it.  If you honestly want to know what the research says, and you're willing to keep an open mind on the topic and shape your opinion based upon the evidence, start here.  And after that, go out and get the fucking vaccine.

Seriously.

 **************************************

London in the nineteenth century was a seriously disgusting place to live, especially for the lower classes.  Sewage was dumped into gutters along the street; it then ran down into the ground -- the same ground from which residents pumped their drinking water.  The smell can only be imagined, but the prevalence of infectious water-borne diseases is a matter of record.

In 1854 a horrible epidemic of cholera hit central London, ultimately killing over six hundred people.  Because the most obvious unsanitary thing about the place was the smell, the leading thinkers of the time believed that cholera came from bad air -- the "miasmal model" of contagion.  But a doctor named John Snow thought it was water-borne, and through his tireless work, he was able to trace the entire epidemic to one hand-pumped well.  Finally, after much argument, the local authorities agreed to remove the handle of the well, and the epidemic ended only a few days afterward.

The work of John Snow led to a complete change in attitude toward sanitation, sewers, and safe drinking water, and in only a few years transformed the face of the city of London.  Snow, and the epidemic he halted, are the subject of the fantastic book The Ghost Map: The Story of London's Most Terrifying Epidemic -- and How It Changed Cities, Science, and the Modern World, by science historian Steven Johnson.  The detective work Snow undertook, and his tireless efforts to save the London poor from a horrible disease, make for fascinating reading, and shine a vivid light on what cities were like back when life for all but the wealthy was "solitary, poor, nasty, brutish, and short" (to swipe Thomas Hobbes's trenchant turn of phrase).

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]


Friday, August 6, 2021

Research and rabbit holes

I've suspected for a while that the FBI is keeping a file on me based upon my Google search history.

This, I suspect, is something that plagues a lot of writers, but it's really hit home apropos of my murder mystery series, The Snowe Agency Mysteries, the research for which has resulted in some searches that would look seriously sketchy to anyone who didn't know I'm a writer.  These have included:
  • What anesthetic available to a veterinarian would kill a human the most quickly?
  • How fast does a bubble of air injected into an artery kill someone?
  • Would the remains of a person poisoned to death twenty years ago still show traces of the poison?
  • The behavior of psychopathic individuals
  • The physiology of drowning
  • How hard does a person need to be hit in the back of the head to knock them unconscious?
To any would-be Sherlocks out there: allow me to assure you that I have never killed, nor am I planning on killing, anyone.


Writing takes you down some interesting rabbit holes, and I'm not just talking about writing mysteries.  One of the reasons I love writing fiction is that I learn so much in the process -- it gives me a chance to stretch my own brain a little.  Here are a few things I had to research for books I've written:
  • Living conditions in 14th century Norway (Lock & Key)
  • Communications and surveillance technology (Kill Switch)
  • Eighteenth-century land grants in the northeastern U.S. (Descent into Ulthoa)
  • Ancient Greek timekeeping devices (Gears)
  • Medieval Jewish mystical traditions (Sephirot)
  • Creatures from Japanese mythology (The Fifth Day)
  • The effects of untreated type-1 diabetes (Whistling in the Dark)
  • Viking ship design (Kári the Lucky)
  • The rate of spread of the Black Death in England (We All Fall Down)
  • The structure and furnishings in homes in nineteenth-century southern Louisiana (The Communion of Shadows)
  • How long hydropower electric plants would keep functioning if left unattended (In the Midst of Lions)
And that's just scratching the surface.

I was chatting with a friend and fellow author a couple of days ago, and commented that fiction should open up new worlds, that if my readers are the same when they close the book as they were when they opened it, I've failed as a writer.  However, writing also opens up new worlds for the writer, lets us explore topics we'd otherwise never look into.  (It's all too easy to get lost in research -- to intend to sit down and write, and suddenly three hours have gone by, and all you've done is jump from one abstruse website to another, as my friend and writing partner Cly Boehs would be happy to tell you.)

There are two things about learning: (1) it's fun. And (2) you're never done.  And when it comes to writing, there are always new areas to investigate, new worlds to create.

So many stories to tell, so little time.

**********************************************

Author and biochemist Camilla Pang was diagnosed with autism spectrum disorder at age eight, and spent most of her childhood baffled by the complexities and subtleties of human interactions.  She once asked her mother if there was an instruction manual on being human that she could read to make it easier.

Her mom said no, there was no instruction manual.

So years later, Pang recalled the incident and decided to write one.

The result, Explaining Humans: What Science Can Teach Us About Life, Love, and Relationships, is the best analysis of human behavior from a biological perspective since Desmond Morris's classic The Naked Ape.  If you're like me, you'll read Pang's book with a stunned smile on your face -- as she navigates through common, everyday behaviors we all engage in, but few of us stop to think about.

If you're interested in behavior or biology or simply agree with the Greek maxim "gnothi seauton" ("know yourself"), you need to put this book on your reading list.  It's absolutely outstanding.

[Note:  if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]


Monday, March 22, 2021

The imaginary scientist

The unfortunate reality is that in this "Age of Information," where we as a species have the ability to store, access, and transfer knowledge with a speed that fifty years ago would have been in the realm of science fiction, it is harder than ever to know what's true and what isn't.

The internet is as good a conduit of bullshit as it is of the truth.  Not only are there plenty of well-intentioned but ill-informed people, there are lots of folks who lie deliberately for their own ends -- monetary gain, power, influence, the dubious thrill of having pulled off a hoax, or just their "fifteen minutes of fame."  It used to be that in order to be successful, these purveyors of bad information had to go to the trouble and expense of writing a book, or at least of finding a way to get speaking engagements.  Now that anyone with money and access can own a webpage, there's nothing stopping cranks, liars, hoaxers, and the rest from getting their message out there to the entire electronic world simultaneously.

When I taught a high school course in critical thinking, one of my mantras was "check your sources."  If you find a claim online, where did it come from?  What is the originator's background -- does it seem like (s)he has sufficient knowledge and expertise?  Has it been checked and corroborated by others?  If it's from a journal, is it a peer-reviewed source -- or one of the all-too-common "pay to play" journals that will take damn near anything you write if you're willing to pay them to do it?  Does it line up with what we already know from science and history?  (Another mantra was "nearly every time someone claims 'this new theory will overturn everything we know about physics!', it turns out to be wrong.")

None of this guarantees that the claim is correct, of course; but using those questions as general guidelines will help you to navigate the intellectual minefield of science representation on the internet.

Except when it doesn't.

As an example of this, have you heard of Camille Noûs?

I hadn't, until I read a troubling story that appeared last week in Nature, written by Cathleen O'Grady.  Camille Noûs first showed up as a signatory on an open letter about science policy in France early last year, and since then has been listed as a co-author on no fewer than 180 different papers.  She?  He? -- the name "Camille" could be either, which I don't think is accidental -- has been racking up citation after citation, in a wide range of unrelated fields, including astrophysics, ecology, chemistry, and molecular biology.

Pretty impressive accomplishments in the world of research, where increasing specialization has resulted in what a friend of mine described as "researchers knowing more and more about less and less until finally they'll know everything about nothing."

[Image licensed under the Creative Commons Yakuzakorat, Scientists are working in the lab.9, CC BY 4.0]

This same narrowing of focus is why the red flag of Camille Noûs's ubiquity would never become apparent to many scientists; they might find the name over and over in papers in their own field of, say, evolutionary biology, and not realize -- probably never even see -- that Noûs had also, astonishingly, co-authored papers in medical biochemistry.

So what's going on here?

By this point, it probably will come as no shock that Camille Noûs doesn't exist.  The last name "Noûs" was chosen because "nous" means "we" in French, and is also a play on the Greek word νοῦς, which means "reason."  Noûs was the brainchild of RogueESR, a French science advocacy group, as a way to personify collective effort and knock the elitist attitude of some leading scientists down a peg.  RogueESR protested the cost-saving approach by many research institutions of eliminating tenure-track positions and making just about all available openings temporary, project-specific research posts, and they decided to come up with a moniker representing the human, group-cooperative side of science.

"Hundreds of articles will make this name the top author on the planet," they wrote in a newsletter, "with the consequence of distorting certain bibliometric statistics and demonstrating the absurdity of individual quantitative assessment."

Well, okay, I get the point.  At its best, science is a collective effort, and one should never lose sight of the fact that behind every technical paper there are creative, curious human minds who shouldn't be treated as expendable and replaceable cogs in a machine.  But the problem is, if you can't trust a paper in a major peer-reviewed journal to print the truth, who can you trust?  Yes, sometimes scientists make mistakes, and papers have to be retracted; but admitting an error, and publishing something that is known to be false up-front, are hardly the same thing.

Some journals are taking a stance on this issue, and are refusing to accept papers with Noûs's name on the list of authors, or at least agreeing to publish only if the name is removed.  But the fact that Noûs is already listed as an author on 180 papers -- and those papers are being cited in other papers, and round and round and round -- means that the imaginary author won't disappear any time soon.

While I certainly agree with the motives behind the protest, this is an ethically questionable way of approaching it.  There is already enough distrust of science and scientists by the general public; the very last thing we need is researchers including an out-and-out lie in their papers, however noble their intentions, however tongue-in-cheek the lie is.

The people who are joining the protest and adding Noûs to their author list need to find another way to make their opinions on the issue heard.

The reason we critical-thinking non-scientists always want people to go to the peer-reviewed research is that it is -- or should be -- the gold standard for representing the best, most thoroughly-tested, most comprehensive and accurate knowledge we currently have.  The Camille Noûs stunt weakens the whole enterprise.  "The campaign is naïve and ethically questionable," said Lisa Rasmussen, a bioethicist at the University of North Carolina - Charlotte.  "It flouts the basic principle of taking responsibility alongside the credit of authorship."

Which is exactly it.  I'll still rely on research in journals like Science and Nature when I want to be certain of my facts, but the whole incident brings home the unfortunate fact that even when you do your best to check your sources, you can still be led astray.  Science, however rigorous its methods, is still a human pursuit, and like all human pursuits, can be subject to bias, misjudgment, error -- and outright falsification, however well-intentioned.

******************************************

Last week's Skeptophilia book-of-the-week, Simon Singh's The Code Book, prompted a reader to respond, "Yes, but have you read his book on Fermat's Last Theorem?"

In this book, Singh turns his considerable writing skill toward the fascinating story of Pierre de Fermat, the seventeenth-century French mathematician who -- amongst many other contributions -- touched off over three hundred years of controversy by writing that there were no positive integer solutions to the equation a^n + b^n = c^n for any integer value of n greater than 2, then adding, "I have discovered a truly marvelous proof of this, which this margin is too narrow to contain," and proceeding to die before elaborating on what this "marvelous proof" might be.
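
For anyone who'd like the claim spelled out, here it is in modern notation (just a restatement for reference, not anything taken from Singh's book):

\[
  a^n + b^n = c^n \quad \text{has no solutions in positive integers } a,\, b,\, c \ \text{for any integer } n > 2,
\]
\[
  \text{whereas for } n = 2 \ \text{solutions abound:} \quad 3^2 + 4^2 = 5^2, \qquad 5^2 + 12^2 = 13^2.
\]

That contrast is exactly what makes the marginal note so tantalizing: bump the exponent past 2 and, Fermat claimed, the solutions vanish entirely.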

The attempts to recreate Fermat's proof -- or at least find an equivalent one -- began with Fermat's contemporaries Marin Mersenne, Blaise Pascal, and John Wallis, and continued for the next three centuries to stump the greatest minds in mathematics, Évariste Galois among them.  Fermat's conjecture was finally proven correct by Andrew Wiles in 1994.

Singh's book Fermat's Last Theorem: The Story of a Riddle that Confounded the World's Greatest Minds for 350 Years describes the hunt for a solution and the tapestry of personalities that took on the search -- ending with a tour-de-force paper by soft-spoken British mathematician Andrew Wiles.  It's a fascinating journey, as enjoyable for a curious layperson as it is for the mathematically inclined -- and in Singh's hands, makes for a story you will thoroughly enjoy.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]



Monday, September 16, 2019

The best of the worst

It's mid-September, so you know what that means:

It's time for this year's Ig Nobel Prizes.

The Ig Nobel Prizes celebrate the loopy side of science in a ceremony that has been taking place at Harvard University annually for the last 29 years.  The idea is to recognize research (and researchers) whose work is probably never going to receive an actual Nobel -- but deserves to be in the spotlight purely for the absurdity and humor value.


The winners each year get invited to a ceremony wherein they're wined and dined and given their cash prize (a $10 trillion bill from Zimbabwe, which is worth a few cents).  They then have to give an acceptance speech, which if it goes over sixty seconds is interrupted by an eight-year-old girl yelling, "Please stop, I'm bored" over and over until they give up.

As is usual with the Ig Nobel Ceremony, good times were had by all and sundry.  The audience is encouraged to participate by folding up their programs into paper airplanes and throwing them at the presenters, and needless to say they rose to the occasion.

So, without further ado, here are the 2019 winners:
  • Research finding that eating pizza is correlated with lower risk of dying of various diseases -- but only if the pizza is made and consumed in Italy.
  • A study showing that not only can dogs learn using "clicker training"; it also works for training orthopedic surgeons.
  • The finding that the degree of asymmetry in how low testicles hang on postmen (both clothed and naked) in France depends on the temperature.
  • A study to figure out how much drool is produced by an average five-year-old child.
  • An invention of a diaper-changing machine (now patented) by an American engineer.
  • A comparative study of paper money in various countries to find out which is the dirtiest.  (The winner was the Romanian leu.)
  • Research into finding which areas of the body are the most pleasurable to scratch.  (The back and the ankle, apparently.)
  • A study finding that holding a pen in your mouth to stretch your facial muscles in a "pseudo-smile" makes you happy -- then a further study finding that the opposite is true.
  • Research into why (and how) wombats make cube-shaped poop.
If you're so inclined, the link provided has further links to each of the academic papers that won prizes.

As is usual for the Ig Nobels, the studies that won raise more questions than they answer.  For example:
  • The asymmetrical-ball-hanging study immediately made me wonder, "why French postmen?"  Would we expect that there would be a different pattern amongst, say, Argentinian stockbrokers?  (Maybe because they're in the Southern Hemisphere?  You know, the testicular Coriolis effect, or something?)  Or were French postmen just the group they found that was the easiest to convince to drop trou in the name of science?
  • Who in their right mind would volunteer their child to test a diaper-changing machine?  That's the kind of thing I can imagine going very wrong.  Strangely hilarious, but very wrong.
  • If forcing a smile with a pen first worked and then didn't, why did they bother writing a paper about it?
  • The pizza experiment just sounds like the scientists taking advantage of an opportunity to use grant money to hang around in Italy going to restaurants.  Which places it squarely in the "why didn't I think of that first?" category.
  • Would you want to go under the knife with a surgeon who'd been trained using a clicker?  ("Good incision!"  *click click*  *trainer pulls the surgeon's mask aside and sticks a treat in his mouth*)
I suppose having further questions is a good thing in science, because after all that's how progress is made.  "One thing leads to another" is kind of the status quo.

On the other hand, I have a hard time seeing what more you could do with cubical wombat poop.

So that's this year's Ig Nobels.  Remember this next time you hear people say that scientists are humorless, pedantic geeks.  Anyone who can get a grant to investigate back scratches is okay in my book, because back scratches are fucking awesome.

**********************************

This week's Skeptophilia book recommendation made the cut more because I'd like to see what others think of it than because it bowled me over: Jacques Vallée's Passport to Magonia.

Vallée is an interesting fellow, and certainly comes with credentials; he has an M.S. in astrophysics from the University of Lille and a Ph.D. in computer science from Northwestern University.  He's at various times been an astronomer, a computer scientist, and a venture capitalist, and apparently was quite successful at all three.  But if you know his name, it's probably because of his connection to something else -- UFOs.

Vallée became interested in UFOs early, when he was 16 and saw one in his home town of Pontoise, France.  After earning his degree in astrophysics, he veered off into the study of the paranormal, especially allegations of alien visitation, associating himself with some pretty reputable folks (J. Allen Hynek, for example) and some seriously questionable ones (like the fraudulent Israeli spoon-bender, Uri Geller).

Vallée didn't really get the proof he was looking for (of course, because if he had we'd probably all know about it), but his decades of research compiled literally hundreds -- perhaps thousands -- of alleged sightings and abductions.  And that's what Passport to Magonia is about.  To Vallée's credit, he doesn't try to explain them -- he doesn't have a favorite hypothesis he's trying to convince you of -- he simply points out two things: (1) the number of claims from otherwise reliable and sane folks is too high for there not to be something to it; and (2) the similarity between the claims, going all the way back to medieval accounts of abductions by spirits and "elementals," is great enough to be significant.

I'm not saying I necessarily agree with him, but his book is lucid and fascinating, and the case studies he cites make for pretty interesting reading.  I'd be curious to see what other Skeptophiles think of his work.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]






Saturday, June 29, 2019

The biochemical symphony

Sometimes I run into a piece of scientific research that's so odd and charming that I just have to tell you about it.

Take, for example, the paper that appeared in ACS Nano this week, which ties together two of my favorite things -- biology and music.  It has the imposing title "A Self-Consistent Sonification Method to Translate Amino Acid Sequences into Musical Compositions and Application in Protein Design Using Artificial Intelligence," and was authored by Chi-Hua Yu, Zhao Qin, Francisco J. Martin-Martinez, and Markus J. Buehler, all of the Massachusetts Institute of Technology.  Their research uses a fascinating lens to study protein structure: converting the amino acid sequence and structure of a protein into music, then having AI software study the musical pattern that results, as a way of learning more about how proteins function -- and how that function might be altered.

What's cool is that the musical note that represents each amino acid isn't randomly chosen.  It's based on the amino acid's actual quantum vibrational frequency.  So when you listen to it, you're not just hearing a whimsical combination of notes based on something from nature; you're actually hearing the protein itself.
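
Just to make the mapping concrete, here's a minimal sketch of the idea in Python.  The pitch table is entirely invented for illustration -- the actual Yu et al. scheme derives each note from the amino acid's measured vibrational spectrum, and handles rhythm and structure besides -- but it shows the basic move: amino acid sequence in, melody out.

# Toy "sonification": each amino acid gets a pitch, so a protein sequence
# becomes a sequence of notes.  The frequencies below are placeholders,
# NOT the values used in the actual study.

AMINO_ACID_PITCH_HZ = {
    "G": 261.6,  # glycine   -> roughly C4
    "A": 293.7,  # alanine   -> roughly D4
    "S": 329.6,  # serine    -> roughly E4
    "P": 349.2,  # proline   -> roughly F4
    "Q": 392.0,  # glutamine -> roughly G4
    "Y": 440.0,  # tyrosine  -> A4
    "L": 493.9,  # leucine   -> roughly B4
}

def sonify(sequence):
    """Turn a sequence of one-letter amino acid codes into a list of pitches (Hz),
    skipping any residue this toy table doesn't cover."""
    return [AMINO_ACID_PITCH_HZ[aa] for aa in sequence.upper() if aa in AMINO_ACID_PITCH_HZ]

if __name__ == "__main__":
    # A short stretch echoing the glycine/proline-rich repeats of spider silk
    fragment = "GPGQQGPGGYGPGQQ"
    print(sonify(fragment))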

[Image licensed under the Creative Commons © Nevit Dilmen, Music 01754, CC BY-SA 3.0]

In an article about the research in MIT News, written by David L. Chandler, you can hear clips from the Yu et al. study.  I recommend the second one especially -- the one titled "An Orchestra of Amino Acids" -- which is a "sonification" of spider silk protein.  The strange, percussive rhythm is kind of mesmerizing, and if someone had told me that it was a composition by an avant-garde modern composer -- Philip Glass, perhaps, or Steve Reich -- I would have believed it without question.  But what's coolest about this is that the music actually means something beyond the sound.  The AI is now able to discern the difference between some basic protein structures, including two of the most common -- the alpha-helix (shaped like a spring) and the beta-pleated sheet (shaped like the pleats on a kilt) -- because they sound different.  This gives us a lens into protein function that we didn't have before.  "[Proteins] have their own language, and we don’t know how it works," said Markus Buehler, who co-authored the study.  "We don’t know what makes a silk protein a silk protein or what patterns reflect the functions found in an enzyme.  We don’t know the code."

But this is exactly what the AI, and the scientists running it, hope to find out.  "When you look at a molecule in a textbook, it’s static," Buehler said.  "But it’s not static at all. It’s moving and vibrating. Every bit of matter is a set of vibrations.  And we can use this concept as a way of describing matter."

This new approach has impressed a lot of people not only for its potential applications, but from how amazingly creative it is.  This is why it drives me nuts when people say that science isn't a creative process.  They apparently have the impression that science is pure grunt work, inoculating petri dishes, looking at data from particle accelerators, analyzing rock layers.  But at its heart, the best science is about making connections between disparate ideas -- just like this research does -- and is as deeply creative as writing a symphony.

"Markus Buehler has been gifted with a most creative soul, and his explorations into the inner workings of biomolecules are advancing our understanding of the mechanical response of biological materials in a most significant manner," said Marc Meyers, professor of materials science at the University of California at San Diego, who was not involved in this work.  "The focusing of this imagination to music is a novel and intriguing direction.  his is experimental music at its best.  The rhythms of life, including the pulsations of our heart, were the initial sources of repetitive sounds that engendered the marvelous world of music.  Markus has descended into the nanospace to extract the rhythms of the amino acids, the building blocks of life."

What is most amazing about this is the potential for the AI, once trained, to go in reverse -- to be given an altered musical pattern, and to predict from that what a protein engineered from that music would do.  Proteins are perhaps the most fundamental pieces of living things; the majority of genes do what they do by making proteins, which then guide processes within the organism (including, frequently, affecting other genes).  The idea that we could use music as a lens into how our biochemistry works is kind of stunning.

So that's your science-is-so-freaking-cool moment for the day.  I peruse the science news pretty much daily, looking for intriguing new research, but this one's gonna be hard to top.  Now I think I'm going to go back to the paper and click on the sound links -- and listen to the proteins sing.

***************************************

Richard Dawkins is a name that often sets people's teeth on edge.  However, the combative evolutionary biologist, whose no-holds-barred approach to young-Earth creationists has given him a well-deserved reputation for being unequivocally devoted to evidence-based science and an almost-as-well-deserved reputation for being hostile to religion in general, has written a number of books that are must-reads for anyone interested in the history of life on Earth -- The Blind Watchmaker, Unweaving the Rainbow, Climbing Mount Improbable, and (most of all) The Ancestor's Tale.

I recently read a series of essays by Dawkins, collectively called A Devil's Chaplain, and it's well worth checking out, whatever you think of the author's forthrightness.  From the title, I expected a bunch of anti-religious screeds, and I was pleased to see that they were more about science and education, and written in Dawkins's signature lucid, readable style.  They're all good, but a few are sheer brilliance -- his piece, "The Joy of Living Dangerously," about the right way to approach teaching, should be required reading in every teacher-education program in the world, and "The Information Challenge" is an eloquent answer to one of the most persistent claims of creationists and intelligent-design advocates -- that there's no way to "generate new information" in a genome, and thus no way organisms can evolve from less complex forms.

It's an engaging read, and I recommend it even if you don't necessarily agree with Dawkins all the time.  He'll challenge your notions of how science works, and best of all -- he'll make you think.

[If you purchase this book using the image/link below, part of the proceeds will go to support Skeptophilia!]





Saturday, September 15, 2018

The lighter side of science

If you think that scientists are a bunch of dry-as-dust, humorless nerds, all you have to do to realize you were wrong is to read about Thursday evening's gala ceremony.

Called the Ig Nobel Prizes, it's an event that's been taking place at Harvard University annually for the last 28 years.  The idea is to recognize research (and researchers) whose work is probably never going to receive an actual Nobel -- but deserves to be in the spotlight purely for the absurdity and humor value.


This year's recipients:
  • Marc A. Mitchell and David Wartinger, for a study showing that you have a 64% chance of passing a kidney stone if you ride on a rollercoaster.  To do the research, Mitchell and Wartinger took 3D-printed models of human kidneys on the Big Thunder Ride at Walt Disney World.  Twenty times.
  • Japanese gastroenterologists Akira Horiuchi and Yoshiko Nakayama, who wanted to find out if colonoscopies are uncomfortable if administered in a seated position, so they gave one to themselves.  It causes "mild discomfort," apparently.
  • A study by Alethea L. Blackler, Rafael Gomez, Vesna Popovic, and M. Helen Thompson that found people don't read instruction manuals.  (The title of this study bears mention; it's "Life's Too Short to RTFM.")
  • Research by Lindie H. Liang, Douglas J. Brown, Huiwen Lian, Samuel Hanig, D. Lance Ferris, and Lisa M. Keeping finding that if you have an abusive boss, you'll feel better if you make (and skewer) a voodoo doll in his/her image.
  • A study by Paula M. S. Romão, Adília M. Alarcão, and César A. N. Viana that showed "spit-shines" actually work, by using spit to clean eighteenth-century sculptures.
  • Research by John M. Barry, Bruce Blank, and Michael Boileau to see if you could find out if guys were getting hard-ons while they were asleep by wrapping postage stamps around their penises before bed, and checking the next morning for tears in the perforations.  Turns out you can.
Some of these seem to me to fall into the "you needed to do research to find that out?" category.  Like the reading-the-manual one.  I'm the worst ever about this.  When we get something that needs assembly, my first step is always to yell, "Carol, can you help me with this?"  She's methodical and careful and makes sure I don't use my usual method, which is to jam things together whichever way seems right, a technique that always results in leftover parts and sub-optimal performance.

Also, I'm not at all shocked that skewering a voodoo doll would be highly satisfying.  I'm lucky enough to work for an awesome principal, but I've had bosses who I would have gladly stabbed in effigy.  A pity I didn't know about this sooner.

Oh, and the nocturnal erection study; are there guys who don't get erections while they're asleep?  I thought that was kind of hard-wired.

So to speak.

In any case, the winners each year get invited to a ceremony wherein they're wined and dined and given their cash prize (a $10 trillion bill from Zimbabwe, which is worth a few cents).  They then have to give an acceptance speech, which if it goes over sixty seconds is interrupted by an eight-year-old girl yelling, "Please stop, I'm bored" over and over until they give up.

As is usual with the Ig Nobel Ceremony, good times were had by all and sundry.  The audience is encouraged to participate by folding up their programs into paper airplanes and throwing them at the presenters.

So that's this week's hilarity from the world of science.  And if you wanted more evidence of scientists having a great sense of humor, you should definitely check out The Journal of Irreproducible Results, which is the best science journal spoof in the world.

Now, y'all'll have to excuse me.  I'm heading off to the post office.  I seem to be out of stamps.

**************************

This week's Skeptophilia book recommendation is a charming inquiry into a realm that scares a lot of people -- mathematics.  In The Universe and the Teacup, K. C. Cole investigates the beauty and wonder of that most abstract of disciplines, and even for -- especially for -- non-mathematical types, gives a window into a subject that is too often taught as an arbitrary set of rules for manipulating symbols.  Cole, in a lyrical and not-too-technical way, demonstrates brilliantly the truth of the words of Galileo -- "Mathematics is the language with which God has written the universe."