Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Thursday, April 11, 2024

Requiem for a visionary

I was saddened to hear of the death of the brilliant British physicist Peter Higgs on Monday, April 8, at the grand old age of 94.  Higgs is most famous for his proposal in 1964 of what has since come to be known as the "Higgs mechanism" (he was far too modest a man to name it after himself; that was the doing of colleagues who recognized his genius).  This springboarded off work by the Nobel Prize-winning Japanese physicist Yoichiro Nambu, who was researching spontaneous symmetry breaking -- Higgs's insight was to see that the same process could be used to argue for the existence of a previously unknown field, the properties of which seemed to explain why ordinary particles have mass.

This was a huge leap, and by Higgs's own account, he was knocking at the knees when he presented the paper at a conference.  But it passed peer review and was published in the journal Physical Review Letters, and afterward stood up to repeated attempts to punch holes in its logic.  His argument required the existence of a massive spin-zero boson -- now known as the Higgs boson -- and he had to wait 48 years for it to be discovered at CERN by the ATLAS and Compact Muon Solenoid (CMS) experiments.  When informed that the Higgs boson had been discovered, with exactly the properties he'd predicted, he responded with his typical humility, saying, "It's really an incredible thing that it's happened in my lifetime."

It surprised no one when he won the Nobel Prize in Physics the following year (2013).

Higgs at the Nobel Prize Awards Ceremony [Image licensed under the Creative Commons Bengt Nyman, Nobel Prize 24 2013, CC BY 2.0]

Higgs, however, was a bit of an anachronism.  He was a professor at the University of Edinburgh, but refused to buy into the competitive grant-seeking, paper-production culture of academia.  He was also famously non-technological; he said he'd never sent an email, used a cellphone, or owned a television.  (He did say that he'd been persuaded to watch an episode of The Big Bang Theory once, but "wasn't impressed.")  He frustrated the hell out of the administration of the university, responding to demands for a list of recent publications with the word "None."  Apparently it was only caution on the administrators' part (well-founded, as it turned out) that persuaded them to keep him on the payroll.  "He might get a Nobel Prize at some point," one of them said.  "If not, we can always get rid of him."

In an interview, Higgs said that he'd never get hired in today's academic world, something that is more of an indictment of academia than of Higgs himself.  "It's difficult to imagine how I would ever have enough peace and quiet in the present sort of climate to do what I did in 1964," he said.  "After I retired it was quite a long time before I went back to my department.  I thought I was well out of it.  It wasn't my way of doing things any more.  Today I wouldn't get an academic job.  It's as simple as that.  I don't think I would be regarded as productive enough."

Reading about this immediately made me think about the devastating recent video by theoretical physicist Sabine Hossenfelder, a stinging takedown of how the factory-model attitude in research science is killing scientists' capacity for doing real and groundbreaking research:

It was a rude awakening to realize that this institute [where she had her first job in physics research] wasn't about knowledge discovery, it was about money-making.  And the more I saw of academia, the more I realized it wasn't just this particular institute and this particular professor.  It was generally the case.  The moment you put people into big institutions, the goal shifts from knowledge discovery to money-making.  Here's how this works:

If a researcher gets a scholarship or research grant, the institution gets part of that money.  It's called the "overhead."  Technically, that's meant to pay for offices and equipment and administration.  But academic institutions pay part of their staff from this overhead, so they need to keep that overhead coming.  Small scholarships don't make much money, but big research grants can be tens of millions of dollars.  And the overhead can be anything between fifteen and fifty percent.  This is why research institutions exert loads of pressure on researchers to bring in grant money.  And partly, they do this by keeping the researchers on temporary contracts so that they need grants to get paid themselves...  And the overhead isn't even the real problem.  The real problem is that the easiest way to grow in academia is to pay other people to produce papers on which you, as the grant holder, can put your name.  That's how academia works.  Grants pay students and postdocs to produce research papers for the grant holder.  And those papers are what the supervisor then uses to apply for more grants.  The result is a paper-production machine in which students and postdocs are burnt through to bring in money for the institution...

I began to understand what you need to do to get a grant or to get hired.  You have to work on topics that are mainstream enough but not too mainstream.  You want them to be a little bit edgy, but not too edgy.  It needs to be something that fits into the existing machinery.  And since most grants are three years, or five years at most, it also needs to be something that can be wrapped up quickly...

The more I saw of the foundations of physics, the more I became convinced that the research there wasn't based upon sound scientific principles...  [Most researchers today] are only interested in writing more papers...  To get grants.  To get postdocs.  To write more papers.  To get more grants.  And round and round it goes.

You can see why a visionary like Peter Higgs was uncomfortable in today's academia (and vice versa).  But it's also horrifying to think about the Peter Higgses of this generation -- today's up-and-coming scientific groundbreakers, who may never get a chance to bring their ideas to the world, sandbagged instead by a hidebound money-making machine that has amplified "publish-or-perish" into "publish-or-never-get-started."

In any case, the world has lost a gentle, soft-spoken genius, whose unique insights -- made at a time when the academic world was more welcoming to such individuals -- completed our picture of the Standard Model of particle physics, and whose theories gave us an understanding of the fundamental properties of matter and energy that we're still working to explore fully.  Ninety-four is a respectable age in pretty much anyone's opinion, but it's still sad to lose someone of such brilliance, who was not only a leading name in pure research, but was unhesitating in pointing out the problems with how science is done.

It took 48 years for his theory about the Higgs mechanism to be experimentally vindicated; let's hope his criticisms of academia have a shorter gestation period.
