Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Tuesday, November 19, 2024

Paradoxes and pointlessness

In his 1967 short story "Thus We Frustrate Charlemagne," writer R. A. Lafferty took one of the first looks at something that has since become a standard trope in science fiction: going back into the past and doing something that changes history.

In his hilarious take on things, some time-machine-wielding scientists pick an event in history that seems to have been a critical juncture (they choose the near-miss assassination attempt on Charlemagne in 778 C.E. that immediately preceded the Battle of Roncevaux), then send an "avatar" back in time to change what happened.  The avatar kills the guy who saved Charlemagne's life, Charlemagne himself is killed, and his consolidation of power into what would become the Holy Roman Empire never happens.

Big deal, right?  Major repercussions down through the rest of European history?  Well, what happens is that when the change occurs, it also changes the memories of the scientists -- how they were educated, what they knew of history.  The avatar comes back, and everything is different, but the scientists are completely unaware of what's happened -- because their history now includes the change the avatar made.

So they decide that Charlemagne's assassination must have had no effect on anything, and they pick a different historical event to change.  The avatar goes back to try again -- with the same results.

Each time the avatar returns, things have become more and more different from where they started -- and still, none of the characters inside the story can tell.  They can never, in C. S. Lewis's words, "know what might have happened"; no matter what they do, those alternate timelines remain forever outside their ability to see.

In the end, the scientists give up.  Nothing, they conclude, has any effect on the course of events, so trying to change history is a complete waste of time.

One has to wonder if Harvard astrophysicist Avi Loeb has read Lafferty's story, because Loeb just authored an article in The Debrief entitled "The Wormhole Dilemma: Could Advanced Civilizations Use Time Travel to Rewrite History?"  Which, incidentally, is a fine example of Betteridge's Law -- "any headline phrased as a question can be answered with the word 'no.'"

Before we get into what the article says, I have to say that I'm getting a little fed up with Loeb himself.  He's something of a frequent flier on Skeptophilia and other science-based skepticism websites (such as the one run by the excellent Jason Colavito), most recently for his strident claim that meteoric debris found in the Pacific Ocean was from the wreckage of an alien spacecraft.  (tl;dr: It wasn't.)  

I know we skeptical types can be a little hard to budge sometimes, and a criticism levied against us with at least some measure of fairness is that we're so steeped in doubting that we wouldn't believe evidence if we had it.  But even so, Loeb swings so far in the opposite direction that it's become difficult to take anything he says seriously.  In the article in The Debrief, he talks about how wormholes have been shown to be mathematically consistent with what we know about physics (correct), and that Kip Thorne and Stephen Hawking demonstrated that they could theoretically be kept open long enough to allow passage of something from one point in spacetime to another (also correct).  

This would require, however, the use of something with negative mass-energy to stabilize the wormhole so it doesn't snap shut immediately.  Which is a bit of a sticking point, because there's never been any proof that such a something actually exists.
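
(A quick aside for anyone who wants the more precise version of that requirement.  What follows is the standard textbook statement, going back to Morris and Thorne's 1988 paper -- my own summary, not anything taken from Loeb's article.  A static, spherically symmetric wormhole can be written, in units where c = 1, as

\[
ds^2 \;=\; -e^{2\Phi(r)}\,dt^2 \;+\; \frac{dr^2}{1 - b(r)/r} \;+\; r^2\,d\Omega^2 ,
\]

with a throat at the radius r_0 where b(r_0) = r_0.  Requiring the throat to "flare out" rather than pinch shut, which amounts to b'(r_0) < 1, and feeding that into the Einstein field equations forces the matter threading the throat to satisfy \rho + p_r < 0 there.  That violates the null energy condition, T_{\mu\nu} k^\mu k^\nu \ge 0 for every null vector k^\mu, which every form of matter and radiation we've ever observed obeys.  Hence the need for "exotic," negative-energy stuff that nobody has ever seen.)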

Oh, but that's no problem, Loeb says; dark energy exerts a negative pressure (which is what makes its gravitational effect repulsive), so an advanced civilization could "excavate dark energy from the cosmic reservoir and mold it into a wormhole."  He admits that we don't know if this is possible because we still have no idea what dark energy actually is, but then goes into a long bit about how we (or well-intentioned aliens) could use such a wormhole to "fix history," starting with getting rid of Adolf Hitler and preventing the Holocaust.

A laudable goal, no doubt, but let's just hang on a moment.

[Image is in the Public Domain courtesy of artist Martin Johnson]

The idea that altering history could create intractable paradoxes has been a staple of science fiction ever since Lafferty (and Ray Bradbury, in his brilliant and devastating short story "A Sound of Thunder") brought it into the public awareness.  Besides my own novel Lock & Key, in which such a paradox wipes out all of humanity except for one dubiously lucky man who somehow escapes being erased and ends up having to fix the problem, this sort of thing seemed to happen every other week on Star Trek: The Next Generation, where one comes away with the sense that the space-time continuum is as flimsy as a wet Kleenex.  It may be that the universe has some sort of built-in protection against paradoxes -- such as the famous example of going back in time and killing your own grandfather -- but even that is pure speculation, because physicists haven't shown that time travel into the past is possible, much less practical.

So Loeb's article is, honestly, a little pointless.  He looks at an idea that countless fiction writers -- including myself -- have been exploring ad nauseam since at least 1967, and adds nothing to the conversation from a scientific perspective other than saying, "Hey, maybe superpowerful aliens could do it!"  As such, what he's done is really nothing more than mental masturbation.

I know I'm coming away sounding like a killjoy here.  It's not that this stuff isn't fun to think about; I get that part of it.  But yet another article from Loeb talking about how (1) highly-advanced alien civilizations we know nothing about might (2) use technology that requires an unknown form of exotic matter we also know nothing about to (3) accomplish something physicists aren't even sure is possible, isn't doing anything but giving new meaning to the phrase "Okay, that's a bit far-fetched."

The whole thing put me in mind of physicist Sabine Hossenfelder's recent, rather dismal, video "Science is in Trouble, and It Worries Me."  Her contention is that science's contribution to progress in our understanding of the universe, and to improving the wellbeing of humanity, has slowed way down -- that (in her words) "most of what gets published is bullshit."  Not that what gets published is false; that's not what she means.  Just that it's pointless.  The emphasis on science being on the cutting edge, on pushing the limits of what we know, on being "disruptive" (in a good sense), has all but vanished.  Instead, the money-making model -- writing papers so you get citations so you get grants so you can write more papers, and so on and so on -- has blunted the edge of what academia accomplishes, or even can accomplish.

And I can't help but throw this fluff piece by Loeb into that same mix.  As a struggling writer who has yet to exceed a three-figure income from my writing in a given year, I have to wonder how much The Debrief paid Loeb for his article.  I shouldn't be envious of another writer, I guess; and honestly, I wouldn't be if what Loeb had written had scientific merit, or even substance.

But as is, the whole thing pisses me off.  It adds to the public perception of scientists as speculative hand-wavers, gives the credulous the impression that something is possible when it probably isn't, teaches the reader nothing most of us haven't already known for years, and puts another entirely undeserved feather in Avi Loeb's cap.

My general sense is that he was doing less harm when he was looking for an alien hiding behind every tree.

****************************************


Saturday, October 5, 2024

The treadmill

I've mentioned before how my difficulties with math short-circuited my goal of becoming a researcher in physics, but the truth is, there's more to the story than that.

Even after I realized that I didn't have the mathematical ability -- nor, honestly, enough interest and focus to overcome my challenges -- I still had every intention of pursuing a career in science.  I spent some time in the graduate school of oceanography at the University of Washington, and from there switched to biology, but I found neither to be a good fit.  It wasn't a lack of interest in the disciplines; biology, in fact, remains a deep and abiding fascination, and I ultimately spent over three decades teaching the subject to high schoolers.  What bothered me was the publish-or-perish atmosphere that permeated all of research science.  I still recall my shock when one of our professors said, "Scientists spend 25% of their time doing the research they're interested in, and 75% of their time trying to beat everyone else in the field to grant money so they don't starve to death."

It's hard to pinpoint an exact moment that brought me to the realization that the career I'd always dreamed of wasn't for me -- but this was certainly one of the times I said, "Okay, now, just hang on a moment."

I'm not alone in having issues with this.  The brilliant theoretical physicist Sabine Hossenfelder did a video on her YouTube channel called "My Dream Died, and Now I'm Here" that's a blistering indictment of the entire edifice of research science.  Hossenfelder has the following to say about how science is currently done:

It was a rude awakening to realize that this institute [where she had her first job in physics research] wasn't about knowledge discovery, it was about money-making.  And the more I saw of academia, the more I realized it wasn't just this particular institute and this particular professor.  It was generally the case.  The moment you put people into big institutions, the goal shifts from knowledge discovery to money-making.  Here's how this works:

If a researcher gets a scholarship or research grant, the institution gets part of that money.  It's called the "overhead."  Technically, that's meant to pay for offices and equipment and administration.  But academic institutions pay part of their staff from this overhead, so they need to keep that overhead coming.  Small scholarships don't make much money, but big research grants can be tens of millions of dollars.  And the overhead can be anything between fifteen and fifty percent.  This is why research institutions exert loads of pressure on researchers to bring in grant money.  And partly, they do this by keeping the researchers on temporary contracts so that they need grants to get paid themselves...  And the overhead isn't even the real problem.  The real problem is that the easiest way to grow in academia is to pay other people to produce papers on which you, as the grant holder, can put your name.  That's how academia works.  Grants pay students and postdocs to produce research papers for the grant holder.  And those papers are what the supervisor then uses to apply for more grants.  The result is a paper-production machine in which students and postdocs are burnt through to bring in money for the institution...

I began to understand what you need to do to get a grant or to get hired.  You have to work on topics that are mainstream enough but not too mainstream.  You want them to be a little bit edgy, but not too edgy.  It needs to be something that fits into the existing machinery.  And since most grants are three years, or five years at most, it also needs to be something that can be wrapped up quickly...

The more I saw of the foundations of physics, the more I became convinced that the research there wasn't based upon sound scientific principles...  [Most researchers today] are only interested in writing more papers...  To get grants.  To get postdocs.  To write more papers.  To get more grants.  And round and round it goes.

The topic comes up today because of two separate studies, both released in the last two weeks, that illustrate a hard truth the scientific establishment as a whole has yet to acknowledge: there's a real human cost to putting talented, creative, bright people on the kind of treadmill Hossenfelder describes.

[Image licensed under the Creative Commons Doenertier82, Phodopus sungorus - Hamsterkraftwerk, CC BY-SA 3.0]

The first study, from a group in Sweden, found that simply pursuing a Ph.D. takes a tremendous toll on mental health, and instead of there being a "light at the end of the tunnel," the toll worsens as the end of the work approaches.  By the fifth year of doctoral study, the likelihood of a student using mental-health medications rises by forty percent.  It's no surprise why; once the Ph.D. is achieved, there's the looming stress of finding a postdoc position, and then after that the savage competition for the few stable, tenure-track research positions out there in academia.  "You need to generate data as quickly as possible, and the feeling of competition for funding and jobs can be very strong, even early in your Ph.D.," said Rituja Bisen, a fifth-year Ph.D. student in neurobiology at the University of Würzburg.  "Afterward, many of us have to move long distances, even out of the country, to find a worthwhile position.  And even then, there's no guarantee.  It doesn’t matter how good a lab is; if it’s coming out of a toxic work culture, it isn’t worth it in the long run."

The other study, out of Poland (but involving worldwide data), is perhaps even more damning: over fifty percent of researchers leave science entirely within ten years of publishing their first academic paper.

You spend huge amounts of money on graduate school, work your ass off to get a Ph.D., and then a position as a researcher, and after all that -- you find that (1) the stress isn't worth it, (2) you're barely making enough money to get by, and (3) the competition for grants is only going to get worse over time.  It's not surprising that people decide to leave research for other career options.

But how heartbreaking is it that we're doing this to the best and brightest minds on the planet?

And the problem is even more drastic for women and minorities; for them, the number still left publishing after ten years is more like thirty percent of the ones who started.

How far would we have advanced in our understanding of how the universe works if the system itself weren't strangling the scientists?

Back when modern science got its start, in the seventeenth and eighteenth centuries, it was the province of the rich; only people who were already independently wealthy had the wherewithal to (1) get a college education, and afterward (2) spend their time messing about in laboratories.  There were exceptions -- Michael Faraday comes to mind -- but by and large, scientific inquiry was confined to the gentry.

Now, we have the appearance of a more open, egalitarian model, but at its basis, the whole enterprise still depends on institutions competing for money, and the people actually doing the research (i.e. the scientists) being worked to the bone to keep the whole superstructure running.

It's a horrible problem, and one I don't see changing until our attitudes shift -- until we start prioritizing the advancement of knowledge over academia-for-profit.  Or, perhaps, until our governments recognize how absolutely critical science is, and fund it over the current goals of fostering corporate capitalism to benefit the extremely wealthy and developing newer and better ways to kill those we perceive as our enemies.

I've heard a lot of talk about how prescient Star Trek was -- we now have something very like their communicators and supercomputers, and aren't far away from tricorders.  But we won't actually get there until we develop one other thing, and I'm not talking about warp drives or holodecks.

I'm talking about valuing science, and scientists, as being the pinnacle of what we as a species can achieve, and creating a system to provide the resources to support them instead of doing everything humanly possible to drive them away.

****************************************


Thursday, April 11, 2024

Requiem for a visionary

I was saddened to hear of the death of the brilliant British physicist Peter Higgs on Monday, April 8, at the grand old age of 94.  Higgs is most famous for his 1964 proposal of what has since come to be known as the "Higgs mechanism" (he was far too modest a man to name it after himself; that was the doing of colleagues who recognized his genius).  It springboarded off work by the Nobel Prize-winning Japanese physicist Yoichiro Nambu on spontaneous symmetry breaking -- Higgs's insight was to see that the same process could be used to argue for the existence of a previously unknown field, the properties of which seemed to explain why ordinary particles have mass.
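
(For the curious, the cartoon version of the mechanism, as it appears in the Standard Model today, goes like this; it's the textbook schematic rather than the specific model in Higgs's 1964 paper.  A scalar field \phi is given the potential

\[
V(\phi) \;=\; -\mu^2\,\lvert\phi\rvert^2 \;+\; \lambda\,\lvert\phi\rvert^4 , \qquad \mu^2, \lambda > 0 ,
\]

whose minimum sits not at \phi = 0 but at \lvert\phi\rvert = v/\sqrt{2}, with v = \mu/\sqrt{\lambda} (measured to be about 246 GeV).  Because empty space itself carries that nonzero field value, particles coupled to the field behave as if they have mass: a fermion with coupling strength y_f picks up m_f = y_f\,v/\sqrt{2}, and the W boson gets m_W = g\,v/2.  The leftover ripple in the field around its new minimum is the massive spin-zero particle we now call the Higgs boson.)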

This was a huge leap, and by Higgs's own account, he was knocking at the knees when he presented the paper at a conference.  But it passed peer review, was published in the journal Physical Review Letters, and afterward stood up to repeated attempts to punch holes in its logic.  His argument required the existence of a massive spin-zero boson -- now known as the Higgs boson -- and he had to wait 48 years for it to be discovered at CERN by the ATLAS and Compact Muon Solenoid (CMS) experiments.  When informed that the boson had been found, with just the properties his theory required, he responded with his typical humility, saying, "It's really an incredible thing that it's happened in my lifetime."

It surprised no one when he won the Nobel Prize in Physics the following year (2013).

Higgs at the Nobel Prize Awards Ceremony [Image licensed under the Creative Commons Bengt Nyman, Nobel Prize 24 2013, CC BY 2.0]

Higgs, however, was a bit of an anachronism.  He was a professor at Edinburgh University, but refused to buy into the competitive, grant-seeking, paper-production culture of academia.  He was also famously non-technological; he said he'd never sent an email, used a cellphone, or owned a television.  (He did say that he'd been persuaded to watch an episode of The Big Bang Theory once, but "wasn't impressed.")  He frustrated the hell out of the university's administration, responding to demands for a list of recent publications with the word "None."  Apparently it was only the administrators' caution -- well-founded, as it turned out -- that persuaded them to keep him on the payroll.  "He might get a Nobel Prize at some point," one of them said.  "If not, we can always get rid of him."

In an interview, Higgs said that he'd never get hired in today's academic world, something that is more of an indictment of academia than of Higgs himself.  "It's difficult to imagine how I would ever have enough peace and quiet in the present sort of climate to do what I did in 1964," he said.  "After I retired it was quite a long time before I went back to my department.  I thought I was well out of it.  It wasn't my way of doing things any more.  Today I wouldn't get an academic job.  It's as simple as that.  I don't think I would be regarded as productive enough."

Reading about this immediately made me think about the devastating recent video by theoretical physicist Sabine Hossenfelder, a stinging takedown of how the factory-model attitude in research science is killing scientists' capacity for doing real and groundbreaking research:

It was a rude awakening to realize that this institute [where she had her first job in physics research] wasn't about knowledge discovery, it was about money-making.  And the more I saw of academia, the more I realized it wasn't just this particular institute and this particular professor.  It was generally the case.  The moment you put people into big institutions, the goal shifts from knowledge discovery to money-making.  Here's how this works:

If a researcher gets a scholarship or research grant, the institution gets part of that money.  It's called the "overhead."  Technically, that's meant to pay for offices and equipment and administration.  But academic institutions pay part of their staff from this overhead, so they need to keep that overhead coming.  Small scholarships don't make much money, but big research grants can be tens of millions of dollars.  And the overhead can be anything between fifteen and fifty percent.  This is why research institutions exert loads of pressure on researchers to bring in grant money.  And partly, they do this by keeping the researchers on temporary contracts so that they need grants to get paid themselves...  And the overhead isn't even the real problem.  The real problem is that the easiest way to grow in academia is to pay other people to produce papers on which you, as the grant holder, can put your name.  That's how academia works.  Grants pay students and postdocs to produce research papers for the grant holder.  And those papers are what the supervisor then uses to apply for more grants.  The result is a paper-production machine in which students and postdocs are burnt through to bring in money for the institution...

I began to understand what you need to do to get a grant or to get hired.  You have to work on topics that are mainstream enough but not too mainstream.  You want them to be a little bit edgy, but not too edgy.  It needs to be something that fits into the existing machinery.  And since most grants are three years, or five years at most, it also needs to be something that can be wrapped up quickly...

The more I saw of the foundations of physics, the more I became convinced that the research there wasn't based upon sound scientific principles...  [Most researchers today] are only interested in writing more papers...  To get grants.  To get postdocs.  To write more papers.  To get more grants.  And round and round it goes.

You can see why a visionary like Peter Higgs was uncomfortable in today's academia (and vice versa).  But it's also horrifying to think about the Peter Higgses of this generation -- today's up-and-coming scientific groundbreakers, who may not ever get a chance to bring their ideas to the world, sandbagged instead by a hidebound money-making machine that has amplified "publish-or-perish" into "publish-or-never-get-started."

In any case, the world has lost a gentle, soft-spoken genius, whose unique insights -- made at a time when the academic world was more welcoming to such individuals -- completed our picture of the Standard Model of particle physics, and whose theories led to an understanding of the fundamental properties of matter and energy we're still working to explore fully.  Ninety-four is a respectable age in pretty much anyone's opinion, but it's still sad to lose someone of such brilliance, who was not only a leading name in pure research, but was unhesitating in pointing out the problems with how science is done.

It took 48 years for his theory about the Higgs mechanism to be experimentally vindicated; let's hope his criticisms of academia have a shorter gestation period.

****************************************



Monday, December 14, 2020

The modern glass ceiling

Rosalind Franklin has become justly famous for her role in discovering the three-dimensional structure of the DNA double helix.  Her specialty was x-ray crystallography, which involves bombarding a crystal with x-rays and photographing the scatter-pattern produced as the x-rays rebound off the atoms in the crystal.  From that photograph, a trained eye can make a good guess as to the arrangement of the atoms in the crystal.

The analogy I always used in my biology classes was a thought experiment: Imagine that you and a couple dozen friends are in a large darkened room, empty except for an object in the middle whose size and shape you can't see.  You and your friends are lined up around the perimeter, and have to stay with your backs against the wall.  You're asked to determine the shape of the object by hurling tennis balls in various directions; if the tennis ball misses the object, your friend on the opposite side of the room gets hit; if the ball hits the object, it ricochets off and lands near another of your friends somewhere else in the room.

Given enough tennis balls and enough time, and recording the results of each throw, you could probably make a decent guess about the size and the shape of the invisible object.  That, essentially, is what x-ray crystallographers do.
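
If you want to actually play with the idea, here's a little Python sketch of the tennis-ball game.  To be clear, this is purely my own toy illustration -- the hidden circle, the throw() helper, and all the numbers are invented for the demo, and real crystallography infers structure from diffraction patterns and a great deal of mathematics, not from ricochets.  It hides a circle in a circular room, fires a few thousand straight-line "throws" from random points on the wall, records where the throws that hit something land, and uses the resulting cloud of hit points to estimate the hidden object's size.

import math
import random

# The hidden object: a circle whose position and size the "throwers" can't see.
HIDDEN_CENTER = (0.2, -0.3)
HIDDEN_RADIUS = 0.4
ROOM_RADIUS = 1.0  # the throwers stand on a circular wall of radius 1


def throw(origin, direction):
    """Return the point where a straight throw first hits the hidden circle,
    or None if it misses and sails on to the far wall."""
    ox = origin[0] - HIDDEN_CENTER[0]
    oy = origin[1] - HIDDEN_CENTER[1]
    dx, dy = direction
    # Solve |o + t*d|^2 = r^2 for the smallest t >= 0 (a quadratic in t).
    a = dx * dx + dy * dy
    b = 2 * (ox * dx + oy * dy)
    c = ox * ox + oy * oy - HIDDEN_RADIUS ** 2
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                      # clean miss
    t = (-b - math.sqrt(disc)) / (2 * a)
    if t < 0:
        return None                      # the object is behind the thrower
    return (origin[0] + t * dx, origin[1] + t * dy)


hits = []
for _ in range(5000):
    wall_angle = random.uniform(0, 2 * math.pi)
    origin = (ROOM_RADIUS * math.cos(wall_angle),
              ROOM_RADIUS * math.sin(wall_angle))
    aim = random.uniform(0, 2 * math.pi)
    direction = (math.cos(aim), math.sin(aim))
    hit = throw(origin, direction)
    if hit is not None:
        hits.append(hit)

# The recorded hit points trace out the object's boundary; the spread of the
# cloud gives a rough estimate of its size, sharpening as the throws add up.
xs = [p[0] for p in hits]
ys = [p[1] for p in hits]
print(f"throws that hit: {len(hits)} of 5000")
print(f"estimated width:  {max(xs) - min(xs):.2f}  (true value {2 * HIDDEN_RADIUS})")
print(f"estimated height: {max(ys) - min(ys):.2f}  (true value {2 * HIDDEN_RADIUS})")

Even this crude version homes in on the right size after a few thousand throws, which is exactly the point of the analogy: record enough ricochets, and the invisible object gives up its shape.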

Easy concept, difficult in practice.  Franklin was exceptionally good at it, and produced the famous photo that proved the double-helical structure of DNA, accomplishing what dozens of other researchers had failed to do.

[Image licensed under the Creative Commons MRC Laboratory of Molecular Biology, Rosalind Franklin, CC BY-SA 4.0]

The trouble began when the paper describing the conclusions drawn from the photograph was written -- and its lead author, Maurice Wilkins, didn't include Franklin's name on the list of authors.  Franklin herself died of ovarian cancer in 1958, only five years after the paper's release, and so was unable to defend herself; but the reasons for the omission become crystal-clear when you hear comments from James Watson (of Watson & Crick fame) about Franklin's role in the lab.

"There was never lipstick to contrast with her straight black hair, while at the age of thirty-one her dresses showed all the imagination of English blue-stocking adolescents," Watson wrote.  "Her belligerent moods interfered with Wilkins’s ability to maintain a dominant position that would allow him to think unhindered about DNA...  Clearly Rosy had to go or be put in her place… The thought could not be avoided that the best home for a feminist was in another person's lab."

The marginalization, or outright disparagement, of women in academia was ubiquitous back then.  Most of us are rightly outraged when we read about how Franklin and her accomplishments were dismissed.  And we often congratulate ourselves on how far we've come, and name examples of women, minorities, and LGBTQ people who have risen to the top of their fields.

The problem is, this is not so far off from the "there is no such thing as racism... look, I have a black friend!" nonsense you sometimes hear.  As evidence of this, consider the nauseatingly condescending article that appeared only a couple of days ago in the Wall Street Journal.  Written by Joseph Epstein, this article illustrates with disgusting clarity that we've not come far from Watson's "the best home for a feminist was in another person's lab" attitude:

Madame First Lady—Mrs. Biden—Jill—kiddo: a bit of advice on what may seem like a small but I think is a not unimportant matter.  Any chance you might drop the “Dr.” before your name?  “Dr. Jill Biden” sounds and feels fraudulent, not to say a touch comic...  As for your Ed.D., Madame First Lady, hard-earned though it may have been, please consider stowing it, at least in public.

Excuse me?  "Fraudulent?"  "Comic?"  "Kiddo?"  Can you imagine condescension like this being aimed at a white cis/het man?  A Ph.D. or Ed.D. does confer the right to use the honorific "Doctor" before your name, Mr. Epstein, whether you like it or not.  Not only that, it is a completely deserved acknowledgement of the intellect and diligence of the person who earned it.  I "only" have a master's degree, and I worked my ass off to achieve that.  I know a number of people with doctorates in various fields (a good many of them non-medical), and from their experiences I know how many years of hard work it takes to do the original research required for a doctoral degree.

It is horrifying that this article was written, and unconscionable that the Wall Street Journal elected to publish it.

The smug, smirking tone of this op-ed piece is emblematic; here, nearly seventy years after Rosalind Franklin conducted her groundbreaking research and was robbed of public acknowledgement of her role, we are still not past the way the patronizing, self-congratulatory patriarchy uses its position of power to minimize (or ignore entirely) the accomplishments of anyone who isn't a white cis/het male.

It may come as no shock that Joseph Epstein has been pulling this bullshit for years.  For fifty years, in fact.  The privilege of white cis/het males in this society extends to overlooking outright sexism and bigotry for decades -- not just overlooking it, but giving it tacit acceptance by continuing to print it in major publications.  Take a look at the paragraph he wrote in a piece called "Homo/Hetero: The Struggle for Sexual Identity" in The Atlantic in 1970:

They are different from the rest of us.  Homosexuals are different, moreover, in a way that cuts deeper than other kinds of human differences—religious, class, racial—in a way that is, somehow, more fundamental.  Cursed without clear cause, afflicted without apparent cure, they are an affront to our rationality, living evidence of our despair of ever finding a sensible, an explainable, design to the world.  One can tolerate homosexuality, a small enough price to be asked to pay for someone else's pain, but accepting it, really accepting it, is another thing altogether.  I find I can accept it least of all when I look at my children.  There is much my four sons can do in their lives that might cause me anguish, that might outrage me, that might make me ashamed of them and of myself as their father.  But nothing they could ever do would make me sadder than if any of them were to become homosexual.  For then I should know them condemned to a state of permanent niggerdom among men, their lives, whatever adjustment they might make to their condition, to be lived out as part of the pain of the earth.

It's tempting to say, "Well, that was 1970."  Which might be an excuse if Epstein had ever apologized for or retracted what he'd written.  The best he could manage was a mealy-mouthed reference to the 1970 article in a 2015 piece in the Washington Examiner, in which he said, "I am pleased the tolerance for homosexuality has widened in America and elsewhere, that in some respects my own aesthetic sensibility favors much homosexual artistic production...  My only hope now is that, on my gravestone, the words Noted Homophobe aren’t carved."

So it's probably too much to expect Epstein to back down with respect to his smug dismissal of Dr. Jill Biden's degree.  The Wall Street Journal, on the other hand, should issue a retraction and an unqualified apology.  This has nothing to do with her being the wife of the president-elect; it would still be the case if she were an ordinary citizen in any part of academia.  The "glass ceiling" isn't gone; we're just very good at pretending it is, at acting like we've shucked all the old problems of discrimination and bigotry.  But that a major newspaper is publishing -- even on its "Opinion" page -- something this blatantly demeaning, condescending, and rude is somewhere beyond appalling.

We need more women and minorities to be belligerent (to use James Watson's word) -- to refuse to accept the disparagement of their accomplishments, to give a pair of middle fingers to the entrenched establishment Epstein represents, which feels threatened whenever anyone from outside attempts an ingress.  How much talent, passion, and intelligence has been thwarted because of this attitude?  We cannot tolerate it anymore.  It has to be shouted down every single time it rears its ugly head.

If we really have progressed beyond the bigotry of the mid-twentieth century, if we really have gotten to a place where this generation's Rosalind Franklins would be welcomed and appreciated, we need to call out the Joseph Epsteins of the world, loud and clear. 

*********************************************

If you, like me, never quite got over the obsession with dinosaurs we had as children, there's a new book you really need to read.

In The Rise and Fall of the Dinosaurs: A New History of a Lost World, author Stephen Brusatte describes in brilliantly vivid language the most current knowledge of these impressive animals, which for almost two hundred million years were the dominant life forms on Earth.  The huge, lumbering T. rexes and stegosauruses that we usually think of are only the most obvious members of a group that had more diversity than mammals do today: terrestrial dinosaurs of pretty much every size and shape, along with their flying cousins the pterosaurs, from the tiny Sordes pilosus (with a wingspan of only half a meter) to the impossibly huge Quetzalcoatlus, with a ten-meter wingspan and a mass of two hundred kilograms.  There were aquatic and arboreal species, carnivores and herbivores, ones with feathers and scales and something very like hair, ones with teeth as big as your hand and others with no teeth at all.

Brusatte is a rising star in the field of paleontology, and writes with the clear confidence of someone who not only is an expert but has tremendous passion and enthusiasm.  If you're looking for a book for a dinosaur-loving friend -- or maybe you're the dino aficionado -- this one is a must-read.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]