Tuesday, August 2, 2022
Death, with big nasty pointy teeth
Monday, August 1, 2022
The thoughtographer
To the materialist and the professional skeptic -- that is to say, people who do not wish their belief that death is the end of life as we know it to be disturbed -- the notion of ghosts is unacceptable. No matter how much evidence is presented to support the reality of the phenomena, these people will argue against it and ascribe it to any of several "natural" causes. Delusion or hallucination must be the explanation, or perhaps a mirage, if not outright trickery. Entire professional groups that deal in the manufacturing of illusions have taken it upon themselves to label anything that defies their ability to reproduce it artificially through trickery or manipulation as false or nonexistent. Especially among photographers and magicians, the notion that ghosts exist has never been popular.
A few years ago, Dr. Jules [sic] Eisenbud of the University of Colorado at Denver startled the world with his disclosures of the peculiar talents of a certain Ted Serios, a Chicago bellhop gifted with psychic photography talents. This man could project images into a camera or television tube, some of which were from the so-called future. Others were from distant places Mr. Serios had never been to. The experiments were undertaken under the most rigid test conditions. They were repeated, which was something the old-line scientists in parapsychology stressed over and over again. Despite the abundance of evidence, produced in the glaring limelight of public attention and under the strictest scientific test conditions, some of Dr. Eisenbud's colleagues at the University of Colorado turned away from him whenever he asked them to witness the experiments he was conducting. So great was the prejudice against anything Eisenbud and his colleagues might find that opposed existing concepts that men of science couldn't bear to find out for themselves. They were afraid they would have to unlearn a great deal.
Ted Serios exhibits a behavior pathology with many character disorders. He does not abide by the laws and customs of our society. He ignores social amenities and has been arrested many times. His psychopathic and sociopathic personality manifests itself in many other ways. He does not exhibit self-control and will blubber, wail and bang his head on the floor when things are not going his way.
He exhibits strong hostility toward figures of authority, such as policemen and scientists. He is an alcoholic and in psychic experiments he has been encouraged toward the excessive use of alcohol. He has demonstrated the symptoms of a manic-depressive with manic episodes. In one hypermaniacal period he acted like a violent madman and could not be restrained.
He often becomes profane and raging, completely reckless. While depressed he ignores other people, has a far-away look and is disenchanted with everything. He is always bored with talk unless it is about him. He often imagines himself a hero, and sometimes identifies with a violent known personality. He also exhibits sadistic behavior; once, when arrested, he embarrassed Dr. Eisenbud by giving the doctor's name and profession (psychiatrist) as his own.
In spite of the questionable research methods and the personality quirks of Serios, a number of Denver professional men believed Ted Serios was a psychic, with a unique power to record his thoughts with a Polaroid camera.
Saturday, July 30, 2022
First, do no harm
I keep thinking I'm not going to need to write any more about LGBTQ issues, that I've said all I need to say, and yet... here we are.
I'm going to start with a question directed at the people who are so stridently against queer rights, queer visibility, even queer existence. I doubt many people of that ilk read Skeptophilia, but you never know. So here goes:
How does guaranteeing that LGBTQ people are treated equally, fairly, and kindly, and are given the same rights as straight people, affect you at all?
It costs you absolutely nothing to say, "I'm not like you, and maybe I don't even understand this part of you, but even so, I respect your right to be who you are without shame or fear." For example, I'm not trans; I have always felt unequivocally that I am one hundred percent male. But when I had trans students in my classes, all it required was my crossing out a first name on the roster and writing in the name they'd prefer to be known by, and remembering to use the appropriate pronouns. A minuscule bit of effort on my part; hugely, and positively, significant on theirs.
What possible justification could I have for refusing?
The reason this whole topic comes up once again is a link sent to me by a loyal reader of Skeptophilia about a rugby team in Australia, the Manly Sea Eagles, which had seven of its players refuse to play in an important match because the owner wanted the team to wear jerseys with a rainbow design meant to promote inclusivity.
Retired Sea Eagles player Ian Roberts, who was the first rugby league player to come out publicly as gay, was devastated by the players' actions. "I try to see it from all perspectives, but this breaks my heart," Roberts said. "It's sad and uncomfortable. As an older gay man, this isn't unfamiliar. I did wonder whether there would be any religious pushback. That's why I think the NRL have never had a Pride round. I can promise you every young kid on the northern beaches who is dealing with their sexuality would have heard about this."
Matt Bungard, of Wide World of Sports, was blunter still. "I don't want to hear one single thing about 'respecting other people's opinions' or using religion as a crutch to hide behind while being homophobic. No issues playing at a stadium covered in alcohol and gambling sponsors, which is also a sin. What a joke."
Which I agree with completely, but it brings me back to my initial question; how did wearing the jerseys, for one night, harm those seven players? The jerseys didn't say, "Hi, I'm Gay." They were just a sign of support and inclusivity, of treating others the way you'd like to be treated.
Hmm, now where did I hear about that last bit? Seems like I remember someone famous saying that. Give me a moment, I'm sure it'll come to me.
A Christian baker creating a wedding cake for a gay couple is saying, "I may not be gay, but I'm happy you've found someone you love and want to spend your life with." Straight parents who give unconditional support to their trans child are saying, "I love you no matter what, no matter who you are and what you'd like to be called." A straight teacher having books with queer representation is saying, "Even if I don't experience sexuality like you do, I want you to understand yourself and be happy and confident enough to express your own truth openly."
If you're against same-sex marriage, if you bristle at Pride events, if you refuse to use a person's chosen name and pronouns, if you think businesses should be able to deny services to queer people, I want you to stop, just for one moment, and ask yourself: how is any of this harming me? Maybe it's time to pay more attention to the "love thy neighbor" parts of the Bible than to the Book of Leviticus, of which (face it) 99% is ignored by most Christians anyway. Maybe it's time to put more emphasis on compassion, understanding, and acceptance than on condemning anyone who doesn't think, act, or believe like you do.
After all, Jesus said it himself, in the Gospel of John, chapter thirteen: "By this all people will know that you are my disciples: if you have love for one another."
Friday, July 29, 2022
The cost of fraud
My Aunt Florence, my mother's older sister, died of Alzheimer's disease.
Her children, especially my cousin Linda, took care of her as she slowly declined during the last fifteen years of her life. She finally died in 2008 at the age of ninety, and by that time there was little left of her but a physical shell. She was unresponsive, the higher parts of her brain destroyed by the agonizing progression of this horrible illness. She went from being a bright, inquisitive, vital woman, an avid reader who did crossword puzzles in ink and could beat the hell out of me in Scrabble, to being... gone.
During this ordeal I lived fifteen hundred miles away, so I wasn't confronted every day by the terrible reality of what Alzheimer's does, both to the people suffering it and to their families. Even so, it was my aunt's face I kept picturing while I was reading an article in Neoscope sent to me by a friend -- all the while getting angrier and angrier.
If you've kept up at all with the research on Alzheimer's you probably are familiar with the words beta amyloid. It's a short-chain protein, whose function is unknown, which allegedly is directly toxic to nerve cells (and can cause other proteins to misfold, suggesting an etiology similar to Creutzfeldt-Jakob syndrome, better known as "mad cow disease"). A great deal of money and time has been spent investigating the role of beta amyloid in Alzheimer's, and in developing drugs that interfere with its production -- significantly, not a single one of which has been shown to slow down the progression of the disease, much less reverse it.
It turns out this is no coincidence. There is good evidence that the often-cited papers on the topic by Sylvain Lesné -- who wrote convincingly that a specific beta amyloid species, Aβ*56, was the culprit in the devastating destruction you see in Alzheimer's sufferers -- were based on faked data.
Not even well-faked, either. The images Lesné included from "Western blot" experiments -- a commonly used separation technique for detecting specific proteins in a mixture -- were cut-and-pasted, something visible not only in faint cut lines in the images but in the fact that the bands in the photographs have clearly been duplicated and moved (i.e., if you look at the edges of the bands, several of them have identically-shaped edges, something that would be next to impossible in an actual Western blot).
It's a devastating finding. About how the hell fraud like this got past peer review, biochemist Derek Lowe writes in Science:
The Lesné stuff should have been caught at the publication stage, but you can say that about every faked paper and every jiggered Western blot. When I review a paper, I freely admit that I am generally not thinking "What if all of this is based on lies and fakery?" It's not the way that we tend to approach scientific manuscripts. Rather, you ask whether the hypothesis is a sound one and if it was tested in a useful way: were the procedures used sufficient to trust the results and were these results good enough to draw conclusions that can in turn be built upon by further research? Are there other experiments that would make things stronger? Other explanations that the authors didn't consider and should address? Are there any parts where the story doesn't hang together? If so, how would these best be fixed?
There is a good-faith assumption behind all these questions: you are starting by accepting the results as shown. But if someone comes in with data that have in fact been outright faked, and what's more, faked in such a way as to answer or forestall just those sorts of reviewing tasks, there's a good chance that these things will go through, unfortunately.
Lesné's apparently fraudulent research doesn't invalidate the whole beta amyloid hypothesis; other, independent studies support the toxic effects of beta amyloid on nerve cells, and have shown there's beta amyloid present in damaged cells. But Lesné's contention that Aβ*56 was causative of Alzheimer's was apparently a blind alley -- and the presence of the protein in the neurons of Alzheimer's sufferers could as well be a result of the disease as a cause.
Thursday, July 28, 2022
The scent of memory
There are two very specific scents that will always remind me of my beloved grandma, with whom I lived for a year and a half when I was eight or nine years old.
One is the flowers of the sweet olive tree. Sweet olive is a small tree with glossy leaves and little, cream-colored flowers with a fruity, spicy smell a little reminiscent of a combination of fresh peaches and vanilla. My grandma had a beautiful sweet olive, and when it flowered in early summer, it perfumed the entire yard.
The other is the smell of old books. When I lived with my grandma my bedroom was in the attic, a maze-like twist of rooms and alcoves with sloped ceilings, a wooden floor, and shelves laden with what seemed to my young eyes like thousands of books. That dusty, dry smell when you turn the cover of a book that hasn't been opened in decades immediately brings me back to that happy time that was, in a lot of ways, like an oasis of calm and happiness in an otherwise troubled and turbulent childhood.
I suspect a lot of you can relate to my experience of having scents trigger memories, but it's hardly a new observation. This phenomenon was the impetus for Marcel Proust's Remembrance of Things Past, which begins with the protagonist having a flood of memory triggered by the smell and taste of a madeleine dipped in tea. What's been less obvious is why smell has such a strong link to memory -- for many people, stronger than any of the other senses.
Two recent papers, one in the journal Cell and the other in Nature, have elucidated why that might be. One reason is that the sense of smell is so specific -- we have over four hundred different types of olfactory sensors, each responsive to different molecules, and those sensors are connected through the olfactory bulb of the brain directly to the hippocampus (a major memory center) and the parts of the cerebral cortex involved in storage of memories.
There's a lot we still don't understand, however. In part, our responsiveness to scents seems to have an epigenetic component -- the phenomenon wherein traits can be inheritable without involving alterations in the DNA. For example, the grandchildren of mice who were given mild electric shocks when exposed to the scent of cherry blossoms have an aversive reaction to the odor even though they were never shocked themselves. (This may sound like a Lamarckian "inheritance of acquired characteristics" model, and in a way, it is; but epigenetic effects usually happen because the trait involves a hormonally-driven alteration in the rate of gene expression, which can affect the children -- and grandchildren -- without the actual DNA sequence being changed.)
What it immediately made me wonder is how this affects the experience of animals with way more sensitive noses -- like dogs. Dogs' big snouts have fifty times more olfactory receptors than our puny little noses do.
Not only that, a dog's olfactory processing center is forty times bigger relative to the rest of its brain than yours is. So how does that affect what neuroscientist David Eagleman calls its umwelt -- the sensory world it inhabits? Imagine if the olfactory landscape you were immersed in every time you took a breath was as vivid as the visual and auditory landscapes most of us experience without even thinking about it.
No wonder my dogs both sniff me thoroughly whenever I come home after being away. Who knows what information they're gleaning about where I've been, who I've come into contact with, and *gasp* what other dogs I may have said hi to along the way?
We're just beginning to parse how our sensory processing centers integrate with the other parts of the brain, and solve the old question of why (and how) the olfactory sense is such a powerful trigger to sometimes long-buried memories. Scientists are even considering the possible use of scents as a way to decondition traumatic memories in PTSD sufferers -- if there are smells with positive, reassuring connections, being exposed to those while suffering from the resurgence of trauma might blunt the edge of the pain. In one study, smelling an odor with pleasant associations (in this case, fresh coffee) while recounting a memory of a traumatic experience significantly lowered the emotional distress of the test subjects.
It will be fascinating to see where this research goes, as we untangle layer after layer of the complex network that allows us to perceive our world and recall what we experience. And, perhaps, explain how after fifty years, there's still a link in my brain between sweet olive flowers, old books, and my grandma's face.
Wednesday, July 27, 2022
Redefining the primitive
One of the most interesting, and persistent, misconceptions about evolutionary biology revolves around the use of the word "primitive" to describe certain life forms.
The misunderstanding goes back to Aristotle, really. The great philosopher proposed a concept usually known by its Latin name scala naturae, the "scale of nature," often called "the great chain of being." The idea is that life has progressed up some sort of ladder of complexity, starting with something like bacteria, then upwards through jellyfish and worms and bugs and fish and amphibians and reptiles and "lower" mammals, finally arriving at us, who (of course) being the pinnacle of creation, stand proudly at the top of the ladder.
The problem with this, as with many misconceptions, is that in some ways it's kinda sorta almost true. Something like today's bacteria were the first life forms, and the progression of fish > amphibian > reptile > mammal is pretty well established. The problems start when you look at life forms earlier than fish; during the famous Cambrian Explosion, most of the phyla of animals branched off and diversified in a relative flash, including not only the ones we have around today but also a number of oddball groups that didn't survive. So with respect to most modern groups of species, that smooth progression up the ladder didn't actually happen.
The problem gets worse when you try to apply the word "primitive" to current life forms. Is a bug more primitive than a human? Both are alive right now; each of them has exactly the same length of ancestral history. It doesn't even work if you link "primitiveness" with "complexity." Humans and bugs are both complex organisms, they're just complex in different ways. Certainly, there's a difference in intelligence; most humans are smarter than most bugs. But intelligence doesn't equal evolutionary success. By just about any measure, insects are by far the most successful animals on the Earth.
It recalls the famous anecdote about the illustrious biologist J. B. S. Haldane. Haldane was a zoologist but also rather notorious as an outspoken atheist, and religious people used to go to his talks to heckle him about it. At one, during the question-and-answer period, a woman asked, "Professor Haldane, what have your studies in biology told you about the nature of God?"
Haldane thought for a moment, and finally said, "All I can say, ma'am, is that he must have an inordinate fondness for beetles."
In any case, you have to be extraordinarily careful how you apply the word "primitive." In biology it's now used to describe traits (not entire organisms), with a very specific, restricted meaning, defined as "a trait that is shared with the ancestral form." An example is the vascular tissue -- the internal plumbing -- in plants, which is "advanced" as compared to the trait of "lacking vascular tissue." Vascular plants evolved from non-vascular ones, so apropos of that trait, mosses (which lack vascular tissue) are primitive as compared to ferns (which have vascular tissue).
But it still requires caution, because it's all too easy to assume that "primitive traits are less complex" or (worse) that "if an organism is like humans, that means it's advanced," neither of which is true. For example, take a look at last week's paper in The American Naturalist, written by a team from the University of Washington, that questions the notion of primitiveness with respect to something most of us take for granted -- reproduction in mammals.
There are three basic modes of reproduction in Class Mammalia. A couple of modern species are oviparous -- egg-laying (the monotremes, namely the echidna and the platypus). Another group are the marsupials, which give birth to extremely altricial (undeveloped) young, because the mothers have no placentas to interface between themselves and their babies. Once the offspring get too big to sustain without a placenta (which isn't very big at all; kangaroos are about two centimeters long at birth), they are born, and develop the rest of the way in the mother's pouch. The third are the placentals, such as ourselves (and every mammal you've ever heard of other than the monotremes and marsupials).
Egg-laying certainly is a primitive trait; it's pretty clear that the reptilian ancestors of the earliest mammals were oviparous. But what about the presence of a placenta? Once again, the danger is in assuming that it's the "advanced trait" because (1) humans are placentals, (2) there are currently more placentals than marsupials, and (3) somehow the placental method "seems more complicated." These are all more like smug self-congratulation than they are science, and none are reliable indicators of the primitiveness of a trait.
The current study, in fact, suggests that the placental mode of reproduction may predate the marsupial method -- by a lot. The researchers studied the odd multituberculates, a group of mammals that were amongst the first to diversify significantly, way back in the Jurassic Period around 170 million years ago. They were some of the most common mammals for a very long time, finally going extinct about 35 million years ago (for reasons unknown).
The salient point here is that the marsupial mammals, including the extinct ones that have been studied, have a very distinctive pattern of bone growth that is connected to their being born so incredibly undeveloped. A careful analysis of multituberculate bones shows they're a great deal more similar to today's placentals -- despite the fact that they branched off from the rest of Class Mammalia way earlier than the marsupials did.
So it looks like the little multituberculates had placentas and long gestation periods, and our mode of reproduction is actually the primitive one. Meaning the marsupial lineage lost the ability to form a placenta, rather than our lineage gaining it. Why that happened isn't known; but as we've seen, a trait doesn't need to be complex to give its owner a selective advantage. Perhaps in marsupials, the draw on the mother's resources is lowered enough by giving birth early that it allows her a better shot at surviving -- but that's pure speculation.
Whatever it is, both modes function perfectly well. "Evolution," as biologist Richard Dawkins put it, "is the law of 'whatever works.'"
And it all reinforces the notion that there is no "great chain of being"; there's only an enormous tangled web, of which we are a single strand.
Tuesday, July 26, 2022
Seeing through the fog
There's something a little unsettling about the idea that when you're looking outward in space, you're looking backward in time.
If it seems like we're seeing things as they actually are, right now, it's only because (1) the speed of light is so fast, and (2) most of the objects we look at and interact with are relatively close by. Even the Sun, though, which in astronomical terms is right on top of us, is eight light-minutes away, meaning that the light leaving its surface takes eight minutes to cross the 150 million kilometers between it and us. If the Sun were suddenly to go dark -- not, mind you, a very likely occurrence -- we would have no way of knowing it for eight minutes.
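That eight-minute figure falls straight out of the numbers above; here's a quick sanity check of the arithmetic, using the rounded values from the text:

```python
# Light-travel time from the Sun to Earth, using the rounded figures
# in the text: 150 million km, and light speed of ~300,000 km/s.
SUN_EARTH_KM = 150_000_000
LIGHT_SPEED_KM_S = 300_000

travel_seconds = SUN_EARTH_KM / LIGHT_SPEED_KM_S  # 500 seconds
travel_minutes = travel_seconds / 60              # just over eight minutes

print(f"{travel_minutes:.1f} minutes")  # -> 8.3 minutes
```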
The farther out you go, the worse it gets. The nearest star to the Solar System, Proxima Centauri, is about 4.2 light years away. So the awe-inspiring panorama of stars in a clear night sky is a snapshot of the past. Some of the stars you're looking at (especially the red supergiants like Antares and Betelgeuse) might actually already have gone supernova, and that information simply hasn't gotten here yet. None of the stars we see are in exactly the same positions, relative to us, as they appear to be right now.
Worst of all is when you look way out, as the James Webb Space Telescope is currently doing, because then you have to account not only for distance, but for the fact that the universe is expanding. And it hasn't expanded at a uniform rate. Current observations support the inflationary model, which says that between 10^-36 and 10^-32 seconds after the Big Bang the universe expanded by a factor of around 10^26. This seems like a crazy conjecture, but it immediately solves two perplexing problems in observational astronomy.
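To get some intuition for a factor of 10^26, here's a toy calculation (the proton radius and the astronomical unit are my own illustrative inputs, not figures from the post): stretch a proton-sized region by 10^26 and it ends up on the order of the Earth-Sun distance.

```python
# A toy sense of scale for inflation's factor of 10**26.
# These two constants are illustrative inputs, not from the post.
PROTON_RADIUS_M = 0.84e-15  # approximate proton charge radius, meters
AU_M = 1.496e11             # one astronomical unit, meters

INFLATION_FACTOR = 1e26

stretched = PROTON_RADIUS_M * INFLATION_FACTOR  # ~8.4e10 m
print(f"{stretched / AU_M:.2f} AU")             # -> 0.56 AU, roughly half the Earth-Sun distance
```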
The first one, the horizon problem, has to do with the homogeneity of space. Look as far out into space as you can in one direction, then do the same thing in the opposite direction, and what you'll see is essentially the same -- the same distribution of matter and energy. The difficulty is that those two points are causally disconnected; they're far enough apart that light hasn't had time to travel from one to the other, and therefore no mechanism of communication can exist between them. By our current understanding of information transfer, once causally disconnected, always causally disconnected. So if something set the initial conditions in point A, how did point B end up with identical conditions if they've never been in contact with each other? It seems like a ridiculous coincidence.
The other one is the flatness problem, which has to do with the geometry of space-time. This subject gets complicated fast, and I'm a layperson myself, but as far as I understand it, the gist is this. The presence of matter warps the fabric of space locally (as per the General Theory of Relativity), but what is its overall geometry? From studies of such phenomena as the cosmic microwave background radiation, it seems like the basic geometry of the universe as a whole is perfectly flat. Once again, there seems to be no particular reason to expect that could occur by accident.
Both these problems are taken care of simultaneously by the inflationary model. The horizon problem disappears if you assume that in the first tiny fraction of a second after the Big Bang, the entire universe was small enough to be causally connected, but during inflation the space itself expanded so fast that it carried pieces of it away faster than light can travel. (This is not forbidden by the Theories of Relativity; matter and energy can't exceed the speed of light, but space-time itself is under no such stricture.) The flatness problem is solved because the inflationary stretching smoothed out any wrinkles and folds that were in space-time at the moment of the Big Bang, just as taking a bunched-up bedsheet and pulling on all four corners flattens it out.
All of this will be facing some serious tests over the next few years as we get better and better at looking out into the far reaches. Just last week a team at the University of Cambridge published a paper in Nature Astronomy about a new technique to look out so far that what you're seeing is only 378,000 years after the Big Bang. (I know that may seem like a long time, but it's only 0.003% of the current age of the universe.) The problem is that prior to this, the universe was filled with a fog of glowing hydrogen atoms, so it was close to opaque. The new technique involves filtering out the "white noise" from the hydrogen haze, much the way you can still see the shadows and contours of the landscape on a foggy day. It's not going to be easy; the signal emitted by the actual objects that were there in the early universe is estimated to be a hundred thousand times weaker than the interference from the glowing fog.
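That "0.003%" figure checks out, by the way; a one-line calculation, using the 378,000-year and 13.8-billion-year numbers from the paragraph above:

```python
# Checking the claim that 378,000 years is only about 0.003% of
# the universe's current age of ~13.8 billion years.
RECOMBINATION_YEARS = 378_000
UNIVERSE_AGE_YEARS = 13.8e9

fraction_percent = 100 * RECOMBINATION_YEARS / UNIVERSE_AGE_YEARS
print(f"{fraction_percent:.4f}%")  # -> 0.0027%, i.e. roughly 0.003%
```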
It's mind-blowing. I've been learning about this stuff for years, but I'm still boggled by it. If I think about it too hard I'm a little like the poor woman in a video with science vlogger Hank Green, who is trying to wrap her brain around the idea that anywhere you look, if you go out far enough, you're seeing the same point in space (i.e. all spots currently 13.8 billion light years from us were condensed into a single location at the moment of the Big Bang), and seems to be about to have a nervous breakdown from the implications. (Hat tip to my friend, the amazing author Robert Chazz Chute, for throwing the video my way.)
So think about all this next time you're looking up into a clear night sky. It's not a bad thing to be reminded periodically how small we are. The universe is a grand, beautiful, amazing, weird place, and how fortunate we are to be living in a time when we are finally beginning to understand how it works.