Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.
Showing posts with label social media.

Tuesday, September 16, 2025

The morass of lies

It will come as no shock to regular readers of Skeptophilia that I really hate it when people make shit up and then misrepresent it as the truth.

Now, making shit up, by itself, is just fine.  I'm a fiction writer, so making shit up is kind of my main gig.  It's when people then try to pass it off as fact that we start having problems.  The trouble is, sometimes the false information sounds plausible, or cool, or interesting -- it often has a "wow!" factor -- enough that it gets spread around via social media, which is one of the most efficient conduits for nonsense ever invented.

Here are three examples of this phenomenon that I saw just within the past twenty-four hours.

The first is about a Miocene-age mammal called Orthrus tartaros, "a distant relative of modern weasels," that was a scary hypercarnivore.  Here's an artist's conception of what Orthrus tartaros looked like:


Problem is, there's no such animal.  In Greek mythology, Orthrus was Cerberus's two-headed brother, who had been given the task of guarding the giant Geryon's cattle, and was killed by Heracles during the tenth of his Twelve Labors.  "Tartaros," of course, comes from Tartarus, the Greek version of hell.  While there are plenty of animals named after characters from Greek myth, this ain't one of them.  In fact, it's the creation of a DeviantArt artist who goes by the handle Puijila, and specializes in "speculative evolution" art that was never intended to represent actual animals.  But along the way, someone swiped Puijila's piece and started passing it around as if it were real.

What's frustrating about this is that there are plenty of prehistoric animals that were scary as fuck, such as the absolutely terrifying gorgonopsids.  You don't need to pretend that an (admittedly extremely talented) artist's fictional creations are part of the real menagerie.

The second one cautioned the tender-hearted amongst us against catching spiders and putting them outdoors.  "Spiders in your house," the post said, "are adapted to living indoors.  95% of the spiders captured and released outside die within 24 hours.  Just let them live inside -- most of them are completely harmless."

[Image licensed under the Creative Commons Ciar, House spider side view 01, CC BY-SA 3.0]

While I agree completely that spiders have gotten an undeserved bad rap, and the vast majority of them are harmless (and in fact beneficial, considering the number of flies and mosquitoes they eat), the rest of this is flat wrong.  First, given that here in the United States conventional houses have only become common in the past two hundred years or so, how did the ancestors of today's North American spiders manage before that, if they were so utterly dependent on living indoors?  And second, how did anyone figure out that "95% of the spiders captured and released outside die within 24 hours"?  Did they fit them with little radio tracking tags, or something?  The claim fails the plausibility test on several levels -- so while the central message of "learn to coexist with our fellow creatures" is well meant, it'd be nice to see it couched in facts rather than made-up nonsense.

The last one is just flat-out weird.  I'd seen it before, but it's popping up again, probably because here in the Northern Hemisphere, it's vegetable-garden-harvest time:


If you "didn't know this" it's probably because it's completely false.  Pepper plants have flowers that botanists call "perfect" (they contain both male and female parts), so they can self-pollinate.  The wall of a pepper -- the part you eat -- comes from the flower's ovary, so honestly, the edible parts of peppers are more female than male (even that's inaccurate if you know much about sexual reproduction in plants, which is pretty peculiar).  The number of bumps has zero to do with either sex or flavor.

So: one hundred percent false.  When you grow or buy peppers, don't worry about the number of bumps, and afterward, use them for whatever you like.

What puzzles me about all this is why anyone would make this kind of stuff up in the first place.  Why would you spend your time crafting social media posts that are certifiable nonsense, especially when the natural world is full of information that's even more cool and weird and mind-blowing, and is actually real?  Once such a post is launched, I get why people pass it along; posts like this have that "One True Fact That Will Surprise You!" veneer, and the desire to share such stuff comes from a good place -- hoping that our friends will learn something cool.

But why would you create a lie and present it as a fact?  That, I don't get.

Now, don't get me wrong; there's no major harm done to the world by people making a mistake and believing in the sexuality of peppers, doomed house spiders, and a Miocene hypercarnivorous weasel.  But it still bothers me, because passing this nonsense along establishes a habit of credulity.  "I saw it on the internet" is the modern-day equivalent of "my uncle's best friend's sister-in-law's cousin swears this is true."  And once you've gotten lazy about checking to see if what you post about trivia is true and accurate, it's a scarily small step to uncritically accepting and reposting falsehoods about much, much more important matters.

Especially given that there are a couple of media corporations I could name that survive by exploiting that exact tendency.

So I'll exhort you to check your sources.  Yes, on everything.  If you can't verify something, don't repost it.  To swipe a line from Smokey Bear, You Too Can Prevent Fake News.  All it takes is a little due diligence -- and a determination not to make the current morass of online lies any worse than it already is.

****************************************


Saturday, August 10, 2024

All the lonely people

I'm a big fan of the band OneRepublic, but I don't think any of their songs has struck me like their 2018 hit "Connection."


"There's so many people here to be so damn lonely."  Yeah, brother, I feel that hard.  This whole culture has fostered disconnection -- or, more accurately, bogus connections.  Social media gives you the appearance of authentic interaction, but the truth is what you see is chosen for you by an algorithm that often has little to do with what (or whom) you're actually interested in.  A host of studies has documented the correlation between frequent social media use and poor mental health, anxiety, depression, and low self-esteem -- but as usual, the causation could run either way.  Rather than social media causing the decline in emotional wellness, it could be that people who are already experiencing depression gravitate toward social media because they lack meaningful real-life connections -- and at least the interactions on Facebook and TikTok and Instagram and whatnot are better than nothing.

Whichever way it goes, it appears that social media, which has long billed itself as being the new way to make friends, has left a great many people feeling more isolated than ever.

I know that's true for me.  I'm pretty shy, and don't get out much.  I volunteer sorting books for our local Friends of the Library book sale once a week; I see my athletic trainer once a week; I have a friend with whom I go for walks on Saturday mornings.  That's about it.  My social calendar is more or less non-existent.  And despite my natural tendency toward introversion, it's not a good thing.  I've had the sense -- undoubtedly inaccurate, but that doesn't make it feel any less real -- that if I were to vanish from the face of the Earth, maybe a dozen people would notice, and half that would care.

It's a hell of a way to live.

Sadly, I'm far from the only person who feels this way.  Disconnection and isolation are endemic in our society, and the scary part is the toll they take.  Not only are there the obvious connections to mental health issues like depression and anxiety, but a study out of Oregon State University published this week in the Journal of Psychology found that chronic loneliness is connected to a slew of other problems -- including poor sleep, nightmares, heart disease, stroke, dementia, and premature death.  The study, which involved 1,600 adults between the ages of eighteen and eighty, was absolutely unequivocal.

"Interpersonal relationships are very much a core human need," said psychologist Colin Hesse, director of the School of Communication in OSU’s College of Liberal Arts, who led the study.  "When people’s need for strong relationships goes unmet, they suffer physically, mentally and socially.  Just like hunger or fatigue means you haven’t gotten enough calories or sleep, loneliness has evolved to alert individuals when their needs for interpersonal connection are going unfulfilled...  Quality restorative sleep is a linchpin for cognitive functioning, mood regulation, metabolism and many other aspects of well-being.  That’s why it’s so critical to investigate the psychological states that disrupt sleep, loneliness being key among them."

The open question is what to do about it.  Social media clearly isn't the answer.  I don't want to paint it all as negative; I have good interactions on social media, and it allows me to keep in touch with friends who live too far away to see regularly, which is why I'm willing to participate in it at all.  But to have those interactions requires wading through all of the other stuff the algorithm desperately wants me to see (including what appear to be eighteen gazillion "sponsored posts," i.e., advertisements).  The bottom line is that people like Mark Zuckerberg and the other CEOs of large social media organizations don't give a flying rat's ass about my feelings; it's all about making money.  If it makes MZ money, you can bet you'll see it lots.  If it doesn't?

Meh.  Maybe.  Probably not.  Certainly you shouldn't count on it.

So the alternative is to try to get out there more and form some authentic connections, which is much easier said than done.  All I know is that it's important.  There may be people in this world who are natural loners, but I suspect they're few and far between.  The majority of us need deep connection with friends, and suffer if we don't have it.

And the Hesse et al. study has shown that there's more at risk than just your mood if you don't.

****************************************



Monday, June 3, 2024

Inside the bubble

A couple of nights ago, my wife and I watched the latest episode in the current series of Doctor Who, "Dot and Bubble."  [Nota bene: this post will contain spoilers -- if you intend to watch it, you should do so first, then come back and read this afterward.]

All I'd heard about it before watching is that it is "really disturbing."  That's putting it mildly.  Mind you, there's no gore; even the monsters are no worse than the usual Doctor Who fare.  But the social commentary it makes puts it up there with episodes like "Midnight," "Cold Blood," and "The Almost People" for leaving you shaken and a little sick inside.

The story focuses on the character of Lindy, brilliantly played by Callie Cooke, who is one of the residents of "Finetime."  Finetime is basically a gated summer camp for spoiled rich kids, where they do some nominal work for two hours a day and spend the rest of the time playing.  Each of the residents is surrounded, just about every waking moment, by a virtual-reality shell showing all their online friends -- the "bubble" of the title -- and the "work" each of them does is mostly to keep their bubbles fully charged so they don't miss anything.


The tension starts to ramp up when the Doctor and his companion, Ruby Sunday, show up unannounced in Lindy's bubble, warning her that people in Finetime are disappearing.  At first she doesn't believe it, but when forced to look people up, she notices an abnormal number of them are offline -- something she'd missed, because the only people she ever sees are the ones who are online, so the disappearances were invisible to her.  She's dismissive of Ruby and downright rude to the Doctor, but eventually she's driven to the realization that there are monsters eating the inhabitants of Finetime one by one.

Reluctantly accepting guidance from the Doctor, she runs for one of the conduits that pass under the city, which will give her a way out of the boundaries into the "Wild Wood," the untamed forests outside the barrier.  Along the way, though, we begin to see that Lindy isn't quite the vapid innocent we took her for at first.  She coldly and unhesitatingly sacrifices the life of a young man who had tried to help her in order to save her own; when she finds out that the monsters had already killed everyone in her home world, including her own mother, she basically shrugs her shoulders, concluding that since they were in a "happier place" it was all just hunky-dory.

It was the end, though, that was a sucker punch I never saw coming.  When she finally meets up with the Doctor and Ruby in person, and the Doctor tells her (and a few other survivors) that they have zero chance of surviving in the Wild Wood without his help, she blithely rejects his offer.

"We can't travel with you," she says, looking at him as if he were subhuman.  "You, sir, are not one of us.  You were kind -- although it was your duty to save me.  Screen-to-screen contact is just about acceptable.  But in person?  That's impossible."

In forty-five minutes, a character who started out seeming simply spoiled, empty-headed, and shallow moved into the territory of "amoral" and finally into outright evil.  That this transformation was so convincing is, once again, due to Callie Cooke's amazing portrayal.

What has stuck with me, though, and the reason I'm writing about it today, is that the morning after I watched it, I took a look at a few online reviews of the episode.  They were pretty uniformly positive (and just about everyone agreed that it was disturbing as hell), but what is fascinating -- and more than a little disturbing in its own right -- is the difference between the reactions of the reviewers who are White and the ones who are Black.

Across the board, the White reviewers thought the take-home message of "Dot and Bubble" is "social media = bad."  Or, at least, social media addiction = bad.  If so, the moral to the story is (to quote Seán Ferrick of the YouTube channel WhoCulture) "as subtle as a brick to the face."  The racism implicit in Lindy's rejection of the Doctor was a shocking twist at the end, adding another layer of yuck to an already awful character.

The Black reviewers?  They were unanimous that the main theme throughout the story is racism (even though race was never once mentioned explicitly by any of the characters).  In the very first scene, it was blatantly obvious to them that every last one of Lindy's online friends is White -- many of them almost stereotypically so.  Unlike the White reviewers, the Black reviewers saw the ending coming from a mile off.  Many of them spoke of having dealt all their lives with sneering, race-based microaggressions -- like Lindy's being willing at least to talk to Ruby (who is White) while rejecting the Doctor (who is Black) out of hand.

When considering "Dot and Bubble," it's easy to stop at it being a rather ham-handed commentary on social media, but really, it's about echo chambers.  Surround yourself for long enough with people who think like you, act like you, and look like you, and you start to believe the people who don't share those characteristics are less than you.

What disturbs me the most is that I didn't see the obvious clues that writer Russell T. Davies left us, either.  When Lindy listens to Ruby and rejects the Doctor, it honestly didn't occur to me that the reason could be the color of his skin.  I didn't even notice that all Lindy's friends were White.  As a result, the ending completely caught me off guard.  As for the subtle (and not-so-subtle) racist overtones of the characters in the episode, I wasn't even aware of them except in retrospect.

But that's one of the hallmarks of privilege, isn't it?  You're not aware of it because you don't have to be.  As a White male, there are issues of safety, security, and acceptance I never even have to think about.  So I guess like Lindy and the other residents of Finetime, I also live in my own bubble, surrounded by people who (mostly) think like I do, never having to stretch myself to consider, "What would it be like if I was standing where they are?"

And what makes the character of Lindy so horrific is that even offered the opportunity to do that -- to step outside of her bubble and broaden her mind a little -- she rejects it.  Even if it means losing the aid of the one person who is able to help her, and without whose assistance she is very likely not to survive.

For myself, my initial blindness to what "Dot and Bubble" was saying was a chilling reminder to keep pushing my own boundaries.  In the end, all I can do is what poet Maya Angelou tells us: "Do the best you can until you know better.  Then, when you know better, do better."

****************************************



Saturday, September 17, 2022

The will to fight

If you're fortunate enough not to suffer from crippling depression and anxiety, let me give you a picture of what it's like.

Last week I started an online class focused on how to use TikTok as a way for authors to promote their books.  So I got the app and created an account -- it's not a social media platform I'd used before -- and made my first short intro video.  I was actually kind of excited, because it seemed like it could be fun, and heaven knows I need some help in the self-promotion department.  (As an aside, if you're on TikTok and would like to follow me, here's the link to my page.)

Unfortunately, it seemed like as soon as I signed up, I started having technical problems.  I couldn't do the very first assignment because my account was apparently disabled, and that (very simple) function was unavailable.  Day two, I couldn't do the assignment because I lacked a piece of equipment I needed.  (That one was my fault; I thought it was on the "optional accessories" list, but I was remembering wrong.)  Day three's assignment -- same as day one; another function was blocked for my account.  By now, I was getting ridiculously frustrated, watching all my classmates post their successful assignments while I was completely stalled, and told my wife I was ready to give up.  I was getting ugly flashbacks of being in college physics and math classes, where everyone else seemed to be getting it with ease, and I was totally at sea.  When the same damn thing happened on day four, my wife (who is very much a "we can fix this" type and also a techno-whiz), said, "Let me take a look."  After a couple of hours of jiggering around with the settings, she seemed to have fixed the problem, and all the functions I'd been lacking were restored.

The next morning, when I got up and got my cup of coffee and thought, "Okay, let me see if I can get started catching up," I opened the app and it immediately crashed.

Tried it again.  Crash.  Uninstalled and reinstalled the app.  Crash.

[Image licensed under the Creative Commons LaurMG., Frustrated man at a desk (cropped), CC BY-SA 3.0]

I think anyone would be frustrated at this point, but my internal voices were screaming, "GIVE UP.  YOU SHOULD NEVER HAVE SIGNED UP FOR THIS.  YOU CAN'T DO IT.  IT FIGURES.  LOSER."  And over and over, like a litany, "Why bother.  Why bother with anything."  Instead of the frustration spurring me to look for a solution, it triggered my brain to go into overdrive demanding that I give up and never try again.

When I heard my wife's alarm go off an hour later, I went and told her what had happened, trying not to frame it the way I wanted to, which was "... so fuck everything."  She sleepily said, "Have you tried turning your phone completely off, then turning it back on?"  Ah, yes, the classic go-to for computer problems, and it hadn't occurred to me.  So I did...

... and the app sprang back to life.

But now I was on day five of a ten-day course, and already four assignments behind.  That's when the paralyzing anxiety kicked in.  I had told the instructors of the course a little about my tech woes, and I already felt like I had been an unmitigated pest, so the natural course of action -- thinking, "you paid for this course, tell the instructors and see if they can help you catch up" -- filled me with dread.  I hate being The Guy Who Needs Special Help.  I just want to do my assignments, keep my head down, fly under the radar, be the reliable work-horse who gets stuff done.  And here I was -- seemingly the only one in the class who was being thwarted by mysterious forces at every turn.

So I never asked.  The more help I needed, the more invisible I became.  It's now day seven, and I'm maybe halfway caught up, and I still can't bring myself to tell them the full story of what was going on.

Adversity > freak out and give up.  Then blame yourself and decide you should never try anything new ever again.  That's depression and anxiety.

I've had this reaction pretty much all my life, and it's absolutely miserable.  It most definitely isn't what I was accused of over and over as a child -- that I was choosing to be this way to "get attention" or to "make people feel sorry for me."  Why the fuck would anyone choose to live like this?  All I wanted as a kid -- all I still want, honestly -- is to be normal, not to have my damn brain sabotage me any time the slightest thing goes wrong.  As I told my wife -- who, as you might imagine, has the patience of a saint -- "some days I would give every cent I have to get a brain transplant."

So re: TikTok, if Carol hadn't been there, I'd have deleted my account and forfeited the tuition for the class.  But I'm happy to report that I haven't given up, and I've posted a few hopefully mildly entertaining videos, which I encourage you to peruse.

The reason all this comes up, though, isn't just because of my social media woes.  I decided to write about this because of some research published this week in the journal Translational Psychiatry, which found that a single gene -- called Tob -- seems to mediate resilience to emotional stress in mice, and that its absence produces exactly the "freak out and give up" response people have when they suffer from depression and anxiety.

Tob was already the subject of intense research because it apparently plays a role in the regulation of the cell cycle, cancer suppression, and the immune system.  It's known that in high-stress situations Tob rapidly switches on, so it is somehow involved in the fight-flight-freeze response.  And a team led by Tadashi Yamamoto of the Okinawa Institute of Science and Technology found that "Tob-knockout mice" -- mice that have been genetically engineered to lack the Tob gene -- simply gave up when they were in stressful situations requiring resilience and sustained effort.  Put another way, without Tob, they completely lost the will to fight.

When I read this article -- which I came across while I was in the midst of my struggle with technology -- I immediately thought, "Good heavens, that's me."  Could my tendency to become frustrated and overwhelmed easily, and then give up in despair, be due to the underactivity of a gene?  I know that depression and anxiety run in my family; my mother and maternal grandmother definitely struggled with them, as does my elder son.  Of course, it's hard to tease apart the nature/nurture effects in this kind of situation.  It's a reasonable surmise that being raised around anxious, stressed people would make a kid anxious and stressed.

But it also makes a great deal of sense that these familial patterns of mental illness could be because there's a faulty gene involved.

Research like Yamamoto et al.'s is actually encouraging; identifying a genetic underpinning to mental illnesses like the one I have suffered from my entire life opens up a possible target for treatment.  Because believe me, I wouldn't wish this on anyone.  While fighting with a silly social media platform might seem, to someone who isn't mentally ill, like a shrug-inducing "it's no big deal, why are you getting so upset?" situation, for people like me, everything is a big deal.  I've always envied people who seem to be able to let things roll off them; whatever the reason -- whether it came from the environment I grew up in or from a defective Tob gene -- I've never been able to do that.  Fortunately, my family and friends are loving and supportive, understand what I go through sometimes, and are there to help.

But wouldn't it be wonderful if this kind of thing could be fixed permanently?

****************************************


Saturday, May 28, 2022

Social media dissociation

I suspect that many of my readers will resonate with my desire to fritter away less time on social media.

I don't mean the actual "social" part of social media.  I have friends whom I seldom if ever get to see, and especially since the pandemic started, visiting online is about my only opportunity.  I greatly value those conversations.  What I'm referring to is the aimless scrolling, looking for new content, any new content.  Trying to find a distraction even though I know that a dozen other things, from listening to some music, to playing with my dogs, to going for a run -- even weeding the garden -- will leave me feeling better.

But -- once again, as I'm sure many of you can attest -- it can be exceedingly hard to say "enough" and close the app.  It was one thing when your connectivity had to be via a desktop or laptop computer; but now that just about all of us (even me, Luddite though I am) are carrying around our social media addiction in our pockets, it's way too easy to say "just a few more minutes" and drop back into the world of scrolling.

One effect I've noticed it's had on me is a shortening of my attention span.  Something has to be absolutely immersive to keep my attention for over five minutes.  Two of my favorite YouTube science channels, the wonderful Veritasium and physicist Sabine Hossenfelder's awesome Science Without the Gobbledygook, have videos that average about ten to twelve minutes long, and man... sometimes that is a struggle, however fascinating the topic.

I don't like this trend.  I won't say I've ever had the best of focus -- distractions and my wandering mind have been issues since I was in grade school -- but social media have made it considerably worse.  Frequently I think about how addicted I am to scrolling, and it's a real cause of worry.

But then I start scrolling again and forget all about it.

That last bit was the subject of a study from the University of Washington that was presented last month at the CHI Conference on Human Factors in Computing Systems.  In "'I Don’t Even Remember What I Read': How Design Influences Dissociation on Social Media," a team led by Amanda Baughan looked at how social media apps are actually designed to have this exact effect -- and argued that although we frequently call it an addiction, it is more accurately described as dissociation.

"Dissociation is defined by being completely absorbed in whatever it is you're doing," Baughan said, in an interview with Science Daily.  "But people only realize that they've dissociated in hindsight.  So once you exit dissociation there's sometimes this feeling of: 'How did I get here?'  It's like when people on social media realize: 'Oh my gosh, how did thirty minutes go by?  I just meant to check one notification.'"

Which is spot-on.  Even the title is a bullseye; after a half-hour on Twitter, I'd virtually always be hard-pressed to tell you the content of more than one or two of the tweets I looked at.  The time slips by, and it feels very much like I glance up at the clock, and three hours are gone without my having anything at all to show for it.

It always reminds me of a quote from C. S. Lewis's The Screwtape Letters.  While I (obviously) don't buy into the theology, his analysis of time-wasting by the arch-demon Screwtape is scarily accurate:
As this condition becomes more fully established, you will be gradually freed from the tiresome business of providing Pleasures as temptations.  As the uneasiness and his reluctance to face it cut him off more and more from all real happiness, and as habit renders the pleasures of vanity and excitement and flippancy at once less pleasant and harder to forgo (for that is what habit fortunately does to a pleasure) you will find that anything or nothing is sufficient to attract his wandering attention.  You no longer need a good book, which he really likes, to keep him from his prayers or his work or his sleep; a column of advertisements in yesterday’s paper will do.  You can make him waste his time not only in conversation he enjoys with people whom he likes, but in conversations with those he cares nothing about on subjects that bore him.  You can make him do nothing at all for long periods.  You can keep him up late at night, not roistering, but staring at a dead fire in a cold room.  All the healthy and outgoing activities which we want him to avoid can be inhibited and nothing given in return, so that at last he may say, as one of my own patients said on his arrival down here [in hell], "I now see that I spent most of my life in doing neither what I ought nor what I liked."

That last line, especially, is a fair knockout, and it kind of makes me suspicious that social media may have been developed down in hell after all.

Baughan, however, says maybe we shouldn't be so hard on ourselves.  "I think people experience a lot of shame around social media use," she said.  "One of the things I like about this framing of 'dissociation' rather than 'addiction' is that it changes the narrative.  Instead of: 'I should be able to have more self-control,' it's more like: 'We all naturally dissociate in many ways throughout our day -- whether it's daydreaming or scrolling through Instagram, we stop paying attention to what's happening around us.'"

Even so, for a lot of us, it gets kind of obsessive at times.  It's worse when I'm anxious or depressed, when I crave a distraction not only from unpleasant external circumstances but from the workings of my own brain.  And it's problematic that when that occurs, the combination of depression and social media creates a feedback loop that keeps me from seeking out activities -- which sometimes just means turning off the computer and doing something, anything, different -- that would actually shake me out of my low mood.

But she's right that shaming ourselves isn't productive, either.  Maybe a lot of us could benefit by some moderation in our screen time, but self-flagellation doesn't accomplish anything.  I'm not going to give up on social media entirely -- like I said, without it I would lose touch with too many contacts I value -- but setting myself some stricter time limits is probably a good idea.

And now that you've read this, maybe it's time for you to shut off the device, too.  What are you going to do instead?  I think I'll go for a run.

**************************************

Friday, April 1, 2022

Moodscrolling

I think one of the reasons I have a love/hate relationship with Twitter is that my feed sounds way too much like my brain.

I do a lot of what I call "hamster-wheeling."  Just sitting there -- or, worse, lying in bed at night trying to sleep -- I get a running litany of disconnected thoughts that leap about my cerebral cortex like a kangaroo on crack.  Think about that, and look at this selection of tweets that I pulled from the first few scroll-downs of my feed this morning, and which I swear I'm not making up:

  • I'm putting everyone on notice that I'm not taking any shit today.
  • Wow, I've got bad gas.  My apologies to my coworkers.
  • I'm on vacation why am I up at 6 AM scrolling on Twitter
  • In England in the 1880s, "pants" was considered a dirty word.
  • I wonder how Weeping Angels reproduce.  Do they fuck?  I'd fuck a Weeping Angel, even though I'd probably regret it.
  • Super serious question.  Does anyone still eat grilled cheese sandwiches?
  • A stranger at the gym just told me I should dye my beard because it's got gray in it.  WTF?
  • Doo-dah, doo-dah, all the live-long day
The only tweets I didn't consider including were purely political ones and people hawking their own books, which admittedly make up a good percentage of the total.  But if you take those out, what's left is, in a word, bizarre.  In three words, it's really fucking bizarre.

Me, I find my hamster-wheeling thoughts annoying and pointless; I can't imagine that anyone else would want to hear them.  For criminy's sake, even I don't want to hear them.

So why the hell do I stay on Twitter?

I think part of it is insufficient motivation to do what it would take to delete my account, but part of it is that despite the weird, random content, I still find myself spending time just about every day scrolling through it. 

[Image licensed under the Creative Commons MatthewKeys, Twitter's San Francisco Headquarters, CC BY-SA 3.0]

I've noticed that my tendency to waste time on social media is inversely proportional to my mood.  When I'm in a good mood, I can always find more interesting things to do; when I'm low, I tend to sit and scan through Twitter and Facebook, sort of waiting for something to happen that will lift me up, get me interested, or at least distract me.

"Moodscrolling" is the way I think of it.

I'm apparently not the only one.  A team at Fudan University (China) found that social media use and depression and anxiety were strongly correlated -- and that both had increased dramatically since the pandemic started.  It seems to be an unpleasant positive feedback loop; the worse things get and the more isolated we are, the more depressed and anxious we get (understandably), and the more we seek out contact on social media.  Which, because of its weird content, often outright nastiness, and partisan rancor (you should see some of the political tweets I decided not to post), makes us feel worse, and round and round it goes.  Breaking the cycle by forcing yourself to stand up and walk away from the computer is hard when you're already feeling down; especially so now that it's all available on our phones, so the option of consuming social media is seldom farther away than our own pockets.

It's not that I think it's all bad.  If it was, I would delete my account.  I've met some very nice people in Twitter communities I've joined -- fellow fiction writers and Doctor Who fans are two that come to mind.  Facebook, on the other hand, lets me stay in touch with dear friends whom I seldom get to see.  But there's no doubt that if you did a cost-benefit analysis -- the amount of time I spend on social media as compared to the positive stuff I get from it -- it would show numbers that are seriously in the red.

Walking away, though, takes willpower, and that's exactly what depressed and anxious people tend to lack.  The study I linked above makes me more certain that's what I need to do.  The random, disjointed thoughts my own brain comes up with are enough; I don't need to see everyone else's.

Although I have to admit that the guy who posted about the Weeping Angels asks a good question.  Not only are they made of stone, they all appear to be female.  And if you watch Doctor Who, there certainly seems to be a lot of them.  For the record, though, I am not in the least interested in having sex with one, even if it turns out they're somehow capable of it.  Those things are seriously creepy.

**************************************

Friday, March 18, 2022

Birds of a feather

I should probably avoid social media altogether, given what a cesspit of ugliness it can be sometimes.

Unfortunately, it's provided the simplest way of keeping in touch with dear friends I seldom see, especially during the height of the pandemic (when I kind of wasn't seeing anyone).  But to say it amplifies the echo chamber effect is an understatement.  Not only do we tend to link on social media to like-minded folks (can't tell you how many times I've heard someone say that they'd unfriended someone solely because of some opinion or another, usually political), but with the few non-like-minded social media friends we have and keep, it takes so much energy to argue that most of us just sigh heavily, shrug our shoulders, and move on, even when confronted with opinions completely antithetical to our own.

Take, for example, what I saw posted yesterday -- a meme saying, "All I'm saying is, if my dog got three rabies shots and then still got rabies, I'd begin to get suspicious."  (It took all my willpower not to respond, "Oh, how I wish that was all you were saying.")  In any case, not only does the post trumpet zero understanding about how vaccinations and immunity work, it's back to the maddening phenomenon of a layperson thinking an opinion formed from watching Fox News and doing a ten-minute read of some guy's website constitutes "research."


If that wasn't bad enough, a friend-of-the-friend -- no one I know -- responded, "It's what comes from drinking the libtard kool-aid."  So, let's take the ignorant post and make it worse by slathering on some ugly vitriol demeaning half the residents of the country.

And what did I do in response?

Nothing.

I just didn't have the energy to get drawn in.  Plus, there's a sense of such argument being futile anyhow.  I seriously doubt anyone, in the history of the internet, has ever had their opinion changed by arguing a point online with a total stranger.

Only a few minutes after seeing the post, though, I stumbled on some research out of the University at Buffalo that contains at least a glimmer of hope: the screeching you hear on social media isn't necessarily reflective of the attitudes that the majority of people have, because these platforms amplify the loudest voices -- not necessarily the ones that make the best sense, or are even the most common.

In a paper in The Journal of Computer-Mediated Communication, Yini Zhang, Fan Chen, and Karl Rohe looked at our tendency to form "flocks" on social media.  By studying the posts from 193,000 Twitter accounts, and the 1.3 million accounts those accounts follow, they were able to uncover patterns of tweets and retweets, and found the strongest-worded opinions were the ones that got liked and retweeted the most.  They called this phenomenon murmuration -- the term comes from the flocking behavior of starlings -- capturing the idea that online expression of opinions forms and shifts not based on actual changes in the information available, but on who is saying what, and how stridently.

"By identifying different flocks and examining the intensity, temporal pattern and content of their expression, we can gain deeper insights far beyond where liberals and conservatives stand on a certain issue," said study lead author Yini Zhang, in an interview in Science Daily.  "These flocks are segments of the population, defined not by demographic variables of questionable salience, like white women aged 18-29, but by their online connections and response to events.  As such, we can observe opinion variations within an ideological camp and opinions of people that might not be typically assumed to have an opinion on certain issues.  We see the flocks as naturally occurring, responding to things as they happen, in ways that take a conversational element into consideration."

The fact that the social media flocking doesn't mirror the range of opinion out there is heartening, to say the least.  "[S]ocial media public opinion is twice removed from the general public opinion measured by surveys," Zhang said.  "First, not everyone uses social media.  Second, among those who do, only a subset of them actually express opinions on social media.  They tend to be strongly opinionated and thus more willing to express their views publicly."
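
For the technically inclined: finding "flocks" like the ones Zhang and her colleagues describe is, at heart, a community-detection problem on the who-follows-whom graph.  Here's a minimal sketch of the general idea in Python -- a toy with six invented accounts and an off-the-shelf community-detection routine, emphatically not the actual machinery of the Zhang et al. paper:

    # Toy sketch of "flock" detection: community detection on a small,
    # invented follower network.  Not the Zhang et al. method itself.
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    # An edge means "follows or retweets"; all the names are made up.
    edges = [
        ("alice", "bob"), ("bob", "carol"), ("alice", "carol"),   # one flock
        ("dave", "erin"), ("erin", "frank"), ("dave", "frank"),   # another
        ("carol", "dave"),                                        # a weak tie
    ]

    G = nx.Graph(edges)
    for i, flock in enumerate(greedy_modularity_communities(G)):
        print(f"Flock {i}: {sorted(flock)}")

At the scale of the actual study the graph is vastly bigger, but the principle is the same: the structure of who follows and retweets whom is enough to pull the flocks apart, without reading a single word of the tweets themselves.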

It's not just political discourse that can be volatile.  A friend of mine just got blasted on Facebook a couple of days ago, out of the blue, because she posts stuff intended to be inspirational or uplifting, and one of her Facebook friends accused her of being "self-righteous," and went on to lambaste her for her alleged holier-than-thou attitude.  The individual in question doesn't have a self-righteous bone in her whole body -- she might be the only person I know who has more of a tendency to anxious self-doubt than I do -- so it was a ridiculous accusation.  But it does exemplify the sad fact that a lot of us feel freer to be unkind to people online than we ever would face-to-face.  

The important point here is that it's easy to see the nastiness and foolishness on social media and conclude that this is the way the majority of the public believes and acts, but the Zhang et al. study suggests that the majority of the opinions of this sort are generated by a few strident people.  Only afterward do those posts act like a magnet to the like-minded followers they already had.

So as hard as it is to keep in mind sometimes, I maintain that the majority of people are actually quite nice, and want the same things we want -- safety, security, the basic necessities, health and happiness for our friends and family.  The ugly invective from people like the guy who made the "libtard" comment is far from a majority opinion, and shouldn't feed into a despairing sense that everyone is horrible.

The flocks, apparently, aren't led by the smartest birds, just the ones who squawk the loudest.  A lot of the rest are tagging along for the ride.  There's a broader population at the center, opinion-wise, than you'd think, judging by what you see on social media.  And when the birds step away from social media, most of them turn out to be ordinary tweeters just trying to stay with the flock-mates they feel the most comfortable with.

**************************************

Tuesday, August 17, 2021

Reinforcing outrage

I got onto social media some years ago for two main reasons: to stay in touch with people I don't get to see frequently (which since the pandemic has been pretty much everyone), and to have a platform for marketing my books.

I'm the first to admit that I'm kind of awful at the latter.  I hate marketing myself, and even though I know I won't be successful as an author if no one ever hears about my work, it goes against the years of childhood training in such winning strategies as "don't talk about yourself" and "don't brag" and (my favorite) "no one wants to hear about that" (usually applied to whatever my current main interest was).

I'm still on Facebook, Twitter, and Instagram, although for me the last-mentioned seems to mostly involve pics of my dog being cute.  It strikes me on a daily basis, though, how quickly non-dog-pic social media can devolve into a morass of hatefulness -- Twitter seems especially bad in that regard -- and also that I have no clue how the algorithms work that decide for you what you should and should not look at.  It's baffling to me that someone will post a fascinating link or trenchant commentary and get two "likes" and one retweet, and then someone else will post a pic of their lunch and it'll get shared far and wide.

So I haven't learned how to game the system, either to promote my books or to get a thousand retweets of a pic of my own lunch.  Maybe my posts aren't angry enough.  At least that seems to be the recommendation of a study at Yale University that was published last week in Science Advances, which found that expressions of moral outrage on Twitter are more often rewarded by likes and retweets than emotionally neutral ones.

[Image licensed under the Creative Commons "Today Testing" (For derivative), Social Media Marketing Strategy, CC BY-SA 4.0]

Apparently, getting likes and retweets is the human equivalent of the bell ringing for Pavlov's dog.  When our posts are shared, it gives us incentive to post others like them.  And since political outrage gets responses, we tend to move in that direction over time.  Worse still, the effect is strongest for people who are political moderates, meaning the suspicion a lot of us have had for a while -- that social media feeds polarization -- looks like it's spot-on.

"Our studies find that people with politically moderate friends and followers are more sensitive to social feedback that reinforces their outrage expressions,” said Yale professor of psychology Molly Crockett, who co-authored the study.  "This suggests a mechanism for how moderate groups can become politically radicalized over time — the rewards of social media create positive feedback loops that exacerbate outrage...  Amplification of moral outrage is a clear consequence of social media’s business model, which optimizes for user engagement.  Given that moral outrage plays a crucial role in social and political change, we should be aware that tech companies, through the design of their platforms, have the ability to influence the success or failure of collective movements.  Our data show that social media platforms do not merely reflect what is happening in society.  Platforms create incentives that change how users react to political events over time."

Which is troubling, if not unexpected.  Social media may not just be passively encouraging polarization, but deliberately exploiting our desire for approval.  In doing so, they are not just recording the trends, but actively influencing political outcomes.
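
Just to illustrate the mechanism, here's a tiny toy simulation of that feedback loop.  Every number in it is made up; it's a cartoon of the study's conclusion, not a model anyone should take quantitatively:

    # Toy simulation of the outrage feedback loop: whichever kind of post
    # earns more likes gets reinforced, so the mix drifts over time.
    # All of the numbers here are invented for illustration.
    import random

    random.seed(42)
    p_outrage = 0.2       # starting chance that a given post is an angry one
    LEARNING_RATE = 0.05

    for day in range(200):
        outraged = random.random() < p_outrage
        # Pretend engagement: outrage posts average twice the likes.
        likes = random.gauss(20, 5) if outraged else random.gauss(10, 5)
        reward = max(likes, 0) / 20
        if outraged:
            # An angry post got rewarded: post angry stuff more often.
            p_outrage += LEARNING_RATE * reward * (1 - p_outrage)
        else:
            # A neutral post got rewarded: post angry stuff less often.
            p_outrage -= LEARNING_RATE * reward * p_outrage
        p_outrage = min(max(p_outrage, 0.01), 0.99)

    print(f"Chance of an outrage post after 200 days: {p_outrage:.2f}")

The specific numbers are meaningless; the point is that when one kind of post reliably earns more reward, ordinary reinforcement drifts the poster in that direction -- no conscious decision to become angrier required.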

It's scary how easily manipulated we are.  The catch-22 is that any attempt to rein in politically incendiary material on social media runs immediately afoul of the rights of free speech; it took Facebook and Twitter ages to put the brakes on posts about the alleged danger of the COVID vaccines and the "Big Lie" claims of Donald Trump and his cronies that Joe Biden stole the election last November.  (A lot of those posts are still sneaking through, unfortunately.)  So if social media is feeding polarization with malice aforethought, the only reasonable response is to think twice about liking and sharing sketchy stuff -- and when in doubt, err on the side of not sharing it.

Either that, or exit social media entirely, something that several friends of mine have elected to do.  I'm reluctant -- there are people, especially on Facebook, who I'd probably lose touch with entirely without it -- but I don't spend much time on it, and (except for posting links to Skeptophilia every morning) hardly post at all.  What I do post is mostly intended for humor's sake; I avoid political stuff pretty much entirely.

So that's our discouraging, if unsurprising, research of the day.  It further reinforces my determination to spend as little time doomscrolling on Twitter as I can.  Not only do I not want to contribute to the nastiness, I don't need the reward of retweets pushing me any further into outrage.  I'm outraged enough as it is.

************************************

I was an undergraduate when the original Cosmos, with Carl Sagan, was launched, and being a physics major and an astronomy buff, I was absolutely transfixed.  My co-nerd buddies and I looked forward to the new episode each week and eagerly discussed it the following day between classes.  And one of the most famous lines from the show -- ask any Sagan devotee -- is, "If you want to make an apple pie from scratch, first you must invent the universe."

Sagan used this quip as a launching point into discussing the makeup of the universe on the atomic level, and where those atoms had come from -- some primordial, all the way to the Big Bang (hydrogen and helium), and the rest formed in the interiors of stars.  (Giving rise to two of his other famous quotes: "We are made of star-stuff," and "We are a way for the universe to know itself.")

Since Sagan's tragic death in 1996 at the age of 62 from a rare blood cancer, astrophysics has continued to extend what we know about where everything comes from.  And now, experimental physicist Harry Cliff has put together that knowledge in a package accessible to the non-scientist, and titled it How to Make an Apple Pie from Scratch: In Search of the Recipe for our Universe, From the Origin of Atoms to the Big Bang.  It's a brilliant exposition of our latest understanding of the stuff that makes up apple pies, you, me, the planet, and the stars.  If you want to know where the atoms that form the universe originated, or just want to have your mind blown, this is the book for you.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]



Friday, May 14, 2021

The network of nonsense

I've long been fascinated with communication network theory -- the model that maps out the rules behind the spread of information (and its ugly cousin, disinformation).  Back in my day (you'll have to imagine me saying this in a creaky old-geezer voice) both moved a lot more slowly; communities devoted to conspiracies, for example, had to rely on such clunky modes of transmission as newsletters, magazines, and word-of-mouth.

Now?  The internet, and especially social media, have become rapid-transit networks for bullshit.  The phenomenon of a certain idea, video, meme, or link "going viral" has meant that virtually overnight, it can go from being essentially unknown to basically everyone who is online seeing it.  There was nothing even close to comparable forty years ago.

Communication network theory looks at connectedness between different communities and individuals, the role of nodes (people or groups who are multiply-connected to many other people and groups), and "tastemakers" -- individuals whose promotion of something virtually guarantees it will gain widespread notice.  The mathematics of this model is, unfortunately, over my head, but the concepts are fascinating.  Consider the paper that came out this week in the journal Social Media and Society, "From 'Nasa Lies' to 'Reptilian Eyes': Mapping Communication About 10 Conspiracy Theories, Their Communities, and Main Propagators on Twitter," by Daniela Mahl, Jing Zeng, and Mike Schäfer of the University of Zürich.

In this study, they looked at the communities that have grown up around ten different conspiracy theories:

  1. Agenda 21, which claims that the United Nations has a plan to strip nations of their sovereignty and launch a one-world government
  2. The anti-vaccination movement
  3. The Flat Earthers
  4. Chemtrails -- the idea we're being dosed with psychotropic chemicals via jet exhaust contrails
  5. Climate change deniers
  6. Directed energy weapons -- high-intensity beams are being used to kill people and start natural disasters like major forest fires
  7. The Illuminati
  8. Pizzagate -- the claim that the Democrats are running some kind of nationwide human trafficking/pedophilia ring
  9. The Reptilians -- many major world leaders are reptilian aliens in disguise, and you can sometimes catch a glimpse of their real appearance in video clips
  10. "9/11 was an inside job"

They also looked at connections to two non-conspiracy communities -- pro-vaccination and anti-flat-Earth.

The researchers analyzed thousands of different accounts and tens of thousands of tweets to see what kind of overlap there was between these twelve online communities, based on hashtag use, retweets, and so on.
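
Just to make the idea of "overlap" concrete, here's a toy sketch in Python of how you might score the connections between communities by the hashtags their members share.  I want to stress that this is my own little illustration, not Mahl et al.'s actual pipeline, and every community and hashtag set in it is invented:

    # Toy sketch: scoring overlap between online communities by shared
    # hashtag use.  An illustration only -- not Mahl et al.'s method.
    import networkx as nx
    from itertools import combinations

    # Invented data: each community maps to the hashtags its accounts use.
    community_hashtags = {
        "chemtrails":  {"#chemtrails", "#wakeup", "#illuminati", "#truth"},
        "illuminati":  {"#illuminati", "#nwo", "#wakeup", "#truth"},
        "reptilians":  {"#reptilians", "#nwo", "#illuminati"},
        "flat_earth":  {"#flatearth", "#nasalies"},
        "pro_vaccine": {"#vaccineswork", "#science"},
    }

    # One node per community; an edge's weight is the Jaccard similarity
    # (shared hashtags divided by total hashtags) between two communities.
    G = nx.Graph()
    G.add_nodes_from(community_hashtags)
    for a, b in combinations(community_hashtags, 2):
        shared = community_hashtags[a] & community_hashtags[b]
        union = community_hashtags[a] | community_hashtags[b]
        if shared:
            G.add_edge(a, b, weight=round(len(shared) / len(union), 2))

    for node in G.nodes:
        links = [(nbr, G[node][nbr]["weight"]) for nbr in G.neighbors(node)]
        print(f"{node}: {links if links else 'no connections'}")

Communities that share hashtags get linked, and ones that share nothing end up as isolated nodes -- which, as you'll see below, is essentially the position the Flat Earthers occupy in the real data.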

What they found was that the communities studied formed eight tightly-networked clusters.  Here's a diagram of their results:


There are a couple of interesting features of this.

First, six of the communities are so entangled that they form two multiply-connected clusters: one linking chemtrails, the Illuminati, and the Reptilians, and another linking Pizzagate, 9/11 trutherism, and climate change denial.  Both make sense considering who is pushing each of them -- the first by such conspiracy loons as David Icke, and the second by far-right media like Fox, OAN, and Newsmax.

Note, however, that even though three of the other conspiracy theories -- the anti-vaxxers, Agenda 21, and directed energy weapons -- are distinct enough to form their own nodes, they still have strong connections to all the others.  The only one that stands out as essentially independent of the rest is the Flat Earthers.

Evidently the Flerfs are so batshit crazy that even the other crazies don't want to have anything to do with them.

This demonstrates something that I've long believed; that acceptance of one loony idea makes you more likely to fall for others.  Once you've jettisoned evidence-based science as your touchstone for deciding what is the truth, you'll believe damn near anything.

The other thing that jumps out at me is that the pro-vaccine and anti-flat-Earth groups have virtually no connections to any of the others.  They are effectively closed off from the groups they're trying to counter.  What this means is discouraging; that the people working to fight the network of nonsense by creating accounts dedicated to promoting the truth are sitting in an echo chamber, and their well-meant and fervent messages are not reaching the people whose minds need to be changed.

It's something that I've observed before; that it's all very well for people on Twitter and Facebook to post well-reasoned arguments about why Tucker Carlson, Tomi Lahren, Marjorie Taylor Greene, and Lauren Boebert are full of shit, but they're never going to be read by anyone who doesn't already agree.

It's why Fox News is so insidious.  Years ago, they and their spokespeople, commentators like Rush Limbaugh and Ann Coulter, started off by convincing their listeners that everyone else was lying.  Once you've decided that the only way to get the truth is to rely on one single source, you're at the mercy of the integrity and accuracy of that source.  In the case of Fox, you are vulnerable to being manipulated by a group of people whose representation of the news is so skewed it has run afoul of Great Britain's Office of Communications multiple times on the basis of inaccuracy, partiality, and inflammatory content.  (And in fact, last year Fox began an international streaming service in the UK, largely motivated by the fact that online content is outside the jurisdiction of the Office of Communications.)

Mahl et al. write:

Both anti-conspiracy theory communities, Anti-Flat Earth and Pro-Vaccination, are centered around scientists and medical practitioners.  Their use of pro-conspiracy theory hashtags likely is an attempt to directly engage and confront users who disseminate conspiracy theories.  Studies from social psychology have shown that cross-group communication can be an effective way to resolve misunderstandings, rumors, and misinformation.  By deliberately using pro-conspiracy hashtags, anti-conspiracy theory accounts inject their ideas into the conspiracists’ conversations.  However, our study suggests that this visibility does not translate into cross-group communication, that is, retweeting each other’s messages.  This, in turn, indicates that debunking efforts hardly traverse the two clusters.

I wish I had an answer to all this.  It's one thing if a group of misinformed people read arguments countering their beliefs and reject them; it's another thing entirely if the misinformed people are so isolated from the truth that they never even see it.  Twitter and Facebook have given at least a nod toward deplatforming the worst offenders -- one study found that the flow of political misinformation on Twitter dropped by 75% after Donald Trump's account was suspended -- but that's not dealing with the problem as a whole, because even if you deplatform the people responsible for the wellspring of bullshit, there will always be others waiting in the wings to step in and take over.

However discouraging this is, it does mean that the skeptics and science types can't give up.  Okay, we're not as multiply-connected as the wackos are; so we have to be louder, more insistent, more persistent.  Saying "oh, well, nothing we can do about it" and throwing in the towel will have only one effect; making sure the disinformation platforms reach more people and poison more conduits of discourse.

And I, for one, am not ready to sit back and accept that as inevitable.

********************************

I have often been amazed and appalled at how the same evidence, the same occurrences, or the same situation can lead two equally intelligent people to entirely different conclusions.  How often have you heard of people committing similar crimes and receiving wildly different sentences, or of identical symptoms in two different patients resulting in completely different diagnoses or treatments?

In Noise: A Flaw in Human Judgment, authors Daniel Kahneman (whose wonderful book Thinking, Fast and Slow was a previous Skeptophilia book-of-the-week), Olivier Sibony, and Cass Sunstein analyze the cause of this "noise" in human decision-making, and -- more importantly -- discuss how we can avoid its pitfalls.  Anything we can do to detect and expunge biases is a step in the right direction; even if the majority of us aren't judges or doctors, most of us are voters, and our decisions can make an enormous difference.  Those choices are critical, and it's incumbent upon us all to make them in the most clear-headed, evidence-based fashion we can manage.

Kahneman, Sibony, and Sunstein have written a book that should be required reading for anyone entering a voting booth -- and should also be a part of every high school curriculum in the world.  Read it.  It'll open your eyes to the obstacles we have to logical clarity, and show you the path to avoiding them.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]



Thursday, November 12, 2020

Content creation mania

While I don't want to excuse mental laziness, I think it's understandable sometimes if laypeople come to the conclusion that for every expert, there is an equal and opposite expert.

I ran into a good example of this over at Science Daily yesterday, when I read an article about the modern penchant for "creating content" wherever we go -- by which they mean things like taking photos and posting them on social media, tweeting or Facebook posting during experiences like concerts, sports events, and political rallies, and just in general never doing anything without letting the world know about it.

I'm not a social media addict by any stretch of the imagination, but I know I have that tendency sometimes myself.  I've tried to avoid Twitter ever since the presidential race really heated up, because I very quickly got sick of all the posturing and snarling and TWEETS IN ALL CAPS from people who should know better but apparently have the decorum and propriety of Attila the Hun.  I find Instagram a lot more fun because it's all photographs, and there's less opportunity for vitriol.  Even so, I still post on both pretty regularly, even if I don't reach the level of Continuous Live-Stream Commentary some people do.  (For what it's worth, I'm on Twitter @TalesOfWhoa and Instagram @skygazer227.  You're welcome to follow me on either or both.  Be forewarned, however: if you follow me on Instagram, you'll mostly see pics of my dogs, gardens, pottery projects, and various running-related stuff.)

[Image is in the Public Domain]

The content-creation study, which appeared in the Journal of Marketing and was a team effort between researchers at Rutgers and New York Universities, found that -- contrary to the conventional wisdom that if you really want to enjoy something you should put away your phone -- enjoyment and appreciation of an experience increase when people are allowed to do things like tweet, post to Facebook, or take and share photographs.  "In contrast to popular press advice," said study co-author Gabriela Tonietto, "this research uncovers an important benefit of technology's role in our daily lives... by generating content relevant to ongoing experiences, people can use technology in a way that complements, rather than interferes with, their experiences."

The problem is, this runs afoul of other studies that have shown social media engagement to be strongly correlated with depression, anxiety, and disconnection from face-to-face contact with others.  A quick search will give you as many links as you like to peer-reviewed research -- not just quick-takes in popular magazines -- warning of the dangers of spending time on social media.  Pick any one of these and you'll come away with the impression that whatever facet of social media the study looked at is the root of all modern psychiatric disorders.

Humans, though, are complex.  We don't categorize easily.  Social media might well create a sense of isolation in some and foster connectedness in others.  One person might derive real enjoyment from posting her vacation photos on Instagram; another might berate himself for how few "likes" he'd gotten.  There's also the problem of mistaking correlation for causation in all of these studies.  The people who report social media boosting their enjoyment might well be those who were well-adjusted to start with, for whom social media was simply another fun way to connect with friends and acquaintances; the people for whom it generates depression, anxiety, or addictive behavior could have had those tendencies beforehand, and the all-too-common desperation for "likes" simply made it all worse.  A paper in the journal Cyberpsychology, Behavior, and Social Networking back in 2014 admitted this up front:

During the past decade, online social networking has caused profound changes in the way people communicate and interact.  It is unclear, however, whether some of these changes may affect certain normal aspects of human behavior and cause psychiatric disorders.  Several studies have indicated that the prolonged use of social networking sites (SNS), such as Facebook, may be related to signs and symptoms of depression.  In addition, some authors have indicated that certain SNS activities might be associated with low self-esteem, especially in children and adolescents.  Other studies have presented opposite results in terms of positive impact of social networking on self-esteem.  The relationship between SNS use and mental problems to this day remains controversial, and research on this issue is faced with numerous challenges.
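
That last point -- that the arrow of causation is genuinely unclear -- is easy to demonstrate.  Here's a toy simulation (all numbers invented; no claim that real psychology works this way) in which a hidden factor drives both heavy social media use and depression scores, with no causal link at all between the two, and a healthy correlation pops out anyway:

import random

random.seed(42)

pairs = []
for _ in range(10_000):
    distress = random.gauss(0, 1)               # hidden confounder
    usage = distress + random.gauss(0, 1)       # hours on social media, say
    depression = distress + random.gauss(0, 1)  # symptom score
    pairs.append((usage, depression))

# Pearson correlation, computed by hand to keep the sketch dependency-free.
n = len(pairs)
mean_u = sum(u for u, _ in pairs) / n
mean_d = sum(d for _, d in pairs) / n
cov = sum((u - mean_u) * (d - mean_d) for u, d in pairs) / n
sd_u = (sum((u - mean_u) ** 2 for u, _ in pairs) / n) ** 0.5
sd_d = (sum((d - mean_d) ** 2 for _, d in pairs) / n) ** 0.5

print(f"correlation: {cov / (sd_u * sd_d):.2f}")  # about 0.5, with zero causation

A study that measured only usage and depression would see that correlation and could easily mistake it for cause and effect.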

So I'm always inclined to view research on social and psychological trends with a bit of a weather eye.  Well-conducted research into the workings of our own psychology and sociology can be fascinating, but humans are complicated beasts and confounding factors are legion.  The upshot of the social media studies for me can be summarized in a Marie Kondo-ism: "does it spark joy?"  If posting photos of your pets' latest antics on Instagram boosts your enjoyment, have at it.  If you like pretending to be a color commentator on Twitter while watching your favorite team play, go for it.  If it all makes you feel depressed, anxious, or alone, maybe it is time to put away the phone.

In any case, I'm going to wind this up, because I need to share the link to today's post on Facebook and Twitter.  My public awaits.  And if I don't post on time, my like-total for the day will be low, and we can't have that.

************************************

This week's Skeptophilia book-of-the-week is about our much maligned and poorly-understood cousins, the Neanderthals.

In Rebecca Wragg Sykes's new book Kindred: Neanderthal Life, Love, Death, and Art, we learn that these prehistoric relatives of Homo sapiens were far from the primitive, leopard-skin-wearing brutes depicted in movies and fiction.  They had culture -- they made amazingly evocative and sophisticated art, buried their dead with rituals we can still see traces of, and most likely had both music and language.  Interestingly, they interbred with more modern Homo sapiens over a long period of time -- DNA analysis of humans today shows that a great many of us (myself included) carry around significant numbers of Neanderthal genetic markers.

It's a revealing look at our nearest recent relatives, who were the dominant primate species in the northern parts of Eurasia for a hundred thousand years.  If you want to find out more about these mysterious hominins -- some of whom were our direct ancestors -- you need to read Sykes's book.  It's brilliant.

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]