Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Wednesday, May 5, 2021

Memory boost

There's one incorrect claim that came up in my biology classes more than any other, and that's the old idea that "humans only use 10% of their brain."  Or 5%.  Or 2%.  Often bolstered by the additional claim that Einstein is the one who said it.  Or Stephen Hawking.  Or Nikola Tesla.

Or maybe all three of 'em at once, I dunno.

The problem is, there's no truth to any of it, and no evidence that the claim originated with anyone remotely famous.  That at present we understand only about 10% of what the brain is doing -- that I can believe.  That we're using less than 100% of our brain at any given time -- of course.

But the idea that evolution has provided us with these gigantic processing units, which (according to a 2002 study by Marcus Raichle and Debra Gusnard) consume 20% of our oxygen and caloric intake, and that we then only ever access 10% of their power -- nope, not buying that.  Such a waste of resources would be a significant evolutionary disadvantage, and low-brain-use individuals would have been weeded out long ago.  (It's sufficient to look at some members of Congress to demonstrate that the weeding out, at least, didn't happen.)

But at least it means we may escape the fate of the world in Idiocracy.

And speaking of movies, the 2014 cinematic flop Lucy didn't help matters.  It features a woman who is poisoned with a synthetic drug that ramps her brain up from its former 10% usage rate to... *gasp*... 100%, leaving her with telekinesis and the ability to "disappear within the space/time continuum."

Whatever the fuck that even means.

All urban legends and goofy movies aside, the actual memory capacity of the brain is still the subject of contention in the field of neuroscience.  And for us dilettante science geeks, it's a matter of considerable curiosity.  I know I have often wondered how I can manage to remember the scientific names of obscure plants, the names of distant ancestors, and melodies I heard fifteen years ago, but I routinely have to return to rooms two or three times because I keep forgetting what I went there for.

So I found it exciting to read about a study in the journal eLife, by Terry Sejnowski (of the Salk Institute for Biological Studies), Kristen Harris (of the University of Texas at Austin), et al., entitled "Nanoconnectomic Upper Bound on the Variability of Synaptic Plasticity."  Put more simply, what the team found was that human memory capacity is ten times greater than previously estimated.

In computer terms, our storage ability amounts to one petabyte.  And put even more simply for non-computer types, this translates roughly into "a shitload of storage."
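To get a feel for just how big that is, here's a quick back-of-the-envelope conversion (a sketch only -- the ~5 GB figure for a typical HD movie is my own rough assumption, not from the study):

```python
# Rough scale of one petabyte, using decimal (SI) units.
PETABYTE = 10**15            # bytes in one petabyte
GIGABYTE = 10**9             # bytes in one gigabyte
MOVIE_SIZE = 5 * GIGABYTE    # assumed size of a single HD movie, ~5 GB

movies = PETABYTE // MOVIE_SIZE
print(f"One petabyte holds roughly {movies:,} HD movies")
```

Which works out to somewhere around 200,000 movies' worth of storage -- "a shitload" indeed.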

"This is a real bombshell in the field of neuroscience," Sejnowski said.  "We discovered the key to unlocking the design principle for how hippocampal neurons function with low energy but high computation power.  Our new measurements of the brain's memory capacity increase conservative estimates by a factor of 10 to at least a petabyte, in the same ballpark as the World Wide Web."

The discovery hinges on the fact that there is a hierarchy of size in our synapses.  The brain ramps up or down the size scale as needed, resulting in a dramatic increase in our neuroplasticity -- our ability to learn.

"We had often wondered how the remarkable precision of the brain can come out of such unreliable synapses," said team member Tom Bartol.  "One answer is in the constant adjustment of synapses, averaging out their success and failure rates over time...  For the smallest synapses, about 1,500 events cause a change in their size/ability and for the largest synapses, only a couple hundred signaling events cause a change.  This means that every 2 or 20 minutes, your synapses are going up or down to the next size.  The synapses are adjusting themselves according to the signals they receive."

"The implications of what we found are far-reaching," Sejnowski added.  "Hidden under the apparent chaos and messiness of the brain is an underlying precision to the size and shapes of synapses that was hidden from us."

And the most mind-blowing thing of all is that all of this precision and storage capacity runs on about twenty watts of power -- less than most incandescent light bulbs.

Consider the possibility of applying what scientists have learned about the brain to modeling neural nets in computers.  It brings us one step closer to something neuroscientists have speculated about for years -- the possibility of emulating the human mind in a machine.

"This trick of the brain absolutely points to a way to design better computers," Sejnowski said.  "Using probabilistic transmission turns out to be as accurate and require much less energy for both computers and brains."

Which is thrilling and a little scary, considering what happened when HAL 9000 in 2001: A Space Odyssey basically went batshit crazy halfway through the movie.

That's a risk that I, for one, am willing to take, even if it means that I might end up getting turned into a Giant Space Baby.

But I digress.

In any case, the whole thing is pretty exciting, and it's reassuring to know that the memory capacity of my brain is way bigger than I thought it was.  Although it still leaves open the question of why, with a petabyte of storage, I still can't remember where I put my car keys.


Ever get frustrated by scientists making statements like "It's not possible to emulate a human mind inside a computer," "faster-than-light travel is fundamentally impossible," or "time travel into the past will never be achieved"?

Take a look at physicist Chiara Marletto's The Science of Can and Can't: A Physicist's Journey Through the Land of Counterfactuals.  In this ambitious, far-reaching new book, Marletto looks at the phrase "this isn't possible" as a challenge -- and perhaps, a way of opening up new realms of scientific endeavor.

Each chapter looks at a different open problem in physics, and considers what we currently know about it -- and, more importantly, what we don't know.  With each one, she looks into the future, speculating about how each might be resolved, and what those resolutions would imply for human knowledge.

It's a challenging, fascinating, often mind-boggling book, well worth a read for anyone interested in the edges of scientific knowledge.  Find out why eminent physicist Lee Smolin calls it "Hugely ambitious... essential reading for anyone concerned with the future of physics."

[Note: if you purchase this book using the image/link below, part of the proceeds goes to support Skeptophilia!]
