Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Wednesday, May 3, 2017

Faith in the facts

I keep waiting for a day to go by in which someone in the Trump administration doesn't say something completely batshit insane.

The latest person to try to reach the summit of Mount Lunacy is Dr. Mark Green, nominee for Army Secretary, who apparently got his Ph.D. from Big Bob's Discount Diploma Warehouse.  Because besides such bizarre statements as "the government exists... to crush evil," particularly evil in the form of transgender people who are just looking for a quiet place to pee, Green has gone on record as saying that he not only doesn't accept evolution, he doesn't believe in...

... the Theory of Relativity.

In a speech that focused not on what he would do in his role as Army Secretary, but on The Universe According To Mark Green, he said, "The theory of relativity is a theory and some people accept it, but that requires somewhat of a degree of faith."

No.  No, no, no.  Faith is exactly what it doesn't take.  Although religious folks will probably disagree with me on this definition, faith is essentially believing in stuff for which you have no evidence; and as such, I've never really understood the distinction between "faith" and "delusion."  All that it takes to accept the Theory of Relativity is understanding the evidence that has been amassed in its favor.

[image courtesy of the Wikimedia Commons]

And at this point, the evidence is overwhelming.  Given its staggering conclusions -- weirdness like time dilation, the speed of light being the ultimate universal speed limit, and warped space -- it is understandable that after it was published, scientists wanted to make sure that Einstein was right.  So they immediately began designing experiments to test Einstein's theoretical predictions.

Needless to say, every single one of those experiments has confirmed that Einstein was 100% correct.  Every time there's been some sort of suspected glitch -- like six years ago, when physicists at CERN thought they had detected a faster-than-light neutrino -- it's turned out to be an experimental error or an uncontrolled variable.  At this point, the media should simply have a one-click method for punching in the headline "EINSTEIN VINDICATED AGAIN" whenever this sort of thing happens.

What is funniest about all of this is that the technology Green would be overseeing, as Army Secretary, includes SatNav guidance systems that use GPS coordinates -- which have to take relativistic effects into account.  If you decide that you "don't have enough faith" to accept relativity, your navigational systems will gradually drift out of sync with the Earth (i.e., with reality), and your multi-million-dollar tanks will end up driving directly off of cliffs.
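To put rough numbers on it (this is a back-of-the-envelope sketch using approximate orbital parameters, not a navigation-grade model): special relativity says the fast-moving satellite clocks run slow, general relativity says clocks higher in Earth's gravity well run fast, and the two effects net out to the satellite clocks gaining about 38 microseconds per day relative to clocks on the ground.  A few lines of Python show where that figure comes from:

import math

c   = 299_792_458.0      # speed of light, m/s
G   = 6.674e-11          # gravitational constant
M   = 5.972e24           # mass of the Earth, kg
R_e = 6.371e6            # radius of the Earth, m
r   = 2.6571e7           # GPS orbital radius (~20,200 km altitude), m

v = math.sqrt(G * M / r)                 # circular orbital speed, roughly 3.9 km/s

# Special relativity: the moving satellite clock runs SLOW relative to the ground.
sr_per_sec = -0.5 * (v / c) ** 2

# General relativity: a clock higher in Earth's gravity well runs FAST.
gr_per_sec = (G * M / c ** 2) * (1.0 / R_e - 1.0 / r)

day = 86_400
sr_us = sr_per_sec * day * 1e6
gr_us = gr_per_sec * day * 1e6
net_us = sr_us + gr_us
range_err_km = abs(net_us) * 1e-6 * c / 1000   # distance light travels in the accumulated error

print(f"Special-relativistic drift: {sr_us:+.1f} microseconds/day")
print(f"General-relativistic drift: {gr_us:+.1f} microseconds/day")
print(f"Net drift: {net_us:+.1f} microseconds/day (~{range_err_km:.0f} km of ranging error)")

Thirty-eight microseconds doesn't sound like much until you multiply by the speed of light: left uncorrected, the error would accumulate at something like ten kilometers of position drift per day, which is exactly the "out of sync with reality" problem described above.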

So you need exactly zero faith to accept relativity.  Or evolution, or cosmology, or plate tectonics, or radioisotope dating, or any of the other scientifically sound models that Green and his ilk tend to jettison.  All you need to do is to take the time to learn some science.  What does take faith, however, is accepting that anyone who has as little knowledge of the real world as Mark Green does has any business running an entire branch of the military.

Anyhow, there you have it: our "alternative fact" of the day.  It's almost as good as the "alternative fact" of the day before, which came straight from Dear Leader Trump, to wit: Andrew Jackson was a good guy with a "big heart" who "was really angry about what he saw happening with the Civil War."  Oh, and the Civil War could "have all been worked out," and that "people don't ask the question" about why the Civil War started.

Except, of course, for the thousands of historians who have been writing about the causes of the Civil War for decades.  And Andrew "Big Heart" Jackson was responsible for the forced deportation of fifteen thousand Native Americans from their ancestral homes, in one of the biggest forced relocations ever perpetrated, and in which a quarter of them died of disease, starvation, and exposure.

Oh, yeah, and I don't think Jackson was particularly angry about the Civil War, given that he died sixteen years before it started.

So it'd be nice if our leaders would stop saying things that turn the United States into a world-wide laughingstock.  I'm planning on going to Ecuador this summer, and I'd really like it if I don't have to tell the Ecuadorians I meet that just because I'm an American doesn't mean I'm an ignorant, raving loon.  Thank you.

Tuesday, May 2, 2017

Aesthetic synchrony

Probably most of you have had the fortunate experience of being in a situation where you were completely engaged in what you were doing.  This can be especially powerful when you are being given the chance to experience something novel -- listening to a lecture by a truly masterful speaker, attending a performance of music or theater, visiting a place of great natural beauty -- when you are having what writer Sir Ken Robinson (speaking of masterful lecturers) calls in his talk "Changing Education Paradigms" "an aesthetic experience, when your senses are operating at their peak, when you're present in the current moment, when you're resonating with the excitement of this thing you're experiencing, when you are fully alive."

When this happens, we often say we are "on the same wavelength" with others who are sharing the experience with us.   And now, a team led by Suzanne Dikker of New York University has shown that this idiom might literally be true.

Dikker's team had thirteen test subjects -- twelve high school students and their teacher -- wear portable electroencephalogram headsets for an entire semester of biology classes.  Naturally, some of the topics and activities were more engaging than others, and the researchers had students self-report daily on such factors as how focused they were, how much they enjoyed their teacher's presentation, how much they enjoyed the students they interacted with, and their satisfaction levels about the activities they were asked to take part in.

[image courtesy of the Wikimedia Commons]

Dikker et al. write:
The human brain has evolved for group living.  Yet we know so little about how it supports dynamic group interactions that the study of real-world social exchanges has been dubbed the "dark matter of social neuroscience."  Recently, various studies have begun to approach this question by comparing brain responses of multiple individuals during a variety of (semi-naturalistic) tasks. These experiments reveal how stimulus properties, individual differences, and contextual factors may underpin similarities and differences in neural activity across people...  Here we extend such experimentation drastically, beyond dyads and beyond laboratory walls, to identify neural markers of group engagement during dynamic real-world group interactions.  We used portable electroencephalogram (EEG) to simultaneously record brain activity from a class of 12 high school students over the course of a semester (11 classes) during regular classroom activities.  A novel analysis technique to assess group-based neural coherence demonstrates that the extent to which brain activity is synchronized across students predicts both student class engagement and social dynamics.  This suggests that brain-to-brain synchrony is a possible neural marker for dynamic social interactions, likely driven by shared attention mechanisms.  This study validates a promising new method to investigate the neuroscience of group interactions in ecologically natural settings.
Put simply, what the researchers found is that when the students reported feeling the most engaged, their brain activity actually synced with that of their classmates.  It squares with our subjective experience, doesn't it?  I know when I'm bored, irritated, or angered by something I'm being required to participate in, I tend to unhook my awareness from where I am -- including being less aware of those around me who are suffering through the same thing.

It's no wonder we call this kind of response "disengaging," is it?
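If you want to see what "synchrony" might look like as an actual number, here is a deliberately crude toy sketch (my own illustration, not Dikker's analysis technique, which is far more sophisticated): treat each student's EEG as a time series, and take the average pairwise correlation across the whole class as a rough index of how much everyone's brain activity is rising and falling together.

import numpy as np

rng = np.random.default_rng(0)
n_students, n_samples = 12, 5000
shared = rng.standard_normal(n_samples)   # a common "engaging lesson" signal

def simulate_class(engagement):
    """Each simulated student's signal = engagement * shared signal + individual noise."""
    noise = rng.standard_normal((n_students, n_samples))
    return engagement * shared + noise

def group_synchrony(signals):
    """Mean pairwise Pearson correlation across all students."""
    corr = np.corrcoef(signals)
    return corr[np.triu_indices(n_students, k=1)].mean()

print(f"Low engagement:  {group_synchrony(simulate_class(0.1)):.3f}")
print(f"High engagement: {group_synchrony(simulate_class(0.8)):.3f}")

The more each simulated student's signal is driven by the shared component rather than by individual noise, the higher the group correlation comes out -- which is the basic intuition behind using brain-to-brain synchrony as a proxy for shared attention.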

So apparently misery doesn't love company; what loves company is engagement, appreciation, and a sense of belonging.  "The central hub seems to be attention," Dikker says.  "But whatever determines how attentive you are can stem from various sources from personality to state of mind.  So the picture that seems to emerge is that it's not just that we pay attention to the world around us; it's also what our social personalities are, and who we're with."

All the more reason we teachers should focus as much on getting our students hooked on learning as we do on the actual content of the course.  My experience is that if you can get students to "buy in" -- if (in my case) they come away thinking biology is cool, fun, and interesting -- it doesn't matter so much if they can't remember what ribosomes do.  They can fit the facts in later, these days with a thirty-second lookup on Wikipedia.

What can't be looked up is being engaged to the point that you care what ribosomes do.

Unfortunately, in the educational world we've tended to go the other direction.  The flavor of the month is micromanagement from the top down, a set syllabus full of factlets that each student must know, an end product that can fit on a bubble sheet, "quantifiable outcomes" that generate data that the b-b stackers in the Department of Education can use to see if our teachers are teaching and our students learning.  A pity that, as usual, the people who run the business of educating children are ignoring what the research says -- that the most fundamental piece of the puzzle is student engagement.

If you have that, everything else will follow.

Monday, May 1, 2017

Poker face

A wag once said, "Artificial intelligence is twenty years in the future, and always will be."  It's a trenchant remark; predictions about when we'd have computers that could truly think have been off the mark ever since scientists at the Dartmouth Summer Research Project on Artificial Intelligence stated that they would have the problem cracked in a few months...

... back in 1956.

Still, progress has been made.  We now have software that learns from its mistakes, can beat grandmasters at strategy games like chess, checkers, and Go, and has come damn close to passing the Turing test.  But emulating human intelligence in a machine has proven far more difficult than anyone anticipated back when the first computers were built in the 1940s and 1950s.

We've taken a new stride recently, however.  Just a couple of months ago, researchers at the University of Alberta announced that they had created software that could beat human champions at Texas Hold 'Em, a variant of poker.  Why this is remarkable -- and more of a feat than a computer that can win at chess -- is that all previous game-playing software involved games of perfect information, in which both players can see the entire state of the game.  In poker, there is hidden information.  Not only that, but a good poker player needs to know how to bluff.

In other words... lie.


Michael Bowling, who led the team at the University of Alberta, said that this turned out to be a real challenge.  "These poker situations are not simple," Bowling said.  "They actually involve asking, 'What do I believe about my opponent’s cards?'"

But the program, called DeepStack, turned out to be quite good at this, despite the daunting fact that in Texas Hold 'Em there are about 10^160 decision points -- more unique scenarios than there are atoms in the universe.  But instead of analyzing all the possibilities, as a program might do in chess (such an approach in this situation would be, for all practical purposes, impossible), DeepStack plays much like a person would -- by speculating on the likelihood of certain outcomes based on the limited information it has.
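To make the hidden-information point concrete, here's a toy sketch -- emphatically not DeepStack's actual algorithm, just the bare-bones idea of reasoning probabilistically about what you can't see -- using a drastically simplified game: each player holds one concealed card, high card wins the pot, and the program decides whether to call a bet by sampling what the opponent might be holding.

import random

DECK = list(range(2, 15)) * 4   # card ranks 2 through 14 (ace high), four suits

def win_probability(my_card, n_samples=10_000):
    """Estimate P(win) by sampling the opponent's hidden card from the rest of the deck."""
    remaining = DECK.copy()
    remaining.remove(my_card)
    wins = 0.0
    for _ in range(n_samples):
        opponent = random.choice(remaining)
        if my_card > opponent:
            wins += 1
        elif my_card == opponent:
            wins += 0.5          # split the pot on a tie
    return wins / n_samples

def decide(my_card, pot, cost_to_call):
    """Call if the expected value of calling beats folding (folding has an EV of zero)."""
    p = win_probability(my_card)
    ev_call = p * pot - (1 - p) * cost_to_call
    return ("call" if ev_call > 0 else "fold"), round(ev_call, 2)

print(decide(my_card=13, pot=10, cost_to_call=4))   # holding a king: nearly always call
print(decide(my_card=3, pot=10, cost_to_call=4))    # holding a three: fold

Even in this stripped-down game the program has to guess at cards it cannot see; scaling that style of reasoning up to the 10^160 decision points of Texas Hold 'Em, and adding the need to bluff convincingly, is what makes DeepStack's achievement remarkable.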

"It will do its thinking on the fly while it is playing," Bowling said.  "It can actually generalize situations it's never seen before."

Which is pretty amazing.  But not everyone is as impressed as I am.

When Skeptophilia frequent flier Rick Wiles, of End Times radio, heard about DeepStack, he was appalled that we now had a computer that could deceive. "I'm still thinking about programming robots to lie," Wiles said.  "This has been done to us for the past thirty, forty, fifty years -- Deep State has deliberately lied to the public because they concluded that it was in our best interest not to be told the truth...  What's even scarier about the robots that can lie is that they weren't programmed to lie, they learned to lie.  Who's the father of all lies?  Satan is the father of all lies.  Are we going to have demon-possessed artificially intelligent robots?  Is it possible to have demonic spirit to possess an artificial intelligent machine?  Can they possess idols?  Can they inhabit places?  Yeah.  Absolutely.  They can take possession of animals.  They can attach themselves to inanimate objects.  If you have a machine that is capable of lying, then it has to be connected to Lucifer.  Now we’re back to the global brain.  This is where they’re going.  They’re building a global brain that will embody Lucifer’s mind and so Lucifer will be deceiving people through the global brain."

So there's that.  But the ironic thing is that, all demonic spirit bullshit aside, Wiles may not be so far wrong.  While I think the development of artificial intelligence is fascinating, and I can understand why researchers find it compelling, you have to worry what our creations might think about us once they do reach sentience.  This goes double if you can no longer be sure that what the computer is telling you is the truth.

Maybe what we should be worried about is not a computer that can pass the Turing test; it's one that can pass the Turing test -- and chooses to pretend, for its own reasons, that it can't.

I mean, the last thing I want is to go on record as saying I agree with Rick Wiles on anything.  But still.

So that's our rather alarming news for the day.  It's not that I think we're headed into The Matrix any time soon; but the idea that we might be supplanted by intelligent machines of our own making, the subject of countless science fiction stories, may not be impossible after all.

And maybe the artificial intelligence of twenty years in the future may not be as far away as we thought.

Saturday, April 29, 2017

Awoo

Yesterday I was asked by one of my Critical Thinking students if I'd ever heard of Florida Swamp Apes.  After a brief moment in which I wondered if he were asking about a sports team, I answered in the negative.  He brought out his cellphone, on which he had downloaded an admittedly creepy image, which I include below:



Imagine my surprise when I found out that there's a whole site devoted to this odd beast, also called the "Florida Skunk Ape" for its strong smell.  Considered to be the "southernmost Bigfoot species in the United States," the Florida Skunk Ape has been sighted all over southern Florida, but most commonly in the Everglades region.

As with most of these alleged animals, the claims of sightings are numerous and varied, and the hard evidence essentially non-existent.  There are a lot of photographs, but to borrow a line from the astronomer Neil deGrasse Tyson, there probably is an "Add Bigfoot" button in Photoshop, so we shouldn't consider the photographic evidence to be evidence at all.  Also on the website is an audio clip of a Skunk Ape's howls, which to my ear sounded more like a distant dog, or possibly a guy going "Awoo."  We also have an interview with Dave Shealy, who seems to be one of the people responsible for the whole Skunk Ape phenomenon (he is the director of the Skunk Ape Research Center of Ochopee, Florida, open 7 AM to 7 PM, admission $5, which I am definitely going to visit next time I'm in Florida).  Lastly, we are informed that Skulls Unlimited, a company which sells a virtually unlimited number of skulls (thus the name), is now offering resin models of Bigfoot skulls.  One has to wonder what they cast the mold from, but in the field of cryptozoology it is sometimes best not to ask too many questions.

I thought I had heard of most of the cryptozoological claims from the United States, but this one was new to me.  Of course, the Sasquatch of the Pacific Northwest is so familiar by now as to elicit yawns, and many of us know of the Boggy Creek Monster of Fouke, Arkansas, which generated not one, nor two, but five truly dreadful movies.  There's Mothman and the Flatwoods Monster in West Virginia, the Dover Demon of Massachusetts, the Enfield Monster of Illinois, Goatman of Maryland, and dozens of others.  But the Skunk Ape is one I'd never heard of before, and I'm finding myself wondering how I missed it.  It did cross my mind briefly that perhaps the Skunk Ape sightings were merely elderly Bigfoots from the north who had moved to Florida when they retired, but apparently this is incorrect, as one site talks about a sighting of a "young and vigorous animal, probably an adolescent" and another refers to "Skunk Ape mating season"  (May, if you're curious; but you might want to refrain from wandering around the swamps of Florida in May, because two female European tourists tell the story of being chased by a "huge male Skunk Ape with an erection."  They got away, fortunately.)

"Not everyone who sees a Skunk Ape reports it," Dave Shealy says.  "They don't want people to poke fun at 'em, or to tell 'em they're crazy. That's not the exception; that's pretty much the rule...  There's never been a documented case of anyone ever being physically attacked by a Skunk Ape.  But also, there's a lot of people that go into the Everglades that never come out." 

Which really isn't all that reassuring.

In any case, the Florida Skunk Ape gives us yet another line in the ledger of Extraordinary Claims Requiring Extraordinary Evidence Of Which There Seems To Be None.  It's just as well, because it's the last week of April, so Skunk Ape mating season is almost upon us, and if there really was evidence that this thing exists I would feel duty-bound to go investigate, and the last thing I want is to be chased around in some godforsaken swamp by a Bigfoot with a boner.  So I think I'll give this one a pass.  

Friday, April 28, 2017

Playing on the heartstrings

I'm a pretty emotional guy, and one of the things that never fails to get me is music.  Among the musical moments that always grab me by the feels and swing me around, sometimes to the point of tears, are:
Then, there are the ones that send chills up my spine.  A few of those:
I've always been fascinated by this capacity for music to induce emotion.  Such a response is nearly universal, although which music causes tears or that little frisson up the spine varies greatly from person to person.  Most of Mozart's music (with the exception of the Requiem and a couple of other pieces) really doesn't do much for me.  It's pleasant to listen to, but doesn't evoke much in me other than that.  I actively dislike Chopin, Brahms, and Mahler, and I know people for whom those are the absolute pinnacle of emotional depth in music.

[image courtesy of the Wikimedia Commons]

In a paper released just last week in the Nature journal Scientific Reports, researchers Kazuma Mori and Makoto Iwanaga of Osaka University looked into an explanation for how this phenomenon happens, if not exactly why it happens.  Their paper, "Two Types of Peak Emotional Responses to Music: The Psychophysiology of Chills and Tears," describes experiments in which they had test subjects listen to music while monitoring their reactions not only via subjective description but by such physiological criteria as skin conductance (a common measure of arousal).

And what happened was pretty cool.  They found that (as I have done above) strongly evocative pieces of music tended to fall into two categories, ones that elicit tears and ones that elicit chills.  The authors write:
The present study investigated the psychophysiological responses of two types of peak emotions: chills and tears.  We used music as the stimuli because the chills response has been confirmed in music and emotion studies... The chills and tears responses were measured by self-report sensations during song listening.  We conducted an experiment measuring subjective emotions and autonomic nervous system activity.  The hypothesis was that tears would be different from chills in terms of both psychological and physiological responses.  With respect to psychophysiological responses, we predicted that chills would induce subjective pleasure, subjective arousal, and physiological arousal whereas tears would induce subjective pleasure, relaxation, and physiological calming.  In addition, we asked participants to rate song expression in terms of happiness, sadness, calm, and fear in order to understand the emotional property of chills-inducing songs and tear-inducing songs...  [The] results show that tears involve pleasure from sadness and that they are psychophysiologically calming; thus, psychophysiological responses permit the distinction between chills and tears.  Because tears may have a cathartic effect, the functional significance of chills and tears seems to be different.
Which supports the contention that my experience of bawling the first time I listened to Ralph Vaughan Williams's Fantasia on a Theme by Thomas Tallis served the purpose of emotional catharsis.  I know my mood was better after the last chords died out, even if I felt a little like a wrung-out dishrag; and despite the fact that I don't exactly like crying, I listen to these tear-evoking pieces of music over and over.  So there must be something there I'm seeking, and I don't think it's pure masochism.  The authors write:
The current results found that the mixed emotion of chills was simultaneous pleasure, happiness, and sadness.  This finding means that chills provide mainly a positive experience but the sadness factor is necessary even though a favourite song is the elicitor.  Given that music chills activate reward-related brain regions, such an emotional property could make chills a unique experience and separate chills from other mixed emotional experiences.  Furthermore, as the mixed emotion of tears was simultaneous pleasure and sadness, it was different from the mixed emotion of chills.  The tears response contributes to the understanding of the pleasure of sad music.  As people generally feel displeasure for sad things, this is a unique mixed emotional response with regard to music.  Although previous studies showed that sad music induced relatively weak pleasure, the current tears’ results showed that sad songs induced strong pleasure.  It is difficult to account for why people feel sad music as pleasurable; however, the current results suggested that the benefit of cathartic tears might have a key role in the pleasure generated by sad music.  Therefore, the two types of peak emotional responses may uniquely support knowledge of mixed emotion.
So that's pretty awesome, and it's nice to know that I'm not alone in my sometimes overwhelming response to music.  And now I think I'll go listen to Shostakovich's Symphony #5 and have a nice long cry.  I know I'll feel better afterwards.

Thursday, April 27, 2017

Going to the dogs

I am the proud owner of two dogs, both rescues, who are at this point basically members of the family whose contributions to the household consist of barking at the UPS guy, sleeping most of the day, getting hair all over everything, and making sure that we get our money's worth out of the carpet steamer we bought five years ago.

First, there's Lena the Wonder-Hound:


And her comical sidekick, Grendel:


Both of them are sweet and affectionate and spoiled absolutely rotten.  Lena's ancestry is pretty clear -- she's 100% hound, probably mostly Bluetick Coonhound, Redbone, and Beagle -- but Grendel's a bit of a mystery.  Besides his square face and coloration, other significant features are: (1) a curly tail; (2) a thick undercoat; and (3) a tendency to snore.  This last has made us wonder if he has some Pug or Bulldog in his background somewhere, but that's only speculation.

This all comes up because of a recent delightful study in one of my favorite fields, cladistics.  The idea of cladistics is to create a tree of descent for groups of species based on most recent common ancestry, as discerned from overlap in DNA sequences.  And a group of researchers -- Heidi G. Parker, Dayna L. Dreger, Maud Rimbault, Brian W. Davis, Alexandra B. Mullen, Gretchen Carpintero-Ramirez, and Elaine A. Ostrander of the Comparative Genomics Branch of the National Human Genome Research Institute -- have done this for 161 breeds of dog.

The authors write:
The cladogram of 161 breeds presented here represents the most diverse dataset of domestic dog breeds analyzed to date, displaying 23 well-supported clades of breeds representing breed types that existed before the advent of breed clubs and registries.  While the addition of more rare or niche breeds will produce a denser tree, the results here address many unanswered questions regarding the origins of breeds.  We show that many traits such as herding, coursing, and intimidating size, which are associated with specific canine occupations, have likely been developed more than once in different geographical locales during the history of modern dog.  These data also show that extensive haplotype sharing across clades is a likely indicator of recent admixture that took place in the time since the advent of breed registries, thus leading to the creation of most of the modern breeds.  However, the primary breed types were developed well before this time, indicating selection and segregation of dog populations in the absence of formal breed recognition.  Breed prototypes have been forming through selective pressures since ancient times depending on the job they were most required to perform.  A second round of hybridization and selection has been applied within the last 200 years to create the many unique combinations of traits that modern breeds display.  By combining genetic distance relationships with patterns of haplotype sharing, we can now elucidate the complex makeup of modern dogs breeds and guide the search for genetic variants important to canine breed development, morphology, behavior, and disease.
Which is pretty cool.  What I found most interesting about the cladogram (which you can see for yourself if you go to the link provided above) is that breeds that are often clustered together, and known by the same common name -- such as "terrier" -- aren't necessarily closely related.  This shouldn't be a surprise, of course; all you have to do is look at the relationships between birds called "buntings" or "sparrows" or "tanagers" to realize that common names tell you diddly-squat about actual genetic distance.  But it was still surprising to find that (for example) Bull Terriers and Staffordshire Terriers are more closely related to Bulldogs and Mastiffs than they are to (for example) Scottish Terriers; that Corgis are actually related to Greyhounds; and that Schnauzers, Pugs, Pomeranians, and Schipperkes are all on the same clade.  The outgroup (most distantly related branch) of the entire clade is the peculiar Basenji, a Central African breed with a strange, yodel-like bark, a curly tail, and pointed ears, whose image has been recorded almost unchanged all the way back to the time of the ancient Egyptians.
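For anyone curious about the nuts and bolts, here is a minimal sketch of the general idea behind a distance-based tree -- a toy stand-in for illustration only, since the actual study combined genetic distance with haplotype-sharing patterns across 161 breeds.  The pairwise distances below are invented, chosen simply to echo the groupings mentioned above.

import numpy as np
from scipy.cluster.hierarchy import dendrogram, linkage
from scipy.spatial.distance import squareform

breeds = ["Bull Terrier", "Bulldog", "Mastiff", "Scottish Terrier", "Greyhound", "Corgi"]

# Hypothetical pairwise genetic distances (symmetric, zeros on the diagonal).
D = np.array([
    [0.00, 0.10, 0.12, 0.30, 0.35, 0.33],
    [0.10, 0.00, 0.08, 0.31, 0.34, 0.32],
    [0.12, 0.08, 0.00, 0.29, 0.33, 0.31],
    [0.30, 0.31, 0.29, 0.00, 0.28, 0.27],
    [0.35, 0.34, 0.33, 0.28, 0.00, 0.15],
    [0.33, 0.32, 0.31, 0.27, 0.15, 0.00],
])

# Average-linkage clustering on the condensed distance matrix builds the tree.
tree = linkage(squareform(D), method="average")

# Recover the leaf ordering without plotting anything.
leaves = dendrogram(tree, labels=breeds, no_plot=True)["ivl"]
print(" | ".join(leaves))

With these made-up numbers, Bull Terrier ends up clustered with Bulldog and Mastiff rather than with Scottish Terrier, and Corgi ends up next to Greyhound, mimicking the kind of counterintuitive grouping the real cladogram reveals; the actual analysis, of course, derives its distances from genome-wide marker data rather than from a hand-typed matrix.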

Anyhow, it's an elegant bit of research, and sure to be of interest to any other dog owners in the studio audience.  Me, I'm wondering where Grendel fits into the cladogram.  Considering his peculiar set of traits, he might have a branch all his own, and give the Basenji a run for its money as the oddest breed out there.

Wednesday, April 26, 2017

In your right mind

Another peculiarity of the human brain is lateralization, which is the tendency of the brain to have a dominant side.  It's most clearly reflected in hand dominance; because of the cross-wiring of the brain, people who are right-handed have a tendency to be left brain dominant, and vice versa.  (There's more to it than that, as some people who are right-handed are, for example, left eye dominant, but handedness is the most familiar manifestation of brain lateralization.)

It bears mention at this juncture that the common folk wisdom that brain lateralization has an influence on your personality -- that, for instance, left brain dominant people are sequential, mathematical, and logical, and right brain dominant people are creative, artistic, and holistic -- is complete nonsense.  That myth has been around for a long while, and has been roundly debunked, but still persists for some reason.

I first was introduced to the concept of brain dominance when I was in eighth grade.  I was having some difficulty reading, and my English teacher, Mrs. Gates, told me she thought I was mixed-brain dominant -- that I didn't have a strongly lateralized brain -- and that this often leads to processing disorders like dyslexia.  (She was right, but they still don't know why that connection exists.)  It made sense.  When I was in kindergarten, I switched back and forth between writing with my right and left hand about five times until my teacher got fed up and told me to simmer down and pick one.  I picked my right hand, and have stuck with it ever since, but I still have a lot of lefty characteristics.  I tend to pick up a drinking glass with my left hand, and I'm strongly left eye dominant, for example.

Anyhow, Mrs. Gates identified my mixed-brain dominance and its effect on my reading, but she also told me that there was one thing that mixed-brain people can learn faster than anyone else.  Because of our nearly-equal control from both sides of the brain, we can do a cool thing, which Mrs. Gates taught me and I learned in fifteen seconds flat.  I can write, in cursive, forward with my right hand while I'm writing the same thing backwards with my left.  (Because it's me, they're both pretty illegible, but it's still kind of a fun party trick.)

[image courtesy of the Wikimedia Commons]

Fast forward to today.  Some recent research has begun to elucidate the evolutionary reasons behind lateralization.  It's been known for years that lots of animals are lateralized, so it stands to reason that it must confer some kind of evolutionary advantage, but what that might be was unclear... until now.

Research by a team led by Onur Güntürkün, of the Institute of Cognitive Neuroscience at Ruhr-University Bochum, in Germany, has looked at lateralization in animals from cockatoos to zebra fish to humans, and has described the possible evolutionary rationale for having a dominant side of the brain.

"What you do with your hands is a miracle of biological evolution," Güntürkün says.  "We are the master of our hands, and by funneling this training to one hemisphere of our brains, we can become more proficient at that kind of dexterity.  Natural selection likely provided an advantage that resulted in a proportion of the population -- about 10% -- favoring the opposite hand. The thing that connects the two is parallel processing, which enables us to do two things that use different parts of the brain at the same time."

Additionally, Güntürkün says, our perceptual systems have also evolved that kind of division of labor.  Both left and right brain have visual recognition centers, but in humans the one on the right side is more devoted to image recognition, and the one on the left to word and symbol recognition.  And this is apparently a very old evolutionary innovation, long predating our use of language; even pigeons have a split perceptual function between the two sides of the brain (and therefore between their eyes).  They tend to tilt their heads so their left eye is scanning the ground for food while their right one scans the sky for predators.

So what might seem to be a bad idea -- ceding more control to one side of the brain than the other, making one hand more nimble than the other -- turns out to have a distinct advantage.  And if you'll indulge me in a little bit of linguistics geekery, for good measure, even our word "dexterous" reflects this phenomenon.  "Dexter" is Latin for "right," and reflects the commonness of right-handers, who were considered to be more skillful.  (And when you find out that the Latin word for "left" is "sinister," you get a rather unfortunate lens into attitudes toward southpaws.)

Anyhow, there you have it; another interesting feature of our brain physiology explained, and one that has a lot of potential for increasing our understanding of neural development.  "Studying asymmetry can provide the most basic blueprints for how the brain is organized," Güntürkün says.  "It gives us an unprecedented window into the wiring of the early, developing brain that ultimately determines the fate of the adult brain.  Because asymmetry is not limited to human brains, a number of animal models have emerged that can help unravel both the genetic and epigenetic foundations for the phenomenon of lateralization."