Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.
Showing posts with label experts. Show all posts

Friday, October 20, 2023

Internet expertise

What is it with people trusting random laypeople over experts?

Okay, yeah, I know experts can be wrong, and credentials are not an unshakable guarantee that the person in question knows what they're talking about.  But still.  Why is it so hard to accept that an actual scientist, who has a Ph.D. in the field and has done real research, in general will know more about the topic than some dude on the internet?

The topic comes up because of a conversation I had with my athletic trainer yesterday.  He is pretty knowledgeable about all things fitness-related -- so while he's not a researcher or a scientist (something he'd tell you up front), he's certainly Better Than The Average Bear.  And he ran into an especially ridiculous example of the aforementioned phenomenon, which he was itching to show me as soon as I got there.

Without further ado, we have: the woman who thinks that the amino acid L-glutamine is so named because it is important for developing your glutes:

And of course, it must be right because she heard it from "the TikTok Fitness Girls, and they don't lie."

The whole thing reminds me of something I heard every damn year from students, which is that the ingredient sodium erythorbate in hotdogs and other processed meat products is made from ground-up earthworms, because "earthworm" and "erythorbate" sound a little bit alike.  (Actually, sodium erythorbate is an antioxidant that is chemically related to vitamin C, and is added to meat products as a preservative and antibacterial agent.)

But to return to the broader point, why is it so hard to accept that people who have studied a subject actually... know a lot about the subject?  Instead, people trust shit like:

And I feel obliged to make my usual disclaimer that I am not making any of the above up.

I wonder how much of this attitude, especially here in the United States, comes from the egalitarian mindset being misapplied -- that "everyone should have the same basic rights" spills over into "everyone's opinion is equally valid."  I recall back when George W. Bush was running for president, there was a significant slice of voters who liked him because he came across as a "regular guy -- someone you could sit down and have a beer with."  He wasn't an "intellectual elite" (heaven knows that much was true enough).  

And I remember reacting to that with sheer bafflement.  Hell, I know I'm not smart enough to be president.  I want someone way more intelligent than I am to be running the country.  Why is "Vote Bush -- He's Just As Dumb As You Are" considered some kind of reasonable campaign slogan?

I think the same thing is going on here -- people hear about the new health miracle from Some Guy Online, and it sounds vaguely plausible, so they give more credence to him than they do to an actual expert (who uses big complicated words and doesn't necessarily give you easy solutions to your health problems).  If you don't have a background in biological science yourself, maybe it sounds like it might work, so you figure you'll give it a try.  After that, wishful thinking and the placebo effect do the rest of the heavy lifting, and pretty soon you're naked in the park sunning your nether orifice.

There's a willful part of this, though.  There comes a point where it crosses the line from simple ignorance into actual stupidity.  To go back to my original example, a thirty-second Google search would tell you that L-glutamine has nothing to do with your glutes.  (In fact, the two words don't come from the same root, even though they sound alike; glutamine comes from the Latin gluten, meaning "glue," and glutes comes from the Greek γλουτός, meaning "buttock.")  To believe that L-glutamine will develop your glutes because the TikTok Fitness Girls say so, you need to be not only (1) ignorant, but (2) gullible, and (3) uninterested in learning any better.

And that, I find incomprehensible.

I'll end with the famous quote from Isaac Asimov, which seems to sum up the whole bizarre thing about as well as anyone could: "There is a cult of ignorance in the United States, and there always has been.  The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that 'my ignorance is just as good as your knowledge.'"

****************************************



Thursday, August 30, 2018

Going to the source

One of the hardest things for skeptics to fight is the tendency of some people to swallow any damnfool thing they happen to see online.

I had credited this tendency to gullibility.  If you see a catchy meme implying that if you drink a liter of vinegar a day, your arthritis will be cured ("Doctors hate this!  Get well with this ONE WEIRD TRICK!"), and think it sounds plausible, it's just because you don't have the background in science (or logic) to sift fact from fiction.

It turns out, the truth is apparently more complex than this.

According to a trio of psychologists working at the Johannes Gutenberg University Mainz and the Université Libre de Bruxelles, the problem isn't that silly ideas sound plausible to some people; it's that their mindset causes them to weight all information sources equally -- that one guy's blog is just as reliable as a scientific paper written by experts in the field.

(And yes, I'm fully aware of the irony of One Guy writing that in his blog.)

[Image licensed under the Creative Commons Karen Thibaut, Belmans in labo, CC BY-SA 3.0]

The paper, "Using Power as a Negative Cue: How Conspiracy Mentality Affects Epistemic Trust in Sources of Historical Knowledge," was written by Roland Imhoff, Pia Lamberty, and Olivier Klein, and appeared in the Personality and Social Psychology Bulletin a couple of months ago.  The authors write:
Classical theories of attitude change point to the positive effect of source expertise on perceived source credibility and persuasion, but there is an ongoing societal debate on the increase in anti-elitist sentiments and conspiracy theories regarding the allegedly untrustworthy power elite.  In one correlational and three experimental studies, we tested the novel idea that people who endorse a conspiratorial mind-set (conspiracy mentality) indeed exhibit markedly different reactions to cues of epistemic authoritativeness than those who do not: Whereas the perceived credibility of powerful sources decreased with the recipients’ conspiracy mentality, that of powerless sources increased independent of and incremental to other biases, such as the need to see the ingroup in a particularly positive light.  The discussion raises the question whether a certain extent of source-based bias is necessary for the social fabric of a highly complex society.
So people with a "conspiracy mentality" fall for conspiracies not because they're ignorant or gullible, but because their innate distrust of authority figures causes them to trust everyone equally -- they often frame it as being "open-minded" or "unbiased" -- regardless of the credentials, background, expertise, or (even) sanity of the source.

In an interview in PsyPost, study co-author Roland Imhoff explained the angle they took on this perplexing social issue:
The very idea for the study was born in a joint discussion with my co-author Olivier Klein at a conference of social psychological representations of history.  We were listening to talks about all kinds of construals, biases and narratives about what happened in the ancient or not so ancient past.   Having the public debate about ‘alternative facts’ from after Trump’s inauguration still in the back of our minds, we wondered: how do we even know what we know, how do we know who to trust when it comes to events we all have not experienced in first person? 
While previous research had insisted that this is predominantly a question of trusting ingroup sources (i.e., my government, my national education institutions), we had a lingering suspicion that people who endorse conspiracy theories might have a different system of epistemic trust: not trusting those who are in power (and allegedly corrupt).
Which points out a problem I'd always found baffling -- why, to many people, is "being an intellectual elite" a bad thing?  It was one of the (many) epithets I heard hurled at Barack Obama -- that being Harvard-educated, he couldn't possibly care about, or even be aware of, the problems of ordinary middle-class America.  Conversely, this card was played the other way by George W. Bush.  He was a "regular guy," the type of fellow you could enjoy having a beer with on Saturday night and discussing the latest sports statistics.

And my thought was: don't you want our leaders to be smarter than you are?  I mean, seriously.  I know that I and the guys I have a beer with on Saturday night aren't qualified to run the country.  (And to my bar buddies, no disrespect intended.)  There's no way in hell I'm smart enough to be president.  One of the things I want in the people we elect to office is that they are smart -- smart enough to make good decisions based on actual factual knowledge.

That, apparently, is not the norm, which the election of Donald Trump -- clearly one of the least-qualified people ever to hold the highest office in the land -- illustrated with painful clarity.  But it wasn't only a flip of the middle finger at the Coastal Elites that got him there.  The study by Imhoff et al. suggests that it was because of a pervasive tendency to treat all sources of information as if they were equal.

"[T]he data consistently suggests [people with a conspiracy mentality] just ignore source characteristics," Imhoff said.  "To them a web blog is as trustworthy as an Oxford scholar.  As we have formulated, they have terminated the social contract of epistemic trust, that we should believe official sources more than unofficial ones."

I blame part of this on people like Rush Limbaugh, Sean Hannity, Ann Coulter, and (of course) Alex Jones, who have gone out of their way for years to convince everyone that the powers-that-be are lying to you about everything.  Now, the powers-that-be do lie sometimes.  Also, being an Oxford scholar is no guarantee against being wrong.  But if you cherry-pick your examples, and then act as if those instances of error or dishonesty are not only universal, but are deliberate attempts to hoodwink the public for nefarious purposes -- you've set up a vicious cycle where the more facts and evidence you throw at people, the less they trust you.

As I've pointed out before: if you can teach people to disbelieve the hard data, it's Game Over.  After that, you can convince them of anything.

******************************************

This week's Skeptophilia book recommendation is from one of my favorite thinkers -- Irish science historian James Burke.  Burke has made several documentaries, including Connections, The Day the Universe Changed, and After the Warming -- the last-mentioned an absolutely prescient investigation into climate change that came out in 1991 and predicted damn near everything that would happen, climate-wise, in the twenty-seven years since then.

I'm going to go back to Burke's first really popular book, the one that was the genesis of the TV series of the same name -- Connections.  In this book, he looks at how one invention, one happenstance occurrence, one accidental discovery, leads to another, and finally results in something earthshattering.  (One of my favorites is how the technology of hand-weaving led to the invention of the computer.)  It's simply great fun to watch how Burke's mind works -- each of his little filigrees is only a few pages long, but you'll learn some fascinating ins and outs of history as he takes you on these journeys.  It's an absolutely delightful read.





Thursday, October 19, 2017

Purging the experts

One of the political trends I understand least is the increasing distrust of scientists by elected officials.

It's not like this disparagement of experts is across the board.  When you're sick, and the doctor runs tests and diagnoses you with a sinus infection, you don't say, "I don't believe you.  My real estate agent told me it sounded like I had an ulcer, so I'm gonna go with that."  When you get on an airplane, you don't say to the pilot, "You damn elite aviation specialists, you're obviously biased because of your training.  I think you should hand over the controls to Farmer Bob, here."  When you have your car repaired, you don't say to the mechanic, "I'm not going to do the repairs you suggest, because you have an obvious monetary interest in the car being broken.  I'll get a second opinion from my son's kindergarten teacher, Mrs. Hinkwhistle, who is a disinterested party."

But that's how scientists are treated by politicians.  And it's gotten worse.  Just yesterday, Scott Pruitt, who is the de facto leader of the Environmental Protection Agency despite his apparent loathing of both the environment and the agency, announced that there was going to be a purge of scientists on EPA advisory boards.


"What’s most important at the agency is to have scientific advisers that are objective, independent-minded, providing transparent recommendations,” Pruitt said when he spoke to a group at the Heritage Foundation, an anti-environmental, pro-corporate lobby group.  "If we have individuals who are on those boards, sometimes receiving money from the agency … that to me causes questions on the independence and the veracity and the transparency of those recommendations that are coming our way."

Well, of course environmental scientists get funding from the EPA, you dolt.  One of the EPA's functions is providing grants for basic research in environmental science.  Saying that environmental scientists can't be on EPA advisory boards is a little like excluding doctors from being on medical advisory boards.

Can't have that, after all.  Those doctors are clearly biased to be in favor of policies that promote better health care services, because then they get money for providing those services.  Better populate the medical advisory boards with people who know nothing whatsoever about medicine.

Of course, I am morally certain that the purging of trained scientists from EPA advisory boards is not simply because of this administration's anti-science bent, although that clearly exists as well.  The fight between corporate stooges like Scott Pruitt and the scientific community stems from the fact that much of what the scientists are saying runs counter to economic expediency.  You know, such things as:
  • Climate change exists and is anthropogenic in origin
  • Dumping mining waste into streams and lakes is a bad idea
  • Corporations need strictures on the impact of what they do on the environment, because they have a poor track record of policing themselves
  • Reducing the allowable amounts of air pollutants improves air quality and eases such conditions as asthma and chronic bronchitis
  • Oil pipelines have a nasty habit of breaking and leading to damaging oil spills
  • It's a stupid idea to store pressurized natural gas in unstable underground salt caverns
All of which we environmental types -- by which I mean, people who would like future generations to have drinkable water, breathable air, and a habitable world -- have had to fight in the past year.  The Trump administration's approach to environmental policy is like the Hydra; you cut off one foul, pollution-emitting head, and it grows two more.

The whole thing is driven by a furious drive toward deregulation, which in turn comes out of unchecked corporate greed.  Jennifer Sass, senior scientist for the Natural Resources Defense Council, nailed it:  "Pruitt’s purge has a single goal: get rid of scientists who tell us the facts about threats to our environment and health.  There’s a reason he won’t apply the same limits to scientists funded by corporate polluters.  Now the only scientists on Pruitt’s good list will be those with funding from polluters supporting Trump’s agenda to make America toxic again."

Michael Halpern, deputy director of the Center for Science and Democracy, agreed.  Halpern said that if Pruitt succeeds in his purge, he "would be willfully setting himself up to fail at the job of protecting public health and the environment."

The problem is, stories like this get buried in the ongoing shitstorm that has characterized the leadership of the United States in the last ten months.  It's another Hydra, and people simply can't pay attention to all of the horrible news at the same time.  That's what they're counting on -- that with outrages over kneeling athletes and disrespect by the president of military widows and allegations of sexual impropriety, we'll just ignore the fact that while all this other stuff is happening, our leaders are gutting every protection the environment has gained in the last fifty years.

You'd think that with the natural disasters this year -- unprecedented hurricanes and wildfires and floods -- we'd wise up and say, "You know, maybe it's time we started paying attention to the damage we've done."  But unfortunately, we're heading in exactly the opposite direction.  My fear is that by doing this, we're making the eventual backlash from the environment unstoppable.

And it would be a Pyrrhic victory, but I hope Scott Pruitt is around to watch it happen.

Tuesday, February 28, 2017

Ignoring the experts

The new book The Death of Expertise: The Campaign Against Established Knowledge and Why It Matters, by Thomas Nichols, could not have been published at a better time.

We have an administration that is relentlessly committed to creating their own set of "alternative facts" and labeling as fake news anything that contradicts the narrative.  Criticism is met with reprisal, honest journalism with shunning, facts and evidence with accusations of bias.  The message is "don't listen to anyone but us."

Nichols's contention is that we got here by a steady progress over the last few decades toward mistrusting experts.  Why should we rely on the pointy-headed scientists, who are not only out of touch with "real people" but probably are doing their research for some kind of evil purpose?  You know those scientists -- always unleashing plagues and creating superweapons, all the while rubbing their hands together in a maniacal fashion.

I have to mention, however, that this was something that always puzzled me about 1950s horror films.  Those scientists who were part of an evil plot to destroy the Earth -- what the hell was their motivation?  Don't they live here too?

Be that as it may, Nichols makes a trenchant point; our lazy, me-centered, fundamentally distrustful culture has created an atmosphere where anyone who knows more than we do is automatically viewed with suspicion.  We use WebMD to diagnose ourselves, and argue with the doctor when (s)he disagrees.  We rate our folksy "look at the weather we're having, climate change can't be real" anecdotes as somehow having more weight than the hard data of actual trained climate scientists.  We accept easy solutions to complicated problems ("Build a wall") instead of putting in the hard work of understanding the complexity of the real situation.

What does she know, anyhow?  [image courtesy of the Wikimedia Commons]

Nichols was interviewed a couple of days ago in the Providence Journal, and shared some pretty disturbing observations about the predicament our culture is in.  "The United States is now a country obsessed with the worship of its own ignorance," Nichols said.  "Worse, many citizens today are proud of not knowing things.  Americans have reached a point where ignorance, especially of anything related to public policy, is an actual virtue."

Of course, Nichols is not the first person to comment upon this.  Isaac Asimov, in his 1980 essay "The Cult of Ignorance," wrote something that has become rightly famous: "There is a cult of ignorance in the United States, and there always has been.  The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that 'my ignorance is just as good as your knowledge.'"

This, Nichols says, is not only pernicious, it's demonstrably false.  "People can accept the idea that they are not seven feet tall and can't play basketball.  [But] They hate the idea that anybody is smarter than they are and should be better compensated than they are.  This is a radical egalitarianism that is completely nuts."

What is weirdest about this is that we unhesitatingly accept the expertise of some people, and unhesitatingly reject the expertise of others.  "You put your life in the hands of an expert community all day long," Nichols says.  "Every time you take an aspirin or an over-the-counter medication, every time you talk to your pharmacist, every time your kids go to school, every time you obey the traffic directions of a police officer or go through a traffic light.  When you get on an airplane, you assume that everybody involved in flying that airplane from the flight attendant to the pilot and the ground crew and the people in the control tower knows what they are doing."

And yet when we are told that the overwhelming majority of climate scientists accept anthropogenic climate change, a substantial percentage of us go, "Meh, what do they know?"

The difficulty is that once you have fallen into the trap of distrusting expertise, it's hard to see how you could free yourself from it.  As the adage goes, you can't logic your way out of a position that you didn't logic your way into.  Add into the mix not only the rampant anti-intellectualism characteristic of our current society, but the fundamental distrust of all media that is being inculcated into our minds by the rhetoric from the Trump administration, and you've created a hermetically sealed wall that no facts, no evidence, no argument can breach.

So Nichols's conclusions are interesting, enlightening, and deeply troubling.  His arguments are aimed toward the very people who will be the most resistant to accepting them.

And with our current leadership deepening divisions, distrust, and suspicion of experts, it's hard to see how any of this will change any time soon.

Monday, July 25, 2016

Fooling the experts

Today we consider what happens when you blend Appeal to Authority with the Dunning-Kruger Effect.

Appeal to Authority, you probably know, is when someone uses credentials, titles, or educational background -- and no other evidence -- to support a claim.  Put simply, it is the idea that if Stephen Hawking said it, it must be true, regardless of whether the claim has anything to do with Hawking's particular area of expertise.  The Dunning-Kruger Effect, on the other hand, is the idea that people tend to wildly overestimate their abilities, even in the face of evidence to the contrary, which is why we all think we're above-average drivers.

Well, David Dunning (of the aforementioned Dunning-Kruger Effect) has teamed up with Cornell University researchers Stav Atir and Emily Rosenzweig, and come up with the love child of Dunning-Kruger and Appeal to Authority.  And what this new phenomenon -- dubbed, predictably, the Atir-Rosenzweig-Dunning Effect -- shows us is that people who are experts in a particular field tend to think that expertise holds true even for disciplines far outside their chosen area of study.

[image courtesy of the Wikimedia Commons]

In one experiment, the three researchers asked people to rate their own knowledge in various academic areas, then asked them to rank their level of understanding of various finance-related terms, such as "pre-rated stocks, fixed-rate deduction and annualized credit."  The problem is, those three finance-related terms actually don't exist -- i.e., they were made up by the researchers to sound plausible.

The test subjects who had the highest confidence level in their own fields were most likely to get suckered.  Simon Oxenham, who described the experiments in Big Think, says it's only natural.  "A possible explanation for this finding," Oxenham writes, "is that the participants with a greater vocabulary in a particular domain were more prone to falsely feeling familiar with nonsense terms in that domain because of the fact that they had simply come across more similar-sounding terms in their lives, providing more material for potential confusion."

Interestingly, subsequent experiments showed that the correlation holds true even if you take away the factor of self-ranking.  Presumably, someone who is cocky and arrogant and ranks his/her ability higher than is justified in one area would be likely to do it in others.  But when they tested the subjects' knowledge of terms from their own field -- i.e., actually measured their expertise -- high scores still correlated with overestimating their knowledge in other areas.

And telling the subjects ahead of time that some of the terms might be made up didn't change the results.  "[E]ven when participants were warned that some of the statements were false, the 'experts' were just as likely as before to claim to know the nonsense statements, while most of the other participants became more likely in this scenario to admit they’d never heard of them," Oxenham writes.

I have a bit of anecdotal evidence supporting this result from my experience in the classroom.  On multiple-choice tests, I have to concoct plausible-sounding wrong answers as distractors.  Every once in a while, I run out of good wrong answers, and just make something up.  (On one AP Biology quiz on plant biochemistry, I threw in the term "photoglycolysis," which sounds pretty fancy until you realize that it doesn't exist.)  What I find is that it's the average to upper-average students who are most likely to be taken in by the ruse.  The top students don't get fooled because they know what the correct answer is; the lowest students are equally likely to pick any of the wrong answers, because they don't understand the material well.  The mid-range students see something that sounds technical and vaguely familiar -- and figure that if they aren't sure, it must be that they missed learning that particular term.

It's also the mid-range students who are most likely to miss questions where the actual answer seems too simple.  Another botanical question I like to throw at them is "What do all non-vascular land plants have in common?"  There are three wrong answers with appropriately technical-sounding jargon.

The actual answer is, "They're small."

Interestingly, the reason non-vascular land plants are small isn't simple at all.  But the answer itself just looks too easy to merit being the correct choice on an AP Biology quiz.

So Atir, Rosenzweig, and Dunning have given us yet another mental pitfall to watch out for -- our tendency to use our knowledge in one field to overestimate our knowledge in others.  But I really should run along, and make sure that the annualized credit on my pre-rated stocks exceeds the recommended fixed-rate deduction.  I'm sure you can appreciate how important that is.

Wednesday, April 9, 2014

Wine, violins, and trusting your senses


I guess I know too much about neuroscience to trust my own senses.  It's a point I've made before; we get awfully cocky about our own limited perspective, when rightfully we should have remarkably little faith in what we see or hear (or remember, for that matter).  Oh, our perceptions are enough to get by on; we wouldn't have lasted long as a species if our sight and hearing led us astray more often than not.

But the devil is in the details, they say, and in this case it proves remarkably (and perhaps regrettably) true.  What you think your senses are telling you is probably not accurate.

At all.

And the worst part is, it doesn't matter if you're an expert.  It might even be worse if you are.  Not only does your confidence blind you to your own mistakes, but at times your expectations about what you're experiencing seem to predispose you to blundering more than an amateur would in similar circumstances.

I first ran into this rather troubling phenomenon last year, when a study came out that indicated that wine experts couldn't tell the difference between an expensive wine and a cheap one -- if they were deprived of the information on the label:
French academic Frédéric Brochet... presented the same Bordeaux superior wine to 57 volunteers a week apart and in two different bottles – one for a table wine, the other for a grand cru. 
The tasters were fooled. 
When tasting a supposedly superior wine, their language was more positive – describing it as complex, balanced, long and woody.  When the same wine was presented as plonk, the critics were more likely to use negatives such as weak, light and flat.
Then Brochet pissed off the wine snobs even worse with a subsequent experiment in which it became apparent that the tasters couldn't even tell the difference between a red and a white wine:
[Brochet] asked 54 wine experts to test two glasses of wine -- one red, one white.  Using the typical language of tasters, the panel described the red as "jammy" and commented on its crushed red fruit. 
The critics failed to spot that both wines were from the same bottle. The only difference was that one had been coloured red with a flavourless dye.
Now lest you think that this phenomenon only applies to wine snobbery, a study has come out from Claudia Fritz at the University of Paris that shows that the same expert-and-expectation bias can occur with our perceptions of sounds -- when she demonstrated that expert violinists couldn't reliably tell the difference between a Stradivarius and a newly-fashioned modern violin:
“During both sessions, soloists wore modified welders’ goggles, which together with much-reduced ambient lighting made it impossible to identify instruments by eye,” the researchers write. In addition, the new violins were sanded down a bit to “eliminate any tactile clues to age, such as unworn corners and edges...” 
In the concert hall, the violinists were given free rein: They could ask for feedback from a designated friend or colleague, and a pianist was on hand so they could play excerpts from sonatas on the various violins. 
Afterwards, they rated each instrument for various qualities, including tone quality, projection, articulation/clarity, “playability,” and overall quality. Finally, they briefly played six to eight of the instruments and guessed whether each was old or new. 
The results: Six of the soloists chose new violins as their hypothetical replacement instruments, while four chose ones made by Stradivari. One particular new violin was chosen four times, and one Stradivarius was chosen three times, suggesting those instruments were the clear favorites.
You can understand how these results might upset classical violinists, perhaps even more than Brochet's experiment ruffled the feathers of the wine tasters.  Stradivarius, after all, is considered the touchstone for sound quality in a string instrument.

[image courtesy of photographer Håkan Svensson and the Wikimedia Commons]

There are 650 known Stradivari instruments, and their market value is estimated to be in the hundreds of thousands to millions of dollars.  Each.  The idea that a new -- albeit excellent -- violin could compete with a Strad in sound quality is profoundly unsettling to a lot of people.

Reasonably speaking, however, I don't know why it should be (other than the monetary aspect, of course).  Wines and music are both rich sensory experiences, and our appreciation of either (or both) is the result of not only the stimulation of millions of sensory neurons, but the release of a complex broth of neurochemicals that creates a feedback loop with our sense organs, emotional centers, and cognitive processes.  We shouldn't expect that experiencing either wine or music would be a predictable thing; if it was, they probably wouldn't have the resonance they do.

So it's not surprising, really, that our expectations about the taste of a wine or the sound of a violin should change our perceptions.  It's just one more kick in the pants to our certainty, however, that what we see and hear and feel is accurate in its details.  The idea doesn't bother me much, honestly.  Nothing that a little Riunite on ice can't fix.